Mar 20 08:33:48.574964 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 20 08:33:49.317259 master-0 kubenswrapper[3976]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 08:33:49.317259 master-0 kubenswrapper[3976]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 20 08:33:49.317259 master-0 kubenswrapper[3976]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 08:33:49.317259 master-0 kubenswrapper[3976]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 08:33:49.317259 master-0 kubenswrapper[3976]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 20 08:33:49.317259 master-0 kubenswrapper[3976]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 08:33:49.319015 master-0 kubenswrapper[3976]: I0320 08:33:49.318620    3976 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 20 08:33:49.326405 master-0 kubenswrapper[3976]: W0320 08:33:49.326343    3976 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 08:33:49.326405 master-0 kubenswrapper[3976]: W0320 08:33:49.326384    3976 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 08:33:49.326405 master-0 kubenswrapper[3976]: W0320 08:33:49.326397    3976 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 08:33:49.326405 master-0 kubenswrapper[3976]: W0320 08:33:49.326411    3976 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 08:33:49.326405 master-0 kubenswrapper[3976]: W0320 08:33:49.326423    3976 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326435    3976 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326445    3976 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326454    3976 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326464    3976 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326473    3976 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326482    3976 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326491    3976 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326503    3976 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326513    3976 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326524    3976 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326534    3976 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326545    3976 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326555    3976 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326564    3976 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326573    3976 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326582    3976 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326591    3976 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326600    3976 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 08:33:49.326738 master-0 kubenswrapper[3976]: W0320 08:33:49.326608    3976 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326617    3976 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326640    3976 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326649    3976 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326658    3976 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326667    3976 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326676    3976 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326685    3976 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326694    3976 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326706    3976 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326717    3976 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326730    3976 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326740    3976 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326750    3976 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326760    3976 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326770    3976 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326780    3976 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326790    3976 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326800    3976 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326811    3976 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 08:33:49.327593 master-0 kubenswrapper[3976]: W0320 08:33:49.326822    3976 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326833    3976 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326845    3976 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326856    3976 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326867    3976 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326879    3976 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326890    3976 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326901    3976 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326913    3976 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326924    3976 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326935    3976 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326946    3976 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326957    3976 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326967    3976 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326978    3976 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326987    3976 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.326995    3976 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.327007    3976 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.327018    3976 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.327029    3976 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 08:33:49.328424 master-0 kubenswrapper[3976]: W0320 08:33:49.327039    3976 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: W0320 08:33:49.327050    3976 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: W0320 08:33:49.327065    3976 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: W0320 08:33:49.327074    3976 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: W0320 08:33:49.327083    3976 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: W0320 08:33:49.327092    3976 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: W0320 08:33:49.327103    3976 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: W0320 08:33:49.327113    3976 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: W0320 08:33:49.327122    3976 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327352    3976 flags.go:64] FLAG: --address="0.0.0.0"
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327374    3976 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327393    3976 flags.go:64] FLAG: --anonymous-auth="true"
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327406    3976 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327419    3976 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327430    3976 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327443    3976 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327456    3976 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327466    3976 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327476    3976 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327489    3976 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327500    3976 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327510    3976 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 20 08:33:49.329228 master-0 kubenswrapper[3976]: I0320 08:33:49.327521    3976 flags.go:64] FLAG: --cgroup-root=""
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327531    3976 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327541    3976 flags.go:64] FLAG: --client-ca-file=""
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327551    3976 flags.go:64] FLAG: --cloud-config=""
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327562    3976 flags.go:64] FLAG: --cloud-provider=""
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327572    3976 flags.go:64] FLAG: --cluster-dns="[]"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327605    3976 flags.go:64] FLAG: --cluster-domain=""
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327619    3976 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327634    3976 flags.go:64] FLAG: --config-dir=""
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327647    3976 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327662    3976 flags.go:64] FLAG: --container-log-max-files="5"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327679    3976 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327693    3976 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327706    3976 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327717    3976 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327727    3976 flags.go:64] FLAG: --contention-profiling="false"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327737    3976 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327747    3976 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327758    3976 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327769    3976 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327784    3976 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327794    3976 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327804    3976 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327815    3976 flags.go:64] FLAG: --enable-load-reader="false"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327825    3976 flags.go:64] FLAG: --enable-server="true"
Mar 20 08:33:49.329917 master-0 kubenswrapper[3976]: I0320 08:33:49.327836    3976 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.327853    3976 flags.go:64] FLAG: --event-burst="100"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.327864    3976 flags.go:64] FLAG: --event-qps="50"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.327874    3976 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.327884    3976 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.327894    3976 flags.go:64] FLAG: --eviction-hard=""
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.327907    3976 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.327917    3976 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.327929    3976 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.327940    3976 flags.go:64] FLAG: --eviction-soft=""
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.327950    3976 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.327962    3976 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.327972    3976 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.327982    3976 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.328012    3976 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.328025    3976 flags.go:64] FLAG: --fail-swap-on="true"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.328036    3976 flags.go:64] FLAG: --feature-gates=""
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.328049    3976 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.328060    3976 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.328071    3976 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.328082    3976 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.328093    3976 flags.go:64] FLAG: --healthz-port="10248"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.328103    3976 flags.go:64] FLAG: --help="false"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.328114    3976 flags.go:64] FLAG: --hostname-override=""
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.328128    3976 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.328141    3976 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 20 08:33:49.330836 master-0 kubenswrapper[3976]: I0320 08:33:49.328154    3976 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328167    3976 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328179    3976 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328222    3976 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328234    3976 flags.go:64] FLAG: --image-service-endpoint=""
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328250    3976 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328265    3976 flags.go:64] FLAG: --kube-api-burst="100"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328278    3976 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328292    3976 flags.go:64] FLAG: --kube-api-qps="50"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328305    3976 flags.go:64] FLAG: --kube-reserved=""
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328318    3976 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328330    3976 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328343    3976 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328354    3976 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328366    3976 flags.go:64] FLAG: --lock-file=""
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328378    3976 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328391    3976 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328408    3976 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328454    3976 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328468    3976 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328499    3976 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328513    3976 flags.go:64] FLAG: --logging-format="text"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328526    3976 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328540    3976 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328554    3976 flags.go:64] FLAG: --manifest-url=""
Mar 20 08:33:49.331821 master-0 kubenswrapper[3976]: I0320 08:33:49.328566    3976 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328583    3976 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328596    3976 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328612    3976 flags.go:64] FLAG: --max-pods="110"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328625    3976 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328639    3976 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328652    3976 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328686    3976 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328701    3976 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328715    3976 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328728    3976 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328759    3976 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328772    3976 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328785    3976 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328798    3976 flags.go:64] FLAG: --pod-cidr=""
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328812    3976 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328834    3976 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328846    3976 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328859    3976 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328871    3976 flags.go:64] FLAG: --port="10250"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328885    3976 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328897    3976 flags.go:64] FLAG: --provider-id=""
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328911    3976 flags.go:64] FLAG: --qos-reserved=""
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328923    3976 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 08:33:49.332618 master-0 kubenswrapper[3976]: I0320 08:33:49.328938    3976 flags.go:64] FLAG: --register-node="true"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.328951    3976 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.328963    3976 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.328986    3976 flags.go:64] FLAG: --registry-burst="10"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.328999    3976 flags.go:64] FLAG: --registry-qps="5"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329013    3976 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329027    3976 flags.go:64] FLAG: --reserved-memory=""
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329043    3976 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329056    3976 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329069    3976 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329081    3976 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329094    3976 flags.go:64] FLAG: --runonce="false"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329107    3976 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329120    3976 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329133    3976 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329146    3976 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329158    3976 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329174    3976 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329225    3976 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329241    3976 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329255    3976 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329268    3976 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329282    3976 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329295    3976 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329308    3976 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 08:33:49.333440 master-0 kubenswrapper[3976]: I0320 08:33:49.329321    3976 flags.go:64] FLAG: --system-cgroups=""
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329335    3976 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329359    3976 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329372    3976 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329384    3976 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329401    3976 flags.go:64] FLAG: --tls-min-version=""
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329413    3976 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329427    3976 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329457    3976 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329471    3976 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329484    3976 flags.go:64] FLAG: --v="2"
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329501    3976 flags.go:64] FLAG: --version="false"
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329518    3976 flags.go:64] FLAG: --vmodule=""
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329549    3976 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: I0320 08:33:49.329563    3976 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: W0320 08:33:49.329851    3976 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: W0320 08:33:49.329869    3976 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: W0320 08:33:49.329880    3976 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: W0320 08:33:49.329894    3976 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: W0320 08:33:49.329907    3976 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: W0320 08:33:49.329920    3976 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: W0320 08:33:49.329934    3976 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: W0320 08:33:49.329946    3976 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 08:33:49.334353 master-0 kubenswrapper[3976]: W0320 08:33:49.329958    3976 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.329973    3976 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.329985    3976 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.329997    3976 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330008    3976 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330023    3976 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330037    3976 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330049    3976 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330060    3976 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330072    3976 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330090    3976 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330101    3976 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330112    3976 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330123    3976 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330134    3976 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 
08:33:49.330145 3976 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330156 3976 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330167 3976 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330177 3976 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330237 3976 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:33:49.335062 master-0 kubenswrapper[3976]: W0320 08:33:49.330249 3976 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330261 3976 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330271 3976 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330282 3976 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330293 3976 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330304 3976 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330329 3976 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330340 3976 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330350 3976 feature_gate.go:330] unrecognized feature gate: 
OVNObservability Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330363 3976 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330375 3976 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330387 3976 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330397 3976 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330408 3976 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330420 3976 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330446 3976 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330462 3976 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330476 3976 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330488 3976 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330498 3976 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 08:33:49.335767 master-0 kubenswrapper[3976]: W0320 08:33:49.330509 3976 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330521 3976 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330539 3976 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330550 3976 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330561 3976 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330572 3976 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330598 3976 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330612 3976 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330623 3976 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330634 3976 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330645 3976 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330668 3976 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330683 3976 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330696 3976 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330707 3976 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330742 3976 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330753 3976 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330764 3976 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330775 3976 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330786 3976 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 08:33:49.336418 master-0 kubenswrapper[3976]: W0320 08:33:49.330796 3976 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 08:33:49.337049 master-0 kubenswrapper[3976]: W0320 08:33:49.330811 3976 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 08:33:49.337049 master-0 kubenswrapper[3976]: W0320 08:33:49.330825 3976 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 08:33:49.337049 master-0 kubenswrapper[3976]: W0320 08:33:49.330837 3976 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 08:33:49.337049 master-0 kubenswrapper[3976]: I0320 08:33:49.330873 3976 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 08:33:49.345757 master-0 kubenswrapper[3976]: I0320 08:33:49.345670 3976 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 20 08:33:49.345757 master-0 kubenswrapper[3976]: I0320 08:33:49.345728 3976 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 08:33:49.345975 master-0 kubenswrapper[3976]: W0320 08:33:49.345913 3976 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 08:33:49.345975 master-0 kubenswrapper[3976]: W0320 08:33:49.345930 3976 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 08:33:49.345975 master-0 kubenswrapper[3976]: W0320 08:33:49.345940 3976 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 08:33:49.345975 master-0 kubenswrapper[3976]: W0320 08:33:49.345950 3976 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 08:33:49.345975 master-0 kubenswrapper[3976]: W0320 08:33:49.345960 3976 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 08:33:49.345975 master-0 kubenswrapper[3976]: W0320 08:33:49.345969 3976 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 08:33:49.345975 master-0 kubenswrapper[3976]: W0320 08:33:49.345978 3976 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.345988 3976 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.345998 3976 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346007 3976 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346016 3976 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346025 3976 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346033 3976 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346042 3976 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346050 3976 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346059 3976 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346067 3976 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346076 3976 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346084 3976 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346093 3976 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346102 3976 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346110 3976 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346118 3976 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346127 3976 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346135 3976 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346144 3976 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 08:33:49.346372 master-0 kubenswrapper[3976]: W0320 08:33:49.346154 3976 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346162 3976 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346173 3976 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346225 3976 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346235 3976 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346247 3976 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346258 3976 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346267 3976 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346280 3976 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346291 3976 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346300 3976 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346310 3976 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346319 3976 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346328 3976 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346338 3976 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346347 3976 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346355 3976 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346364 3976 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346373 3976 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 08:33:49.347395 master-0 kubenswrapper[3976]: W0320 08:33:49.346382 3976 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346390 3976 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346402 3976 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346416 3976 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346427 3976 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346438 3976 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346449 3976 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346458 3976 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346467 3976 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346476 3976 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346485 3976 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346493 3976 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346502 3976 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346510 3976 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346519 3976 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346527 3976 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346535 3976 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346544 3976 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346553 3976 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 08:33:49.348822 master-0 kubenswrapper[3976]: W0320 08:33:49.346561 3976 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.346570 3976 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.346578 3976 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.346587 3976 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.346598 3976 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.346607 3976 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.346617 3976 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.346626 3976 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: I0320 08:33:49.346672 3976 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.346962 3976 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.346978 3976 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.346987 3976 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.346997 3976 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.347006 3976 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.347014 3976 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 08:33:49.349891 master-0 kubenswrapper[3976]: W0320 08:33:49.347023 3976 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347032 3976 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347040 3976 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347049 3976 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347057 3976 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347065 3976 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347074 3976 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347083 3976 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347091 3976 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347100 3976 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347108 3976 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347116 3976 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347125 3976 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347134 3976 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347142 3976 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347151 3976 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347159 3976 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347169 3976 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347206 3976 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347218 3976 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 08:33:49.350747 master-0 kubenswrapper[3976]: W0320 08:33:49.347228 3976 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347241 3976 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347252 3976 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347262 3976 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347271 3976 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347280 3976 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347290 3976 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347300 3976 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347309 3976 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347318 3976 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347326 3976 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347335 3976 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347343 3976 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347353 3976 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347362 3976 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347370 3976 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347379 3976 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347388 3976 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347398 3976 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 08:33:49.351942 master-0 kubenswrapper[3976]: W0320 08:33:49.347406 3976 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347418 3976 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347429 3976 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347439 3976 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347449 3976 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347459 3976 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347469 3976 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347478 3976 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347486 3976 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347495 3976 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347504 3976 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347512 3976 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347521 3976 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347529 3976 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347538 3976 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347547 3976 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347559 3976 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347570 3976 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347579 3976 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 08:33:49.353401 master-0 kubenswrapper[3976]: W0320 08:33:49.347588 3976 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 08:33:49.354792 master-0 kubenswrapper[3976]: W0320 08:33:49.347597 3976 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 08:33:49.354792 master-0 kubenswrapper[3976]: W0320 08:33:49.347605 3976 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 08:33:49.354792 master-0 kubenswrapper[3976]: W0320 08:33:49.347614 3976 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 08:33:49.354792 master-0 kubenswrapper[3976]: W0320 08:33:49.347623 3976 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 08:33:49.354792 master-0 kubenswrapper[3976]: W0320 08:33:49.347633 3976 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 08:33:49.354792 master-0 kubenswrapper[3976]: W0320 08:33:49.347642 3976 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 08:33:49.354792 master-0 kubenswrapper[3976]: W0320 08:33:49.347651 3976 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 08:33:49.354792 master-0 kubenswrapper[3976]: I0320 08:33:49.347663 3976 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false
KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 08:33:49.354792 master-0 kubenswrapper[3976]: I0320 08:33:49.347989 3976 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 08:33:49.354792 master-0 kubenswrapper[3976]: I0320 08:33:49.352469 3976 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 08:33:49.354792 master-0 kubenswrapper[3976]: I0320 08:33:49.353556 3976 server.go:997] "Starting client certificate rotation" Mar 20 08:33:49.354792 master-0 kubenswrapper[3976]: I0320 08:33:49.353594 3976 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 08:33:49.354792 master-0 kubenswrapper[3976]: I0320 08:33:49.354716 3976 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 08:33:49.382453 master-0 kubenswrapper[3976]: I0320 08:33:49.382367 3976 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 08:33:49.385428 master-0 kubenswrapper[3976]: I0320 08:33:49.385363 3976 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 08:33:49.387489 master-0 kubenswrapper[3976]: E0320 08:33:49.387375 3976 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:49.405156 master-0 kubenswrapper[3976]: I0320 08:33:49.405077 3976 log.go:25] "Validated CRI v1 runtime API" Mar 20 08:33:49.410869 master-0 kubenswrapper[3976]: I0320 08:33:49.410814 3976 log.go:25] "Validated CRI v1 image API" Mar 20 08:33:49.413457 master-0 kubenswrapper[3976]: I0320 08:33:49.413379 3976 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 08:33:49.419880 master-0 kubenswrapper[3976]: I0320 08:33:49.419492 3976 fs.go:135] Filesystem UUIDs: map[4a66d702-cf3e-4c68-968a-18f659b89ac6:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Mar 20 08:33:49.419880 master-0 kubenswrapper[3976]: I0320 08:33:49.419543 3976 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}] Mar 20 08:33:49.443648 master-0 kubenswrapper[3976]: I0320 08:33:49.443139 3976 manager.go:217] Machine: {Timestamp:2026-03-20 08:33:49.44131192 +0000 UTC m=+0.710135457 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:7cbba5bd4cad48d397925286776799f2 SystemUUID:7cbba5bd-4cad-48d3-9792-5286776799f2 BootID:2d4df506-7881-4563-b01f-2840d2bdb60b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 
HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:c7:39:2c Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:52:c7:57:df:ad:1d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 
Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] 
Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 08:33:49.443648 master-0 kubenswrapper[3976]: I0320 08:33:49.443610 3976 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 20 08:33:49.444048 master-0 kubenswrapper[3976]: I0320 08:33:49.443939 3976 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 20 08:33:49.444898 master-0 kubenswrapper[3976]: I0320 08:33:49.444844 3976 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 08:33:49.445420 master-0 kubenswrapper[3976]: I0320 08:33:49.445344 3976 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 08:33:49.445850 master-0 kubenswrapper[3976]: I0320 08:33:49.445426 3976 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 20 08:33:49.446640 master-0 kubenswrapper[3976]: I0320 08:33:49.446599 3976 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 08:33:49.446640 master-0 kubenswrapper[3976]: I0320 08:33:49.446633 3976 container_manager_linux.go:303] "Creating device plugin manager" Mar 20 08:33:49.446815 master-0 kubenswrapper[3976]: I0320 08:33:49.446779 3976 manager.go:142] 
"Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 08:33:49.446864 master-0 kubenswrapper[3976]: I0320 08:33:49.446833 3976 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 08:33:49.447086 master-0 kubenswrapper[3976]: I0320 08:33:49.447043 3976 state_mem.go:36] "Initialized new in-memory state store" Mar 20 08:33:49.447281 master-0 kubenswrapper[3976]: I0320 08:33:49.447242 3976 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 20 08:33:49.453006 master-0 kubenswrapper[3976]: I0320 08:33:49.452959 3976 kubelet.go:418] "Attempting to sync node with API server" Mar 20 08:33:49.453006 master-0 kubenswrapper[3976]: I0320 08:33:49.453001 3976 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 08:33:49.453142 master-0 kubenswrapper[3976]: I0320 08:33:49.453051 3976 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 20 08:33:49.453142 master-0 kubenswrapper[3976]: I0320 08:33:49.453078 3976 kubelet.go:324] "Adding apiserver pod source" Mar 20 08:33:49.453142 master-0 kubenswrapper[3976]: I0320 08:33:49.453109 3976 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 08:33:49.458684 master-0 kubenswrapper[3976]: I0320 08:33:49.458620 3976 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 20 08:33:49.459056 master-0 kubenswrapper[3976]: W0320 08:33:49.458978 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:49.459245 master-0 kubenswrapper[3976]: E0320 08:33:49.459211 3976 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:49.459431 master-0 kubenswrapper[3976]: W0320 08:33:49.459341 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:49.459513 master-0 kubenswrapper[3976]: E0320 08:33:49.459475 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:49.462056 master-0 kubenswrapper[3976]: I0320 08:33:49.462006 3976 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 08:33:49.462459 master-0 kubenswrapper[3976]: I0320 08:33:49.462414 3976 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 08:33:49.462522 master-0 kubenswrapper[3976]: I0320 08:33:49.462474 3976 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 08:33:49.462522 master-0 kubenswrapper[3976]: I0320 08:33:49.462500 3976 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 08:33:49.462522 master-0 kubenswrapper[3976]: I0320 08:33:49.462519 3976 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 08:33:49.462628 master-0 kubenswrapper[3976]: I0320 08:33:49.462534 3976 plugins.go:603] 
"Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 08:33:49.462628 master-0 kubenswrapper[3976]: I0320 08:33:49.462570 3976 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 08:33:49.462628 master-0 kubenswrapper[3976]: I0320 08:33:49.462585 3976 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 08:33:49.462628 master-0 kubenswrapper[3976]: I0320 08:33:49.462598 3976 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 20 08:33:49.462628 master-0 kubenswrapper[3976]: I0320 08:33:49.462615 3976 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 08:33:49.462628 master-0 kubenswrapper[3976]: I0320 08:33:49.462629 3976 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 08:33:49.462835 master-0 kubenswrapper[3976]: I0320 08:33:49.462650 3976 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 08:33:49.462835 master-0 kubenswrapper[3976]: I0320 08:33:49.462675 3976 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 08:33:49.463722 master-0 kubenswrapper[3976]: I0320 08:33:49.463685 3976 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 20 08:33:49.464576 master-0 kubenswrapper[3976]: I0320 08:33:49.464546 3976 server.go:1280] "Started kubelet" Mar 20 08:33:49.464871 master-0 kubenswrapper[3976]: I0320 08:33:49.464778 3976 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 08:33:49.464930 master-0 kubenswrapper[3976]: I0320 08:33:49.464899 3976 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 20 08:33:49.469975 master-0 kubenswrapper[3976]: I0320 08:33:49.469904 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": 
dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:49.470449 master-0 kubenswrapper[3976]: I0320 08:33:49.470415 3976 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 08:33:49.470697 master-0 systemd[1]: Started Kubernetes Kubelet. Mar 20 08:33:49.470873 master-0 kubenswrapper[3976]: I0320 08:33:49.470464 3976 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 08:33:49.472658 master-0 kubenswrapper[3976]: I0320 08:33:49.472584 3976 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 08:33:49.472750 master-0 kubenswrapper[3976]: I0320 08:33:49.472664 3976 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 08:33:49.473137 master-0 kubenswrapper[3976]: I0320 08:33:49.473087 3976 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 08:33:49.473137 master-0 kubenswrapper[3976]: I0320 08:33:49.473131 3976 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 08:33:49.473378 master-0 kubenswrapper[3976]: E0320 08:33:49.473061 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:33:49.474789 master-0 kubenswrapper[3976]: I0320 08:33:49.474765 3976 factory.go:55] Registering systemd factory Mar 20 08:33:49.474884 master-0 kubenswrapper[3976]: I0320 08:33:49.474793 3976 factory.go:221] Registration of the systemd container factory successfully Mar 20 08:33:49.476254 master-0 kubenswrapper[3976]: I0320 08:33:49.476206 3976 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 20 08:33:49.481574 master-0 kubenswrapper[3976]: I0320 08:33:49.481549 3976 reconstruct.go:97] "Volume reconstruction finished" Mar 20 08:33:49.481711 master-0 kubenswrapper[3976]: I0320 08:33:49.481693 3976 reconciler.go:26] "Reconciler: start to sync state" Mar 20 08:33:49.483668 master-0 
kubenswrapper[3976]: E0320 08:33:49.483602 3976 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 20 08:33:49.483888 master-0 kubenswrapper[3976]: I0320 08:33:49.483851 3976 factory.go:153] Registering CRI-O factory Mar 20 08:33:49.483888 master-0 kubenswrapper[3976]: I0320 08:33:49.483885 3976 factory.go:221] Registration of the crio container factory successfully Mar 20 08:33:49.484019 master-0 kubenswrapper[3976]: I0320 08:33:49.483977 3976 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 08:33:49.484076 master-0 kubenswrapper[3976]: I0320 08:33:49.484028 3976 factory.go:103] Registering Raw factory Mar 20 08:33:49.484076 master-0 kubenswrapper[3976]: I0320 08:33:49.484055 3976 manager.go:1196] Started watching for new ooms in manager Mar 20 08:33:49.485124 master-0 kubenswrapper[3976]: I0320 08:33:49.485074 3976 manager.go:319] Starting recovery of all containers Mar 20 08:33:49.485244 master-0 kubenswrapper[3976]: E0320 08:33:49.483760 3976 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189e7fa03f53b38f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.464499087 +0000 UTC 
m=+0.733322404,LastTimestamp:2026-03-20 08:33:49.464499087 +0000 UTC m=+0.733322404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:33:49.485519 master-0 kubenswrapper[3976]: I0320 08:33:49.485483 3976 server.go:449] "Adding debug handlers to kubelet server" Mar 20 08:33:49.485592 master-0 kubenswrapper[3976]: W0320 08:33:49.485466 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:49.485651 master-0 kubenswrapper[3976]: E0320 08:33:49.485582 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:49.487791 master-0 kubenswrapper[3976]: E0320 08:33:49.487740 3976 kubelet.go:1495] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 20 08:33:49.505378 master-0 kubenswrapper[3976]: I0320 08:33:49.505343 3976 manager.go:324] Recovery completed Mar 20 08:33:49.518527 master-0 kubenswrapper[3976]: I0320 08:33:49.518499 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:33:49.520634 master-0 kubenswrapper[3976]: I0320 08:33:49.520549 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:49.520690 master-0 kubenswrapper[3976]: I0320 08:33:49.520644 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:49.520690 master-0 kubenswrapper[3976]: I0320 08:33:49.520663 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:33:49.521646 master-0 kubenswrapper[3976]: I0320 08:33:49.521622 3976 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 08:33:49.521716 master-0 kubenswrapper[3976]: I0320 08:33:49.521703 3976 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 08:33:49.521789 master-0 kubenswrapper[3976]: I0320 08:33:49.521779 3976 state_mem.go:36] "Initialized new in-memory state store" Mar 20 08:33:49.574333 master-0 kubenswrapper[3976]: E0320 08:33:49.574180 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:33:49.588079 master-0 kubenswrapper[3976]: I0320 08:33:49.588052 3976 policy_none.go:49] "None policy: Start" Mar 20 08:33:49.589917 master-0 kubenswrapper[3976]: I0320 08:33:49.589900 3976 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 08:33:49.590039 master-0 kubenswrapper[3976]: I0320 08:33:49.590025 3976 state_mem.go:35] "Initializing new in-memory state store" Mar 20 08:33:49.644981 
master-0 kubenswrapper[3976]: I0320 08:33:49.639495 3976 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 08:33:49.644981 master-0 kubenswrapper[3976]: I0320 08:33:49.641623 3976 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 08:33:49.644981 master-0 kubenswrapper[3976]: I0320 08:33:49.641680 3976 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 08:33:49.644981 master-0 kubenswrapper[3976]: I0320 08:33:49.641711 3976 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 08:33:49.644981 master-0 kubenswrapper[3976]: E0320 08:33:49.641760 3976 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 08:33:49.644981 master-0 kubenswrapper[3976]: W0320 08:33:49.642944 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:49.644981 master-0 kubenswrapper[3976]: E0320 08:33:49.643020 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:49.663501 master-0 kubenswrapper[3976]: I0320 08:33:49.663410 3976 manager.go:334] "Starting Device Plugin manager" Mar 20 08:33:49.663501 master-0 kubenswrapper[3976]: I0320 08:33:49.663474 3976 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 08:33:49.663501 master-0 kubenswrapper[3976]: I0320 08:33:49.663492 3976 
server.go:79] "Starting device plugin registration server"
Mar 20 08:33:49.664124 master-0 kubenswrapper[3976]: I0320 08:33:49.664076 3976 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 20 08:33:49.664267 master-0 kubenswrapper[3976]: I0320 08:33:49.664100 3976 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 20 08:33:49.664355 master-0 kubenswrapper[3976]: I0320 08:33:49.664280 3976 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 20 08:33:49.664525 master-0 kubenswrapper[3976]: I0320 08:33:49.664442 3976 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 20 08:33:49.664525 master-0 kubenswrapper[3976]: I0320 08:33:49.664454 3976 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 20 08:33:49.665610 master-0 kubenswrapper[3976]: E0320 08:33:49.665570 3976 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 20 08:33:49.685043 master-0 kubenswrapper[3976]: E0320 08:33:49.684989 3976 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 20 08:33:49.742139 master-0 kubenswrapper[3976]: I0320 08:33:49.742069 3976 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"]
Mar 20 08:33:49.742139 master-0 kubenswrapper[3976]: I0320 08:33:49.742158 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:33:49.743147 master-0 kubenswrapper[3976]: I0320 08:33:49.743083 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:33:49.743147 master-0 kubenswrapper[3976]: I0320 08:33:49.743112 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:33:49.743147 master-0 kubenswrapper[3976]: I0320 08:33:49.743124 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:33:49.744267 master-0 kubenswrapper[3976]: I0320 08:33:49.743259 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:33:49.744267 master-0 kubenswrapper[3976]: I0320 08:33:49.743545 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 20 08:33:49.744267 master-0 kubenswrapper[3976]: I0320 08:33:49.743609 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:33:49.744267 master-0 kubenswrapper[3976]: I0320 08:33:49.743831 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:33:49.744267 master-0 kubenswrapper[3976]: I0320 08:33:49.743846 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:33:49.744267 master-0 kubenswrapper[3976]: I0320 08:33:49.743856 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:33:49.744267 master-0 kubenswrapper[3976]: I0320 08:33:49.743944 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:33:49.744267 master-0 kubenswrapper[3976]: I0320 08:33:49.744207 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 20 08:33:49.744267 master-0 kubenswrapper[3976]: I0320 08:33:49.744280 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:33:49.745702 master-0 kubenswrapper[3976]: I0320 08:33:49.744740 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:33:49.745702 master-0 kubenswrapper[3976]: I0320 08:33:49.744763 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:33:49.745702 master-0 kubenswrapper[3976]: I0320 08:33:49.744772 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:33:49.745702 master-0 kubenswrapper[3976]: I0320 08:33:49.744794 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:33:49.745702 master-0 kubenswrapper[3976]: I0320 08:33:49.744815 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:33:49.745702 master-0 kubenswrapper[3976]: I0320 08:33:49.744827 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:33:49.745702 master-0 kubenswrapper[3976]: I0320 08:33:49.745003 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.745702 master-0 kubenswrapper[3976]: I0320 08:33:49.745038 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:33:49.745702 master-0 kubenswrapper[3976]: I0320 08:33:49.745042 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:33:49.745702 master-0 kubenswrapper[3976]: I0320 08:33:49.745109 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:33:49.745702 master-0 kubenswrapper[3976]: I0320 08:33:49.745164 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:33:49.745702 master-0 kubenswrapper[3976]: I0320 08:33:49.745225 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:33:49.746003 master-0 kubenswrapper[3976]: I0320 08:33:49.745729 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:33:49.746003 master-0 kubenswrapper[3976]: I0320 08:33:49.745766 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:33:49.746003 master-0 kubenswrapper[3976]: I0320 08:33:49.745787 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:33:49.746081 master-0 kubenswrapper[3976]: I0320 08:33:49.746000 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:33:49.746081 master-0 kubenswrapper[3976]: I0320 08:33:49.746019 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:33:49.746081 master-0 kubenswrapper[3976]: I0320 08:33:49.746035 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:33:49.746435 master-0 kubenswrapper[3976]: I0320 08:33:49.746165 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:33:49.746435 master-0 kubenswrapper[3976]: I0320 08:33:49.746378 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.746435 master-0 kubenswrapper[3976]: I0320 08:33:49.746413 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:33:49.747082 master-0 kubenswrapper[3976]: I0320 08:33:49.747053 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:33:49.747082 master-0 kubenswrapper[3976]: I0320 08:33:49.747079 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:33:49.747154 master-0 kubenswrapper[3976]: I0320 08:33:49.747086 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:33:49.747154 master-0 kubenswrapper[3976]: I0320 08:33:49.747106 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:33:49.747154 master-0 kubenswrapper[3976]: I0320 08:33:49.747091 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:33:49.747286 master-0 kubenswrapper[3976]: I0320 08:33:49.747115 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:33:49.747286 master-0 kubenswrapper[3976]: I0320 08:33:49.747257 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 20 08:33:49.747286 master-0 kubenswrapper[3976]: I0320 08:33:49.747283 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:33:49.747894 master-0 kubenswrapper[3976]: I0320 08:33:49.747855 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:33:49.747894 master-0 kubenswrapper[3976]: I0320 08:33:49.747892 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:33:49.747975 master-0 kubenswrapper[3976]: I0320 08:33:49.747904 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:33:49.764777 master-0 kubenswrapper[3976]: I0320 08:33:49.764745 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:33:49.765715 master-0 kubenswrapper[3976]: I0320 08:33:49.765673 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:33:49.765762 master-0 kubenswrapper[3976]: I0320 08:33:49.765728 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:33:49.765762 master-0 kubenswrapper[3976]: I0320 08:33:49.765741 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:33:49.765857 master-0 kubenswrapper[3976]: I0320 08:33:49.765837 3976 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 20 08:33:49.766996 master-0 kubenswrapper[3976]: E0320 08:33:49.766951 3976 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 20 08:33:49.783650 master-0 kubenswrapper[3976]: I0320 08:33:49.783604 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.783772 master-0 kubenswrapper[3976]: I0320 08:33:49.783664 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 20 08:33:49.783772 master-0 kubenswrapper[3976]: I0320 08:33:49.783707 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 20 08:33:49.783772 master-0 kubenswrapper[3976]: I0320 08:33:49.783738 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 20 08:33:49.783885 master-0 kubenswrapper[3976]: I0320 08:33:49.783847 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.783929 master-0 kubenswrapper[3976]: I0320 08:33:49.783907 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.783970 master-0 kubenswrapper[3976]: I0320 08:33:49.783932 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 20 08:33:49.783970 master-0 kubenswrapper[3976]: I0320 08:33:49.783957 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 20 08:33:49.784052 master-0 kubenswrapper[3976]: I0320 08:33:49.783975 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.784052 master-0 kubenswrapper[3976]: I0320 08:33:49.784021 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.784052 master-0 kubenswrapper[3976]: I0320 08:33:49.784041 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.784165 master-0 kubenswrapper[3976]: I0320 08:33:49.784096 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.784165 master-0 kubenswrapper[3976]: I0320 08:33:49.784150 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.784282 master-0 kubenswrapper[3976]: I0320 08:33:49.784235 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.784325 master-0 kubenswrapper[3976]: I0320 08:33:49.784293 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.784404 master-0 kubenswrapper[3976]: I0320 08:33:49.784372 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.784473 master-0 kubenswrapper[3976]: I0320 08:33:49.784409 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 20 08:33:49.885372 master-0 kubenswrapper[3976]: I0320 08:33:49.885134 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.885372 master-0 kubenswrapper[3976]: I0320 08:33:49.885236 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.885654 master-0 kubenswrapper[3976]: I0320 08:33:49.885366 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.885654 master-0 kubenswrapper[3976]: I0320 08:33:49.885437 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 20 08:33:49.885654 master-0 kubenswrapper[3976]: I0320 08:33:49.885483 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 20 08:33:49.885654 master-0 kubenswrapper[3976]: I0320 08:33:49.885483 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.885654 master-0 kubenswrapper[3976]: I0320 08:33:49.885524 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 20 08:33:49.885654 master-0 kubenswrapper[3976]: I0320 08:33:49.885593 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 20 08:33:49.885654 master-0 kubenswrapper[3976]: I0320 08:33:49.885647 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.885654 master-0 kubenswrapper[3976]: I0320 08:33:49.885630 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.885760 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.885813 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.885837 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.885854 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.885875 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.885891 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.885907 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.885931 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.885956 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.885938 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.885968 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.885996 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.886015 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.886050 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.886093 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.886090 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 20 08:33:49.886153 master-0 kubenswrapper[3976]: I0320 08:33:49.886118 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:49.887323 master-0 kubenswrapper[3976]: I0320 08:33:49.886152 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.887323 master-0 kubenswrapper[3976]: I0320 08:33:49.886157 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 20 08:33:49.887323 master-0 kubenswrapper[3976]: I0320 08:33:49.886179 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.887323 master-0 kubenswrapper[3976]: I0320 08:33:49.886209 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.887323 master-0 kubenswrapper[3976]: I0320 08:33:49.886244 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.887323 master-0 kubenswrapper[3976]: I0320 08:33:49.886263 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.887323 master-0 kubenswrapper[3976]: I0320 08:33:49.886247 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:49.967819 master-0 kubenswrapper[3976]: I0320 08:33:49.967665 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:33:49.969148 master-0 kubenswrapper[3976]: I0320 08:33:49.969083 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:33:49.969148 master-0 kubenswrapper[3976]: I0320 08:33:49.969133 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:33:49.969148 master-0 kubenswrapper[3976]: I0320 08:33:49.969151 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:33:49.969414 master-0 kubenswrapper[3976]: I0320 08:33:49.969252 3976 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 20 08:33:49.970352 master-0 kubenswrapper[3976]: E0320 08:33:49.970293 3976 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 20 08:33:50.071033 master-0 kubenswrapper[3976]: I0320 08:33:50.070941 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 20 08:33:50.075353 master-0 kubenswrapper[3976]: I0320 08:33:50.075323 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 20 08:33:50.086363 master-0 kubenswrapper[3976]: E0320 08:33:50.086296 3976 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 20 08:33:50.095558 master-0 kubenswrapper[3976]: I0320 08:33:50.095500 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:33:50.117807 master-0 kubenswrapper[3976]: I0320 08:33:50.117734 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:33:50.127925 master-0 kubenswrapper[3976]: I0320 08:33:50.127887 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 20 08:33:50.370617 master-0 kubenswrapper[3976]: I0320 08:33:50.370427 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:33:50.372224 master-0 kubenswrapper[3976]: I0320 08:33:50.372122 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:33:50.372224 master-0 kubenswrapper[3976]: I0320 08:33:50.372217 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:33:50.372224 master-0 kubenswrapper[3976]: I0320 08:33:50.372238 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:33:50.372601 master-0 kubenswrapper[3976]: I0320 08:33:50.372309 3976 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 20 08:33:50.373455 master-0 kubenswrapper[3976]: E0320 08:33:50.373404 3976 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 20 08:33:50.385057 master-0 kubenswrapper[3976]: W0320 08:33:50.384963 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 20 08:33:50.385158 master-0 kubenswrapper[3976]: E0320 08:33:50.385071 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 20 08:33:50.471515 master-0 kubenswrapper[3976]: I0320 08:33:50.471415 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 20 08:33:50.606603 master-0 kubenswrapper[3976]: W0320 08:33:50.606325 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 20 08:33:50.606603 master-0 kubenswrapper[3976]: E0320 08:33:50.606505 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 20 08:33:50.669939 master-0 kubenswrapper[3976]: W0320 08:33:50.669808 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46f265536aba6292ead501bc9b49f327.slice/crio-24e53548727bf62b2d0124119a6e8fe69dde1c3031b4a91a063f8dd58773fece WatchSource:0}: Error finding container 24e53548727bf62b2d0124119a6e8fe69dde1c3031b4a91a063f8dd58773fece: Status 404 returned error can't find the container with id 24e53548727bf62b2d0124119a6e8fe69dde1c3031b4a91a063f8dd58773fece
Mar 20 08:33:50.672375 master-0 kubenswrapper[3976]: W0320 08:33:50.672322 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd664a6d0d2a24360dee10612610f1b59.slice/crio-4c2c291535895db17dfb8bf23092bd97a1919325de4d3a5857890c5d777dcb49 WatchSource:0}: Error finding container 4c2c291535895db17dfb8bf23092bd97a1919325de4d3a5857890c5d777dcb49: Status 404 returned error can't find the container with id 4c2c291535895db17dfb8bf23092bd97a1919325de4d3a5857890c5d777dcb49
Mar 20 08:33:50.675976 master-0 kubenswrapper[3976]: I0320 08:33:50.675905 3976 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 08:33:50.686756 master-0 kubenswrapper[3976]: W0320 08:33:50.686708 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83737980b9ee109184b1d78e942cf36.slice/crio-bfb1d352b3974d283e9d723de75463f8ee764b61d5e2eec1ee72f20516314a15 WatchSource:0}: Error finding container bfb1d352b3974d283e9d723de75463f8ee764b61d5e2eec1ee72f20516314a15: Status 404 returned error can't find the container with id bfb1d352b3974d283e9d723de75463f8ee764b61d5e2eec1ee72f20516314a15
Mar 20 08:33:50.699919 master-0 kubenswrapper[3976]: W0320 08:33:50.699855 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49fac1b46a11e49501805e891baae4a9.slice/crio-fa276789b940e2d115bb484a83b6dacabc102c332a9c60e259aaf653e5bc53d5 WatchSource:0}: Error finding container fa276789b940e2d115bb484a83b6dacabc102c332a9c60e259aaf653e5bc53d5: Status 404 returned error can't find the container with id fa276789b940e2d115bb484a83b6dacabc102c332a9c60e259aaf653e5bc53d5
Mar 20 08:33:50.842358 master-0 kubenswrapper[3976]: W0320 08:33:50.842177 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect:
connection refused Mar 20 08:33:50.842358 master-0 kubenswrapper[3976]: E0320 08:33:50.842366 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:50.887829 master-0 kubenswrapper[3976]: E0320 08:33:50.887719 3976 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 20 08:33:50.897201 master-0 kubenswrapper[3976]: W0320 08:33:50.897071 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:50.897346 master-0 kubenswrapper[3976]: E0320 08:33:50.897213 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:51.174638 master-0 kubenswrapper[3976]: I0320 08:33:51.174535 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:33:51.176167 master-0 kubenswrapper[3976]: I0320 08:33:51.176107 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:51.176445 master-0 kubenswrapper[3976]: I0320 08:33:51.176335 3976 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:51.176612 master-0 kubenswrapper[3976]: I0320 08:33:51.176572 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:33:51.176861 master-0 kubenswrapper[3976]: I0320 08:33:51.176824 3976 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 20 08:33:51.178251 master-0 kubenswrapper[3976]: E0320 08:33:51.178167 3976 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 20 08:33:51.436399 master-0 kubenswrapper[3976]: I0320 08:33:51.436252 3976 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 08:33:51.437867 master-0 kubenswrapper[3976]: E0320 08:33:51.437820 3976 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:51.471325 master-0 kubenswrapper[3976]: I0320 08:33:51.471273 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:51.651275 master-0 kubenswrapper[3976]: I0320 08:33:51.650712 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"91607e86857f069937668c70d7500f8b27325b07177f03b7a7ab831b3600ac2a"} Mar 20 08:33:51.652441 master-0 kubenswrapper[3976]: I0320 08:33:51.652350 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"fa276789b940e2d115bb484a83b6dacabc102c332a9c60e259aaf653e5bc53d5"} Mar 20 08:33:51.654801 master-0 kubenswrapper[3976]: I0320 08:33:51.654767 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"bfb1d352b3974d283e9d723de75463f8ee764b61d5e2eec1ee72f20516314a15"} Mar 20 08:33:51.656488 master-0 kubenswrapper[3976]: I0320 08:33:51.656448 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"4c2c291535895db17dfb8bf23092bd97a1919325de4d3a5857890c5d777dcb49"} Mar 20 08:33:51.657974 master-0 kubenswrapper[3976]: I0320 08:33:51.657905 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"24e53548727bf62b2d0124119a6e8fe69dde1c3031b4a91a063f8dd58773fece"} Mar 20 08:33:52.472411 master-0 kubenswrapper[3976]: I0320 08:33:52.471845 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:52.489130 master-0 kubenswrapper[3976]: E0320 08:33:52.489036 3976 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 20 08:33:52.529572 master-0 kubenswrapper[3976]: E0320 08:33:52.529369 3976 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189e7fa03f53b38f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.464499087 +0000 UTC m=+0.733322404,LastTimestamp:2026-03-20 08:33:49.464499087 +0000 UTC m=+0.733322404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:33:52.546361 master-0 kubenswrapper[3976]: W0320 08:33:52.546268 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:52.546452 master-0 kubenswrapper[3976]: E0320 08:33:52.546368 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:52.663685 master-0 kubenswrapper[3976]: I0320 08:33:52.662969 3976 generic.go:334] "Generic (PLEG): 
container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="9f89ba02fad31edac59462f307fd61765f540c624fde9c9ab1bb13b80a642b0c" exitCode=0 Mar 20 08:33:52.663685 master-0 kubenswrapper[3976]: I0320 08:33:52.663046 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"9f89ba02fad31edac59462f307fd61765f540c624fde9c9ab1bb13b80a642b0c"} Mar 20 08:33:52.663685 master-0 kubenswrapper[3976]: I0320 08:33:52.663119 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:33:52.664109 master-0 kubenswrapper[3976]: I0320 08:33:52.664062 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:52.664109 master-0 kubenswrapper[3976]: I0320 08:33:52.664111 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:52.664234 master-0 kubenswrapper[3976]: I0320 08:33:52.664128 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:33:52.778479 master-0 kubenswrapper[3976]: I0320 08:33:52.778352 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:33:52.779492 master-0 kubenswrapper[3976]: I0320 08:33:52.779451 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:52.779492 master-0 kubenswrapper[3976]: I0320 08:33:52.779491 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:52.779579 master-0 kubenswrapper[3976]: I0320 08:33:52.779508 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 
08:33:52.779618 master-0 kubenswrapper[3976]: I0320 08:33:52.779592 3976 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 20 08:33:52.780754 master-0 kubenswrapper[3976]: E0320 08:33:52.780700 3976 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 20 08:33:53.176043 master-0 kubenswrapper[3976]: W0320 08:33:53.175882 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:53.176043 master-0 kubenswrapper[3976]: E0320 08:33:53.175963 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:53.272330 master-0 kubenswrapper[3976]: W0320 08:33:53.272174 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:53.272418 master-0 kubenswrapper[3976]: E0320 08:33:53.272348 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 
08:33:53.471652 master-0 kubenswrapper[3976]: I0320 08:33:53.471603 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:53.471837 master-0 kubenswrapper[3976]: W0320 08:33:53.471757 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:53.471918 master-0 kubenswrapper[3976]: E0320 08:33:53.471859 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:53.667048 master-0 kubenswrapper[3976]: I0320 08:33:53.667007 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/0.log" Mar 20 08:33:53.668395 master-0 kubenswrapper[3976]: I0320 08:33:53.668342 3976 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="a60ec6deeba27ead91a166fc89c2fe4bb87dba9e4662161bb82848ecc564f59f" exitCode=1 Mar 20 08:33:53.668467 master-0 kubenswrapper[3976]: I0320 08:33:53.668424 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:33:53.668515 master-0 kubenswrapper[3976]: I0320 08:33:53.668419 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"a60ec6deeba27ead91a166fc89c2fe4bb87dba9e4662161bb82848ecc564f59f"} Mar 20 08:33:53.669566 master-0 kubenswrapper[3976]: I0320 08:33:53.669543 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:53.669566 master-0 kubenswrapper[3976]: I0320 08:33:53.669569 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:53.669686 master-0 kubenswrapper[3976]: I0320 08:33:53.669579 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:33:53.669926 master-0 kubenswrapper[3976]: I0320 08:33:53.669899 3976 scope.go:117] "RemoveContainer" containerID="a60ec6deeba27ead91a166fc89c2fe4bb87dba9e4662161bb82848ecc564f59f" Mar 20 08:33:53.671219 master-0 kubenswrapper[3976]: I0320 08:33:53.671123 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"6a321a6b1056a105feaf6a80afaf577bd08c544abd875f2fd35678bf25e3bfcd"} Mar 20 08:33:54.472848 master-0 kubenswrapper[3976]: I0320 08:33:54.472764 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:54.675195 master-0 kubenswrapper[3976]: I0320 08:33:54.675058 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"ac50f53bc5c085a17ffb0495207fc3be5a9d394ca5e56961969c0592f7f25b0d"} Mar 20 08:33:54.675195 master-0 kubenswrapper[3976]: I0320 08:33:54.675156 3976 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 20 08:33:54.676865 master-0 kubenswrapper[3976]: I0320 08:33:54.676089 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:54.676865 master-0 kubenswrapper[3976]: I0320 08:33:54.676134 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:54.676865 master-0 kubenswrapper[3976]: I0320 08:33:54.676144 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:33:54.677076 master-0 kubenswrapper[3976]: I0320 08:33:54.677053 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log" Mar 20 08:33:54.677662 master-0 kubenswrapper[3976]: I0320 08:33:54.677639 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/0.log" Mar 20 08:33:54.678942 master-0 kubenswrapper[3976]: I0320 08:33:54.678380 3976 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="329c7493bd191b904981534930f30c5936c95ceff1c65de77e4e5848875d1489" exitCode=1 Mar 20 08:33:54.678942 master-0 kubenswrapper[3976]: I0320 08:33:54.678413 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"329c7493bd191b904981534930f30c5936c95ceff1c65de77e4e5848875d1489"} Mar 20 08:33:54.678942 master-0 kubenswrapper[3976]: I0320 08:33:54.678439 3976 scope.go:117] "RemoveContainer" containerID="a60ec6deeba27ead91a166fc89c2fe4bb87dba9e4662161bb82848ecc564f59f" Mar 20 08:33:54.678942 master-0 
kubenswrapper[3976]: I0320 08:33:54.678491 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:33:54.679659 master-0 kubenswrapper[3976]: I0320 08:33:54.679585 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:54.679659 master-0 kubenswrapper[3976]: I0320 08:33:54.679618 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:54.679659 master-0 kubenswrapper[3976]: I0320 08:33:54.679630 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:33:54.680608 master-0 kubenswrapper[3976]: I0320 08:33:54.680202 3976 scope.go:117] "RemoveContainer" containerID="329c7493bd191b904981534930f30c5936c95ceff1c65de77e4e5848875d1489" Mar 20 08:33:54.680608 master-0 kubenswrapper[3976]: E0320 08:33:54.680527 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19" Mar 20 08:33:55.472325 master-0 kubenswrapper[3976]: I0320 08:33:55.472155 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:55.682391 master-0 kubenswrapper[3976]: I0320 08:33:55.682342 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log" Mar 20 
08:33:55.683245 master-0 kubenswrapper[3976]: I0320 08:33:55.683172 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:33:55.683620 master-0 kubenswrapper[3976]: I0320 08:33:55.683171 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:33:55.684037 master-0 kubenswrapper[3976]: I0320 08:33:55.683980 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:55.684037 master-0 kubenswrapper[3976]: I0320 08:33:55.684021 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:55.684037 master-0 kubenswrapper[3976]: I0320 08:33:55.684033 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:33:55.684162 master-0 kubenswrapper[3976]: I0320 08:33:55.684077 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:55.684162 master-0 kubenswrapper[3976]: I0320 08:33:55.684138 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:55.684162 master-0 kubenswrapper[3976]: I0320 08:33:55.684157 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:33:55.684808 master-0 kubenswrapper[3976]: I0320 08:33:55.684770 3976 scope.go:117] "RemoveContainer" containerID="329c7493bd191b904981534930f30c5936c95ceff1c65de77e4e5848875d1489" Mar 20 08:33:55.685053 master-0 kubenswrapper[3976]: E0320 08:33:55.685017 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio 
pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19" Mar 20 08:33:55.690531 master-0 kubenswrapper[3976]: E0320 08:33:55.690496 3976 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 20 08:33:55.818089 master-0 kubenswrapper[3976]: I0320 08:33:55.817479 3976 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 08:33:55.822760 master-0 kubenswrapper[3976]: E0320 08:33:55.822689 3976 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:55.981092 master-0 kubenswrapper[3976]: I0320 08:33:55.981004 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:33:55.982679 master-0 kubenswrapper[3976]: I0320 08:33:55.982632 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:55.982679 master-0 kubenswrapper[3976]: I0320 08:33:55.982689 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:55.982922 master-0 kubenswrapper[3976]: I0320 08:33:55.982698 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:33:55.982922 master-0 
kubenswrapper[3976]: I0320 08:33:55.982745 3976 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 20 08:33:55.983776 master-0 kubenswrapper[3976]: E0320 08:33:55.983709 3976 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 20 08:33:56.445291 master-0 kubenswrapper[3976]: W0320 08:33:56.445133 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:56.445589 master-0 kubenswrapper[3976]: E0320 08:33:56.445296 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:56.471825 master-0 kubenswrapper[3976]: I0320 08:33:56.471756 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:57.471589 master-0 kubenswrapper[3976]: I0320 08:33:57.471478 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:57.966516 master-0 kubenswrapper[3976]: W0320 08:33:57.966408 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:57.966776 master-0 kubenswrapper[3976]: E0320 08:33:57.966534 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:33:58.471832 master-0 kubenswrapper[3976]: I0320 08:33:58.471740 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 20 08:33:58.699845 master-0 kubenswrapper[3976]: I0320 08:33:58.699763 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"8f140ecbb839c536b9c01be26f60184770878f8b44764513599810ad9344fdb7"} Mar 20 08:33:58.702538 master-0 kubenswrapper[3976]: I0320 08:33:58.702421 3976 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="81fdbea135dce13afe4433f7d61b259980b46bfdce14d456ee42556d90e1cda4" exitCode=0 Mar 20 08:33:58.702538 master-0 kubenswrapper[3976]: I0320 08:33:58.702495 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerDied","Data":"81fdbea135dce13afe4433f7d61b259980b46bfdce14d456ee42556d90e1cda4"} Mar 20 08:33:58.702682 master-0 kubenswrapper[3976]: I0320 08:33:58.702567 3976 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:33:58.703879 master-0 kubenswrapper[3976]: I0320 08:33:58.703814 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:58.703879 master-0 kubenswrapper[3976]: I0320 08:33:58.703878 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:58.704062 master-0 kubenswrapper[3976]: I0320 08:33:58.703899 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:33:58.709476 master-0 kubenswrapper[3976]: I0320 08:33:58.709414 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb"} Mar 20 08:33:58.709615 master-0 kubenswrapper[3976]: I0320 08:33:58.709583 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:33:58.710985 master-0 kubenswrapper[3976]: I0320 08:33:58.710943 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:58.711069 master-0 kubenswrapper[3976]: I0320 08:33:58.711020 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:58.711069 master-0 kubenswrapper[3976]: I0320 08:33:58.711037 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:33:58.714766 master-0 kubenswrapper[3976]: I0320 08:33:58.714709 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:33:58.716422 master-0 kubenswrapper[3976]: I0320 08:33:58.716262 3976 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:58.716422 master-0 kubenswrapper[3976]: I0320 08:33:58.716305 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:58.716422 master-0 kubenswrapper[3976]: I0320 08:33:58.716323 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:33:59.665779 master-0 kubenswrapper[3976]: E0320 08:33:59.665730 3976 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 20 08:33:59.714784 master-0 kubenswrapper[3976]: I0320 08:33:59.714735 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:33:59.715028 master-0 kubenswrapper[3976]: I0320 08:33:59.714735 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"c7ff704cef5e82a8995a139ddd4e2496d1fd9c707ed823bbd9e67f8d259c2ea7"} Mar 20 08:33:59.716284 master-0 kubenswrapper[3976]: I0320 08:33:59.715612 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:33:59.716284 master-0 kubenswrapper[3976]: I0320 08:33:59.715665 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:33:59.716284 master-0 kubenswrapper[3976]: I0320 08:33:59.715677 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:00.378223 master-0 kubenswrapper[3976]: I0320 08:34:00.376397 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:34:00.378223 master-0 kubenswrapper[3976]: W0320 08:34:00.376507 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 08:34:00.378223 master-0 kubenswrapper[3976]: E0320 08:34:00.376556 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 08:34:00.378223 master-0 kubenswrapper[3976]: W0320 08:34:00.376683 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 08:34:00.378223 master-0 kubenswrapper[3976]: E0320 08:34:00.376702 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 08:34:00.474605 master-0 kubenswrapper[3976]: I0320 08:34:00.474452 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:34:00.719213 master-0 kubenswrapper[3976]: I0320 08:34:00.719122 3976 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"9adcc83ca09a3e8a61346c1bb76c593566cc39bfca1852854fa89f14749366d6"} Mar 20 08:34:00.719213 master-0 kubenswrapper[3976]: I0320 08:34:00.719208 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:00.719947 master-0 kubenswrapper[3976]: I0320 08:34:00.719896 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:00.719947 master-0 kubenswrapper[3976]: I0320 08:34:00.719917 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:00.719947 master-0 kubenswrapper[3976]: I0320 08:34:00.719926 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:01.488425 master-0 kubenswrapper[3976]: I0320 08:34:01.486082 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:34:01.724878 master-0 kubenswrapper[3976]: I0320 08:34:01.724720 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"21ccef18afe96346c593d227394cf1225a9a87bf9c404fb2038be61860ddf492"} Mar 20 08:34:01.724878 master-0 kubenswrapper[3976]: I0320 08:34:01.724814 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:01.725638 master-0 kubenswrapper[3976]: I0320 08:34:01.724819 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 20 08:34:01.725917 master-0 kubenswrapper[3976]: I0320 08:34:01.725874 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:01.725917 master-0 kubenswrapper[3976]: I0320 08:34:01.725914 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:01.726024 master-0 kubenswrapper[3976]: I0320 08:34:01.725927 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:01.726024 master-0 kubenswrapper[3976]: I0320 08:34:01.726005 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:01.726103 master-0 kubenswrapper[3976]: I0320 08:34:01.726041 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:01.726103 master-0 kubenswrapper[3976]: I0320 08:34:01.726055 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:02.099456 master-0 kubenswrapper[3976]: E0320 08:34:02.099301 3976 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 08:34:02.384065 master-0 kubenswrapper[3976]: I0320 08:34:02.383856 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:02.385222 master-0 kubenswrapper[3976]: I0320 08:34:02.385138 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:02.385222 master-0 kubenswrapper[3976]: I0320 08:34:02.385178 3976 kubelet_node_status.go:724] "Recording event message 
for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:02.385222 master-0 kubenswrapper[3976]: I0320 08:34:02.385211 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:02.385473 master-0 kubenswrapper[3976]: I0320 08:34:02.385267 3976 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 20 08:34:02.393387 master-0 kubenswrapper[3976]: E0320 08:34:02.393342 3976 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 20 08:34:02.475346 master-0 kubenswrapper[3976]: I0320 08:34:02.475268 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:34:02.537702 master-0 kubenswrapper[3976]: E0320 08:34:02.537512 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa03f53b38f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.464499087 +0000 UTC m=+0.733322404,LastTimestamp:2026-03-20 08:33:49.464499087 +0000 UTC m=+0.733322404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.543041 master-0 kubenswrapper[3976]: E0320 08:34:02.542863 3976 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac2f47 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520629575 +0000 UTC m=+0.789452872,LastTimestamp:2026-03-20 08:33:49.520629575 +0000 UTC m=+0.789452872,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.550382 master-0 kubenswrapper[3976]: E0320 08:34:02.550257 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac999b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520656795 +0000 UTC m=+0.789480092,LastTimestamp:2026-03-20 08:33:49.520656795 +0000 UTC m=+0.789480092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.555904 master-0 kubenswrapper[3976]: E0320 08:34:02.555701 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042acd3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520671695 +0000 UTC m=+0.789494992,LastTimestamp:2026-03-20 08:33:49.520671695 +0000 UTC m=+0.789494992,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.561573 master-0 kubenswrapper[3976]: E0320 08:34:02.561372 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa04b61aef1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.666742001 +0000 UTC m=+0.935565288,LastTimestamp:2026-03-20 08:33:49.666742001 +0000 UTC m=+0.935565288,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.568224 master-0 kubenswrapper[3976]: E0320 08:34:02.567949 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac2f47\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac2f47 default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520629575 +0000 UTC m=+0.789452872,LastTimestamp:2026-03-20 08:33:49.743099738 +0000 UTC m=+1.011923025,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.574417 master-0 kubenswrapper[3976]: E0320 08:34:02.574271 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac999b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac999b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520656795 +0000 UTC m=+0.789480092,LastTimestamp:2026-03-20 08:33:49.743118757 +0000 UTC m=+1.011942034,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.580475 master-0 kubenswrapper[3976]: E0320 08:34:02.580323 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042acd3cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042acd3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520671695 +0000 UTC m=+0.789494992,LastTimestamp:2026-03-20 08:33:49.743129487 +0000 UTC m=+1.011952774,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.586413 master-0 kubenswrapper[3976]: E0320 08:34:02.586156 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac2f47\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac2f47 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520629575 +0000 UTC m=+0.789452872,LastTimestamp:2026-03-20 08:33:49.743841977 +0000 UTC m=+1.012665264,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.591121 master-0 kubenswrapper[3976]: E0320 08:34:02.590937 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac999b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac999b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520656795 +0000 UTC m=+0.789480092,LastTimestamp:2026-03-20 08:33:49.743851807 +0000 UTC m=+1.012675084,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.595626 master-0 kubenswrapper[3976]: E0320 08:34:02.595459 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042acd3cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042acd3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520671695 +0000 UTC m=+0.789494992,LastTimestamp:2026-03-20 08:33:49.743861357 +0000 UTC m=+1.012684644,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.600389 master-0 kubenswrapper[3976]: E0320 08:34:02.600261 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac2f47\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac2f47 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520629575 +0000 UTC m=+0.789452872,LastTimestamp:2026-03-20 08:33:49.744756504 +0000 UTC m=+1.013579791,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.605127 master-0 kubenswrapper[3976]: E0320 08:34:02.604897 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac999b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac999b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520656795 +0000 UTC m=+0.789480092,LastTimestamp:2026-03-20 08:33:49.744769074 +0000 UTC m=+1.013592361,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.610854 master-0 kubenswrapper[3976]: E0320 08:34:02.610498 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042acd3cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042acd3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520671695 +0000 UTC m=+0.789494992,LastTimestamp:2026-03-20 08:33:49.744778044 +0000 UTC m=+1.013601331,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.615634 master-0 kubenswrapper[3976]: E0320 08:34:02.615513 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac2f47\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac2f47 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520629575 +0000 UTC m=+0.789452872,LastTimestamp:2026-03-20 08:33:49.744807434 +0000 UTC m=+1.013630721,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.621420 master-0 kubenswrapper[3976]: E0320 08:34:02.621277 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac999b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac999b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520656795 +0000 UTC m=+0.789480092,LastTimestamp:2026-03-20 08:33:49.744821584 +0000 UTC m=+1.013644871,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.626487 master-0 kubenswrapper[3976]: E0320 08:34:02.626411 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042acd3cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042acd3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520671695 +0000 UTC m=+0.789494992,LastTimestamp:2026-03-20 08:33:49.744832773 +0000 UTC m=+1.013656060,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.630795 master-0 kubenswrapper[3976]: E0320 08:34:02.630658 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac2f47\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac2f47 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520629575 +0000 UTC m=+0.789452872,LastTimestamp:2026-03-20 08:33:49.745134949 +0000 UTC m=+1.013958276,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.634914 master-0 kubenswrapper[3976]: E0320 08:34:02.634729 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac999b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac999b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520656795 +0000 UTC m=+0.789480092,LastTimestamp:2026-03-20 08:33:49.745178259 +0000 UTC m=+1.014001586,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.639924 master-0 kubenswrapper[3976]: E0320 08:34:02.639829 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042acd3cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042acd3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520671695 +0000 UTC m=+0.789494992,LastTimestamp:2026-03-20 08:33:49.745239858 +0000 UTC m=+1.014063185,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.644984 master-0 kubenswrapper[3976]: E0320 08:34:02.644849 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac2f47\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac2f47 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520629575 +0000 UTC m=+0.789452872,LastTimestamp:2026-03-20 08:33:49.745751411 +0000 UTC m=+1.014574738,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.651302 master-0 kubenswrapper[3976]: E0320 08:34:02.651217 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac999b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac999b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520656795 +0000 UTC m=+0.789480092,LastTimestamp:2026-03-20 08:33:49.74577857 +0000 UTC m=+1.014601897,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.656443 master-0 kubenswrapper[3976]: E0320 08:34:02.656305 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042acd3cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042acd3cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520671695 +0000 UTC m=+0.789494992,LastTimestamp:2026-03-20 08:33:49.74579932 +0000 UTC m=+1.014622647,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.661222 master-0 kubenswrapper[3976]: E0320 08:34:02.661044 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac2f47\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac2f47 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520629575 +0000 UTC m=+0.789452872,LastTimestamp:2026-03-20 08:33:49.746013097 +0000 UTC m=+1.014836414,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.667178 master-0 kubenswrapper[3976]: E0320 08:34:02.666998 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e7fa042ac999b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e7fa042ac999b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:49.520656795 +0000 UTC m=+0.789480092,LastTimestamp:2026-03-20 08:33:49.746029187 +0000 UTC m=+1.014852514,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.673032 master-0 kubenswrapper[3976]: E0320 08:34:02.672827 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e7fa08786da22 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:50.67581085 +0000 UTC m=+1.944634157,LastTimestamp:2026-03-20 08:33:50.67581085 +0000 UTC m=+1.944634157,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.678866 master-0 kubenswrapper[3976]: E0320 08:34:02.678688 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e7fa087909739 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:50.676449081 +0000 UTC m=+1.945272388,LastTimestamp:2026-03-20 08:33:50.676449081 +0000 UTC m=+1.945272388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.684040 master-0 kubenswrapper[3976]: E0320 08:34:02.683868 3976 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e7fa0885fad68 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:50.690020712 +0000 UTC m=+1.958843999,LastTimestamp:2026-03-20 08:33:50.690020712 +0000 UTC m=+1.958843999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.688271 master-0 kubenswrapper[3976]: E0320 08:34:02.688069 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e7fa08959ff54 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:50.706425684 +0000 UTC m=+1.975248981,LastTimestamp:2026-03-20 08:33:50.706425684 +0000 UTC 
m=+1.975248981,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.698403 master-0 kubenswrapper[3976]: E0320 08:34:02.697828 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa08a354c8a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:50.720797834 +0000 UTC m=+1.989621131,LastTimestamp:2026-03-20 08:33:50.720797834 +0000 UTC m=+1.989621131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.706588 master-0 kubenswrapper[3976]: E0320 08:34:02.706442 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa0dc0e336c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" in 1.373s (1.373s including waiting). Image size: 465090934 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:52.093967212 +0000 UTC m=+3.362790499,LastTimestamp:2026-03-20 08:33:52.093967212 +0000 UTC m=+3.362790499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.727035 master-0 kubenswrapper[3976]: I0320 08:34:02.726999 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:02.728888 master-0 kubenswrapper[3976]: I0320 08:34:02.728872 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:02.728973 master-0 kubenswrapper[3976]: I0320 08:34:02.728963 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:02.729030 master-0 kubenswrapper[3976]: I0320 08:34:02.729021 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:02.733319 master-0 kubenswrapper[3976]: E0320 08:34:02.730793 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa0e9728ab9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:52.318646969 +0000 UTC m=+3.587470266,LastTimestamp:2026-03-20 08:33:52.318646969 +0000 UTC m=+3.587470266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.738283 master-0 kubenswrapper[3976]: E0320 08:34:02.738128 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa0ea5ccc7e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:52.33399923 +0000 UTC m=+3.602822527,LastTimestamp:2026-03-20 08:33:52.33399923 +0000 UTC m=+3.602822527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.747381 master-0 kubenswrapper[3976]: E0320 08:34:02.747218 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa11f17df15 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.218674453 +0000 UTC m=+4.487497750,LastTimestamp:2026-03-20 08:33:53.218674453 +0000 UTC m=+4.487497750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.752791 master-0 kubenswrapper[3976]: E0320 08:34:02.752659 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e7fa1222c3e33 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\" in 2.593s (2.593s including waiting). 
Image size: 529326739 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.270341171 +0000 UTC m=+4.539164458,LastTimestamp:2026-03-20 08:33:53.270341171 +0000 UTC m=+4.539164458,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.758552 master-0 kubenswrapper[3976]: E0320 08:34:02.758407 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa12d3b75f2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.455887858 +0000 UTC m=+4.724711145,LastTimestamp:2026-03-20 08:33:53.455887858 +0000 UTC m=+4.724711145,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.763785 master-0 kubenswrapper[3976]: E0320 08:34:02.763666 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e7fa12d4531a9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.456525737 +0000 UTC m=+4.725349024,LastTimestamp:2026-03-20 08:33:53.456525737 +0000 UTC m=+4.725349024,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.768400 master-0 kubenswrapper[3976]: E0320 08:34:02.768248 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e7fa12e058bfa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.46913177 +0000 UTC m=+4.737955057,LastTimestamp:2026-03-20 08:33:53.46913177 +0000 UTC m=+4.737955057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.773093 master-0 kubenswrapper[3976]: E0320 08:34:02.772901 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e7fa12e37e1e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.472430561 +0000 UTC m=+4.741253848,LastTimestamp:2026-03-20 08:33:53.472430561 +0000 UTC m=+4.741253848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.778906 master-0 kubenswrapper[3976]: E0320 08:34:02.778660 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa12ee93228 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.484050984 +0000 UTC m=+4.752874271,LastTimestamp:2026-03-20 08:33:53.484050984 +0000 UTC m=+4.752874271,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.788767 master-0 kubenswrapper[3976]: E0320 08:34:02.788627 3976 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-rbac-proxy-crio-master-0.189e7fa11f17df15\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa11f17df15 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.218674453 +0000 UTC m=+4.487497750,LastTimestamp:2026-03-20 08:33:53.673695396 +0000 UTC m=+4.942518683,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.792805 master-0 kubenswrapper[3976]: E0320 08:34:02.792683 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e7fa13b3d8ec6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.69090631 +0000 UTC m=+4.959729597,LastTimestamp:2026-03-20 08:33:53.69090631 +0000 UTC 
m=+4.959729597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.798006 master-0 kubenswrapper[3976]: E0320 08:34:02.797839 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e7fa13ca3e4d9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.714390233 +0000 UTC m=+4.983213530,LastTimestamp:2026-03-20 08:33:53.714390233 +0000 UTC m=+4.983213530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.802155 master-0 kubenswrapper[3976]: E0320 08:34:02.802026 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e7fa12d3b75f2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa12d3b75f2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: 
kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.455887858 +0000 UTC m=+4.724711145,LastTimestamp:2026-03-20 08:33:53.889264557 +0000 UTC m=+5.158087834,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.806367 master-0 kubenswrapper[3976]: E0320 08:34:02.806265 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e7fa12ee93228\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa12ee93228 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.484050984 +0000 UTC m=+4.752874271,LastTimestamp:2026-03-20 08:33:53.902669835 +0000 UTC m=+5.171493112,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.812508 master-0 kubenswrapper[3976]: E0320 08:34:02.812428 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa17638d334 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:54.680451892 +0000 UTC m=+5.949275179,LastTimestamp:2026-03-20 08:33:54.680451892 +0000 UTC m=+5.949275179,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.817527 master-0 kubenswrapper[3976]: E0320 08:34:02.817412 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e7fa17638d334\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa17638d334 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:54.680451892 +0000 UTC m=+5.949275179,LastTimestamp:2026-03-20 08:33:55.684969663 +0000 UTC m=+6.953792980,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.823657 master-0 kubenswrapper[3976]: E0320 08:34:02.823162 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e7fa23428477d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 7.191s (7.191s including waiting). Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:57.867038589 +0000 UTC m=+9.135861876,LastTimestamp:2026-03-20 08:33:57.867038589 +0000 UTC m=+9.135861876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.828429 master-0 kubenswrapper[3976]: E0320 08:34:02.828321 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e7fa234a87207 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 7.168s (7.168s including waiting). Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:57.875438087 +0000 UTC m=+9.144261364,LastTimestamp:2026-03-20 08:33:57.875438087 +0000 UTC m=+9.144261364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.833136 master-0 kubenswrapper[3976]: E0320 08:34:02.832985 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e7fa236ad3a47 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 7.219s (7.219s including waiting). 
Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:57.909305927 +0000 UTC m=+9.178129214,LastTimestamp:2026-03-20 08:33:57.909305927 +0000 UTC m=+9.178129214,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.837350 master-0 kubenswrapper[3976]: E0320 08:34:02.837226 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e7fa23e293645 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:58.034871877 +0000 UTC m=+9.303695164,LastTimestamp:2026-03-20 08:33:58.034871877 +0000 UTC m=+9.303695164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.841466 master-0 kubenswrapper[3976]: E0320 08:34:02.841384 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e7fa23eb71495 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:58.044169365 +0000 UTC m=+9.312992652,LastTimestamp:2026-03-20 08:33:58.044169365 +0000 UTC m=+9.312992652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.845415 master-0 kubenswrapper[3976]: E0320 08:34:02.845259 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e7fa23ed1b5bb kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:58.045914555 +0000 UTC m=+9.314737842,LastTimestamp:2026-03-20 08:33:58.045914555 +0000 UTC m=+9.314737842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.849566 master-0 kubenswrapper[3976]: E0320 08:34:02.849499 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e7fa23f09521e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:58.04955907 +0000 UTC m=+9.318382357,LastTimestamp:2026-03-20 08:33:58.04955907 +0000 UTC m=+9.318382357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.853355 master-0 kubenswrapper[3976]: E0320 08:34:02.853207 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e7fa23f94a8c1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:58.058690753 +0000 UTC m=+9.327514040,LastTimestamp:2026-03-20 08:33:58.058690753 +0000 UTC m=+9.327514040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.857166 master-0 kubenswrapper[3976]: E0320 08:34:02.857065 3976 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e7fa23fb9de8c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:58.061129356 +0000 UTC m=+9.329952643,LastTimestamp:2026-03-20 08:33:58.061129356 +0000 UTC m=+9.329952643,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.861467 master-0 kubenswrapper[3976]: E0320 08:34:02.861329 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e7fa2402ff0b3 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:58.068867251 +0000 UTC m=+9.337690548,LastTimestamp:2026-03-20 08:33:58.068867251 +0000 UTC m=+9.337690548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.867544 master-0 
kubenswrapper[3976]: E0320 08:34:02.867396 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e7fa266ad0e91 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:58.714601105 +0000 UTC m=+9.983424432,LastTimestamp:2026-03-20 08:33:58.714601105 +0000 UTC m=+9.983424432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.872677 master-0 kubenswrapper[3976]: E0320 08:34:02.872602 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e7fa27352221c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:58.92674614 +0000 UTC 
m=+10.195569427,LastTimestamp:2026-03-20 08:33:58.92674614 +0000 UTC m=+10.195569427,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.876277 master-0 kubenswrapper[3976]: E0320 08:34:02.876134 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e7fa273e3a365 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:58.936281957 +0000 UTC m=+10.205105244,LastTimestamp:2026-03-20 08:33:58.936281957 +0000 UTC m=+10.205105244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.880094 master-0 kubenswrapper[3976]: E0320 08:34:02.880010 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e7fa273f4dff8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:58.937411576 +0000 UTC m=+10.206234863,LastTimestamp:2026-03-20 08:33:58.937411576 +0000 UTC m=+10.206234863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.884756 master-0 kubenswrapper[3976]: E0320 08:34:02.884666 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e7fa2bb5074f1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\" in 2.088s (2.088s including waiting). 
Image size: 505246690 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:34:00.134595825 +0000 UTC m=+11.403419112,LastTimestamp:2026-03-20 08:34:00.134595825 +0000 UTC m=+11.403419112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.889937 master-0 kubenswrapper[3976]: E0320 08:34:02.889676 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e7fa2c711ec8c kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:34:00.331824268 +0000 UTC m=+11.600647555,LastTimestamp:2026-03-20 08:34:00.331824268 +0000 UTC m=+11.600647555,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.893552 master-0 kubenswrapper[3976]: E0320 08:34:02.893430 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e7fa2c7aae8ef kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:34:00.341850351 +0000 UTC m=+11.610673638,LastTimestamp:2026-03-20 08:34:00.341850351 +0000 UTC m=+11.610673638,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.897734 master-0 kubenswrapper[3976]: E0320 08:34:02.897608 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e7fa3020f7a0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" in 2.384s (2.384s including waiting). 
Image size: 514984269 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:34:01.321519631 +0000 UTC m=+12.590342918,LastTimestamp:2026-03-20 08:34:01.321519631 +0000 UTC m=+12.590342918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.901595 master-0 kubenswrapper[3976]: E0320 08:34:02.901446 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e7fa310fba9cf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:34:01.571879375 +0000 UTC m=+12.840702662,LastTimestamp:2026-03-20 08:34:01.571879375 +0000 UTC m=+12.840702662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:02.905624 master-0 kubenswrapper[3976]: E0320 08:34:02.905486 3976 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e7fa31178779d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:34:01.580058525 +0000 UTC m=+12.848881812,LastTimestamp:2026-03-20 08:34:01.580058525 +0000 UTC m=+12.848881812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:03.147930 master-0 kubenswrapper[3976]: I0320 08:34:03.147714 3976 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:34:03.148176 master-0 kubenswrapper[3976]: I0320 08:34:03.148030 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:03.149629 master-0 kubenswrapper[3976]: I0320 08:34:03.149526 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:03.149629 master-0 kubenswrapper[3976]: I0320 08:34:03.149580 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:03.149629 master-0 kubenswrapper[3976]: I0320 08:34:03.149598 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:03.152919 master-0 kubenswrapper[3976]: I0320 08:34:03.152867 3976 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:34:03.173344 master-0 kubenswrapper[3976]: I0320 08:34:03.173290 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:34:03.479532 master-0 kubenswrapper[3976]: I0320 08:34:03.479377 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:34:03.729470 master-0 kubenswrapper[3976]: I0320 08:34:03.729426 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:03.730562 master-0 kubenswrapper[3976]: I0320 08:34:03.729425 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:03.730660 master-0 kubenswrapper[3976]: I0320 08:34:03.729478 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:34:03.732459 master-0 kubenswrapper[3976]: I0320 08:34:03.732397 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:03.732590 master-0 kubenswrapper[3976]: I0320 08:34:03.732473 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:03.732590 master-0 kubenswrapper[3976]: I0320 08:34:03.732505 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:03.733690 master-0 kubenswrapper[3976]: I0320 08:34:03.733606 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:03.734985 master-0 kubenswrapper[3976]: I0320 08:34:03.734918 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:03.735086 master-0 kubenswrapper[3976]: I0320 08:34:03.734992 3976 kubelet_node_status.go:724] 
"Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:03.893415 master-0 kubenswrapper[3976]: I0320 08:34:03.893340 3976 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 08:34:04.044512 master-0 kubenswrapper[3976]: I0320 08:34:04.044443 3976 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 08:34:04.475933 master-0 kubenswrapper[3976]: I0320 08:34:04.475764 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:34:04.732456 master-0 kubenswrapper[3976]: I0320 08:34:04.732272 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:04.733275 master-0 kubenswrapper[3976]: I0320 08:34:04.732951 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:04.733275 master-0 kubenswrapper[3976]: I0320 08:34:04.732974 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:04.733275 master-0 kubenswrapper[3976]: I0320 08:34:04.732984 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:05.242547 master-0 kubenswrapper[3976]: I0320 08:34:05.242473 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:34:05.477088 master-0 kubenswrapper[3976]: I0320 08:34:05.476980 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" 
in API group "storage.k8s.io" at the cluster scope Mar 20 08:34:05.734700 master-0 kubenswrapper[3976]: I0320 08:34:05.734565 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:05.736135 master-0 kubenswrapper[3976]: I0320 08:34:05.736073 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:05.736219 master-0 kubenswrapper[3976]: I0320 08:34:05.736140 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:05.736219 master-0 kubenswrapper[3976]: I0320 08:34:05.736155 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:05.936581 master-0 kubenswrapper[3976]: W0320 08:34:05.936454 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 08:34:05.936581 master-0 kubenswrapper[3976]: E0320 08:34:05.936532 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 08:34:05.956246 master-0 kubenswrapper[3976]: W0320 08:34:05.956167 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 08:34:05.956246 master-0 kubenswrapper[3976]: E0320 08:34:05.956237 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User 
\"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 08:34:06.396673 master-0 kubenswrapper[3976]: I0320 08:34:06.396601 3976 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:34:06.397016 master-0 kubenswrapper[3976]: I0320 08:34:06.396837 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:06.398436 master-0 kubenswrapper[3976]: I0320 08:34:06.398358 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:06.398491 master-0 kubenswrapper[3976]: I0320 08:34:06.398454 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:06.398491 master-0 kubenswrapper[3976]: I0320 08:34:06.398475 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:06.403852 master-0 kubenswrapper[3976]: I0320 08:34:06.403819 3976 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:34:06.478214 master-0 kubenswrapper[3976]: I0320 08:34:06.478126 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:34:06.737486 master-0 kubenswrapper[3976]: I0320 08:34:06.737344 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:06.738384 master-0 kubenswrapper[3976]: I0320 08:34:06.738348 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 
08:34:06.738442 master-0 kubenswrapper[3976]: I0320 08:34:06.738400 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:06.738442 master-0 kubenswrapper[3976]: I0320 08:34:06.738416 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:06.743488 master-0 kubenswrapper[3976]: I0320 08:34:06.743467 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:34:07.476085 master-0 kubenswrapper[3976]: I0320 08:34:07.476035 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:34:07.642266 master-0 kubenswrapper[3976]: I0320 08:34:07.642205 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:07.643700 master-0 kubenswrapper[3976]: I0320 08:34:07.643658 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:07.643786 master-0 kubenswrapper[3976]: I0320 08:34:07.643711 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:07.643786 master-0 kubenswrapper[3976]: I0320 08:34:07.643735 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:07.644456 master-0 kubenswrapper[3976]: I0320 08:34:07.644435 3976 scope.go:117] "RemoveContainer" containerID="329c7493bd191b904981534930f30c5936c95ceff1c65de77e4e5848875d1489" Mar 20 08:34:07.653544 master-0 kubenswrapper[3976]: E0320 08:34:07.653404 3976 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-rbac-proxy-crio-master-0.189e7fa11f17df15\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa11f17df15 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.218674453 +0000 UTC m=+4.487497750,LastTimestamp:2026-03-20 08:34:07.646807977 +0000 UTC m=+18.915631264,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:07.739912 master-0 kubenswrapper[3976]: I0320 08:34:07.739780 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:07.742615 master-0 kubenswrapper[3976]: I0320 08:34:07.740981 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:07.742615 master-0 kubenswrapper[3976]: I0320 08:34:07.741014 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:07.742615 master-0 kubenswrapper[3976]: I0320 08:34:07.741023 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:07.898763 master-0 kubenswrapper[3976]: E0320 08:34:07.898112 3976 event.go:359] "Server rejected event (will not 
retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e7fa12d3b75f2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa12d3b75f2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.455887858 +0000 UTC m=+4.724711145,LastTimestamp:2026-03-20 08:34:07.891524319 +0000 UTC m=+19.160347606,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:34:07.919510 master-0 kubenswrapper[3976]: E0320 08:34:07.919353 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e7fa12ee93228\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa12ee93228 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:53.484050984 +0000 UTC m=+4.752874271,LastTimestamp:2026-03-20 08:34:07.912927104 
+0000 UTC m=+19.181750411,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 20 08:34:08.476325 master-0 kubenswrapper[3976]: I0320 08:34:08.476171 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:08.685484 master-0 kubenswrapper[3976]: I0320 08:34:08.685389 3976 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:34:08.685804 master-0 kubenswrapper[3976]: I0320 08:34:08.685647 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:34:08.688027 master-0 kubenswrapper[3976]: I0320 08:34:08.687962 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:34:08.688027 master-0 kubenswrapper[3976]: I0320 08:34:08.688014 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:34:08.688027 master-0 kubenswrapper[3976]: I0320 08:34:08.688029 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:34:08.691145 master-0 kubenswrapper[3976]: I0320 08:34:08.691102 3976 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:34:08.744795 master-0 kubenswrapper[3976]: I0320 08:34:08.744596 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log"
Mar 20 08:34:08.746253 master-0 kubenswrapper[3976]: I0320 08:34:08.745138 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log"
Mar 20 08:34:08.746253 master-0 kubenswrapper[3976]: I0320 08:34:08.745722 3976 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="96ff83659a940b2334d29c8de766dce2f56c0eddae2926cc1d3dd2e347430a94" exitCode=1
Mar 20 08:34:08.746253 master-0 kubenswrapper[3976]: I0320 08:34:08.745775 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"96ff83659a940b2334d29c8de766dce2f56c0eddae2926cc1d3dd2e347430a94"}
Mar 20 08:34:08.746253 master-0 kubenswrapper[3976]: I0320 08:34:08.745886 3976 scope.go:117] "RemoveContainer" containerID="329c7493bd191b904981534930f30c5936c95ceff1c65de77e4e5848875d1489"
Mar 20 08:34:08.746253 master-0 kubenswrapper[3976]: I0320 08:34:08.745953 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:34:08.746253 master-0 kubenswrapper[3976]: I0320 08:34:08.746041 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:34:08.747265 master-0 kubenswrapper[3976]: I0320 08:34:08.747225 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:34:08.747265 master-0 kubenswrapper[3976]: I0320 08:34:08.747269 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:34:08.747265 master-0 kubenswrapper[3976]: I0320 08:34:08.747227 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:34:08.747265 master-0 kubenswrapper[3976]: I0320 08:34:08.747285 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:34:08.747635 master-0 kubenswrapper[3976]: I0320 08:34:08.747307 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:34:08.747635 master-0 kubenswrapper[3976]: I0320 08:34:08.747323 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:34:08.747803 master-0 kubenswrapper[3976]: I0320 08:34:08.747710 3976 scope.go:117] "RemoveContainer" containerID="96ff83659a940b2334d29c8de766dce2f56c0eddae2926cc1d3dd2e347430a94"
Mar 20 08:34:08.747963 master-0 kubenswrapper[3976]: E0320 08:34:08.747907 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19"
Mar 20 08:34:08.751219 master-0 kubenswrapper[3976]: I0320 08:34:08.751127 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:34:08.755216 master-0 kubenswrapper[3976]: E0320 08:34:08.755072 3976 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e7fa17638d334\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e7fa17638d334 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:33:54.680451892 +0000 UTC m=+5.949275179,LastTimestamp:2026-03-20 08:34:08.747865094 +0000 UTC m=+20.016688381,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 20 08:34:08.795647 master-0 kubenswrapper[3976]: I0320 08:34:08.795563 3976 csr.go:261] certificate signing request csr-hgtpf is approved, waiting to be issued
Mar 20 08:34:09.108081 master-0 kubenswrapper[3976]: E0320 08:34:09.107989 3976 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 08:34:09.393874 master-0 kubenswrapper[3976]: I0320 08:34:09.393646 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:34:09.395291 master-0 kubenswrapper[3976]: I0320 08:34:09.395230 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:34:09.395291 master-0 kubenswrapper[3976]: I0320 08:34:09.395276 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:34:09.395291 master-0 kubenswrapper[3976]: I0320 08:34:09.395291 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:34:09.395648 master-0 kubenswrapper[3976]: I0320 08:34:09.395353 3976 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 20 08:34:09.400579 master-0 kubenswrapper[3976]: E0320 08:34:09.400530 3976 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 20 08:34:09.477885 master-0 kubenswrapper[3976]: I0320 08:34:09.477758 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:09.666838 master-0 kubenswrapper[3976]: E0320 08:34:09.666641 3976 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 20 08:34:09.751071 master-0 kubenswrapper[3976]: I0320 08:34:09.750987 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log"
Mar 20 08:34:09.752363 master-0 kubenswrapper[3976]: I0320 08:34:09.752324 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:34:09.753137 master-0 kubenswrapper[3976]: I0320 08:34:09.753096 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:34:09.753137 master-0 kubenswrapper[3976]: I0320 08:34:09.753127 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:34:09.753137 master-0 kubenswrapper[3976]: I0320 08:34:09.753136 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:34:10.137540 master-0 kubenswrapper[3976]: W0320 08:34:10.137421 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 20 08:34:10.137540 master-0 kubenswrapper[3976]: E0320 08:34:10.137504 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 20 08:34:10.479324 master-0 kubenswrapper[3976]: I0320 08:34:10.478906 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:11.479786 master-0 kubenswrapper[3976]: I0320 08:34:11.479732 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:12.278598 master-0 kubenswrapper[3976]: W0320 08:34:12.278545 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:12.278907 master-0 kubenswrapper[3976]: E0320 08:34:12.278608 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 20 08:34:12.477881 master-0 kubenswrapper[3976]: I0320 08:34:12.477775 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:13.479242 master-0 kubenswrapper[3976]: I0320 08:34:13.479147 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:14.480207 master-0 kubenswrapper[3976]: I0320 08:34:14.480139 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:15.176766 master-0 kubenswrapper[3976]: I0320 08:34:15.176657 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:34:15.177145 master-0 kubenswrapper[3976]: I0320 08:34:15.176804 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:34:15.178224 master-0 kubenswrapper[3976]: I0320 08:34:15.178140 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:34:15.178224 master-0 kubenswrapper[3976]: I0320 08:34:15.178206 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:34:15.178224 master-0 kubenswrapper[3976]: I0320 08:34:15.178223 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:34:15.491886 master-0 kubenswrapper[3976]: I0320 08:34:15.476580 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:16.113701 master-0 kubenswrapper[3976]: E0320 08:34:16.113648 3976 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 08:34:16.401623 master-0 kubenswrapper[3976]: I0320 08:34:16.401442 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:34:16.402688 master-0 kubenswrapper[3976]: I0320 08:34:16.402650 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:34:16.402688 master-0 kubenswrapper[3976]: I0320 08:34:16.402680 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:34:16.402688 master-0 kubenswrapper[3976]: I0320 08:34:16.402692 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:34:16.403070 master-0 kubenswrapper[3976]: I0320 08:34:16.402740 3976 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 20 08:34:16.409625 master-0 kubenswrapper[3976]: E0320 08:34:16.409599 3976 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 20 08:34:16.478049 master-0 kubenswrapper[3976]: I0320 08:34:16.477995 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:17.476449 master-0 kubenswrapper[3976]: I0320 08:34:17.476310 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:18.479452 master-0 kubenswrapper[3976]: I0320 08:34:18.479334 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:19.475639 master-0 kubenswrapper[3976]: I0320 08:34:19.475567 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:19.667463 master-0 kubenswrapper[3976]: E0320 08:34:19.667353 3976 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 20 08:34:20.169403 master-0 kubenswrapper[3976]: W0320 08:34:20.169330 3976 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 20 08:34:20.169403 master-0 kubenswrapper[3976]: E0320 08:34:20.169405 3976 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 20 08:34:20.479945 master-0 kubenswrapper[3976]: I0320 08:34:20.479749 3976 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 08:34:20.923856 master-0 kubenswrapper[3976]: I0320 08:34:20.923766 3976 csr.go:257] certificate signing request csr-hgtpf is issued
Mar 20 08:34:21.356018 master-0 kubenswrapper[3976]: I0320 08:34:21.355936 3976 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 20 08:34:21.419070 master-0 kubenswrapper[3976]: I0320 08:34:21.418996 3976 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 20 08:34:21.478062 master-0 kubenswrapper[3976]: I0320 08:34:21.477994 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:21.494818 master-0 kubenswrapper[3976]: I0320 08:34:21.494767 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:21.562640 master-0 kubenswrapper[3976]: I0320 08:34:21.562578 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:21.826937 master-0 kubenswrapper[3976]: I0320 08:34:21.826880 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:21.826937 master-0 kubenswrapper[3976]: E0320 08:34:21.826925 3976 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 20 08:34:21.848809 master-0 kubenswrapper[3976]: I0320 08:34:21.848754 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:21.863538 master-0 kubenswrapper[3976]: I0320 08:34:21.863471 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:21.924281 master-0 kubenswrapper[3976]: I0320 08:34:21.924223 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:21.925320 master-0 kubenswrapper[3976]: I0320 08:34:21.925242 3976 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-21 08:25:35 +0000 UTC, rotation deadline is 2026-03-21 01:53:46.685784344 +0000 UTC
Mar 20 08:34:21.925320 master-0 kubenswrapper[3976]: I0320 08:34:21.925317 3976 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 17h19m24.760472029s for next certificate rotation
Mar 20 08:34:22.205133 master-0 kubenswrapper[3976]: I0320 08:34:22.204974 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:22.205133 master-0 kubenswrapper[3976]: E0320 08:34:22.205031 3976 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 20 08:34:22.308454 master-0 kubenswrapper[3976]: I0320 08:34:22.308396 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:22.323752 master-0 kubenswrapper[3976]: I0320 08:34:22.323709 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:22.381898 master-0 kubenswrapper[3976]: I0320 08:34:22.381831 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:22.642541 master-0 kubenswrapper[3976]: I0320 08:34:22.642469 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:34:22.644411 master-0 kubenswrapper[3976]: I0320 08:34:22.644358 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:34:22.644511 master-0 kubenswrapper[3976]: I0320 08:34:22.644418 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:34:22.644511 master-0 kubenswrapper[3976]: I0320 08:34:22.644432 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:34:22.644930 master-0 kubenswrapper[3976]: I0320 08:34:22.644887 3976 scope.go:117] "RemoveContainer" containerID="96ff83659a940b2334d29c8de766dce2f56c0eddae2926cc1d3dd2e347430a94"
Mar 20 08:34:22.645127 master-0 kubenswrapper[3976]: E0320 08:34:22.645081 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19"
Mar 20 08:34:22.661746 master-0 kubenswrapper[3976]: I0320 08:34:22.661703 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:22.661746 master-0 kubenswrapper[3976]: E0320 08:34:22.661742 3976 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 20 08:34:23.119723 master-0 kubenswrapper[3976]: E0320 08:34:23.119657 3976 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0"
Mar 20 08:34:23.235015 master-0 kubenswrapper[3976]: I0320 08:34:23.234940 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:23.251166 master-0 kubenswrapper[3976]: I0320 08:34:23.251112 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:23.307731 master-0 kubenswrapper[3976]: I0320 08:34:23.307656 3976 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 20 08:34:23.410122 master-0 kubenswrapper[3976]: I0320 08:34:23.409936 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 08:34:23.411952 master-0 kubenswrapper[3976]: I0320 08:34:23.411884 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 20 08:34:23.411952 master-0 kubenswrapper[3976]: I0320 08:34:23.411957 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 20 08:34:23.412122 master-0 kubenswrapper[3976]: I0320 08:34:23.411973 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 20 08:34:23.412122 master-0 kubenswrapper[3976]: I0320 08:34:23.412072 3976 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 20 08:34:23.424525 master-0 kubenswrapper[3976]: I0320 08:34:23.424465 3976 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 20 08:34:23.424675 master-0 kubenswrapper[3976]: E0320 08:34:23.424539 3976 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found"
Mar 20 08:34:23.436384 master-0 kubenswrapper[3976]: E0320 08:34:23.436334 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:23.497763 master-0 kubenswrapper[3976]: I0320 08:34:23.497697 3976 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 20 08:34:23.520562 master-0 kubenswrapper[3976]: I0320 08:34:23.520501 3976 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 20 08:34:23.536727 master-0 kubenswrapper[3976]: E0320 08:34:23.536647 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:23.637589 master-0 kubenswrapper[3976]: E0320 08:34:23.637527 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:23.738324 master-0 kubenswrapper[3976]: E0320 08:34:23.738172 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:23.839427 master-0 kubenswrapper[3976]: E0320 08:34:23.839349 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:23.940401 master-0 kubenswrapper[3976]: E0320 08:34:23.940315 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:24.041298 master-0 kubenswrapper[3976]: E0320 08:34:24.041143 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:24.142353 master-0 kubenswrapper[3976]: E0320 08:34:24.142278 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:24.243552 master-0 kubenswrapper[3976]: E0320 08:34:24.243457 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:24.343897 master-0 kubenswrapper[3976]: E0320 08:34:24.343821 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:24.445169 master-0 kubenswrapper[3976]: E0320 08:34:24.445041 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:24.545721 master-0 kubenswrapper[3976]: E0320 08:34:24.545626 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:24.645963 master-0 kubenswrapper[3976]: E0320 08:34:24.645790 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:24.746839 master-0 kubenswrapper[3976]: E0320 08:34:24.746735 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:24.847079 master-0 kubenswrapper[3976]: E0320 08:34:24.847000 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:24.947683 master-0 kubenswrapper[3976]: E0320 08:34:24.947521 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:25.048853 master-0 kubenswrapper[3976]: E0320 08:34:25.048744 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:25.149275 master-0 kubenswrapper[3976]: E0320 08:34:25.149144 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:25.249574 master-0 kubenswrapper[3976]: E0320 08:34:25.249390 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:25.350086 master-0 kubenswrapper[3976]: E0320 08:34:25.349999 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:25.451281 master-0 kubenswrapper[3976]: E0320 08:34:25.451172 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:25.551779 master-0 kubenswrapper[3976]: E0320 08:34:25.551696 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:25.652957 master-0 kubenswrapper[3976]: E0320 08:34:25.652869 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:25.754072 master-0 kubenswrapper[3976]: E0320 08:34:25.753950 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:25.854666 master-0 kubenswrapper[3976]: E0320 08:34:25.854498 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:25.955748 master-0 kubenswrapper[3976]: E0320 08:34:25.955661 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:26.055983 master-0 kubenswrapper[3976]: E0320 08:34:26.055898 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:26.156509 master-0 kubenswrapper[3976]: E0320 08:34:26.156270 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:26.257326 master-0 kubenswrapper[3976]: E0320 08:34:26.257230 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:26.358431 master-0 kubenswrapper[3976]: E0320 08:34:26.358337 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:26.459258 master-0 kubenswrapper[3976]: E0320 08:34:26.459044 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:26.559386 master-0 kubenswrapper[3976]: E0320 08:34:26.559300 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:26.659608 master-0 kubenswrapper[3976]: E0320 08:34:26.659493 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:26.760338 master-0 kubenswrapper[3976]: E0320 08:34:26.760051 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:26.861372 master-0 kubenswrapper[3976]: E0320 08:34:26.861277 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:26.961807 master-0 kubenswrapper[3976]: E0320 08:34:26.961705 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:27.062675 master-0 kubenswrapper[3976]: E0320 08:34:27.062596 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:27.163751 master-0 kubenswrapper[3976]: E0320 08:34:27.163657 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:27.264587 master-0 kubenswrapper[3976]: E0320 08:34:27.264519 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:27.365957 master-0 kubenswrapper[3976]: E0320 08:34:27.365723 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:27.466546 master-0 kubenswrapper[3976]: E0320 08:34:27.466441 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:27.477490 master-0 kubenswrapper[3976]: I0320 08:34:27.477417 3976 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 20 08:34:27.566733 master-0 kubenswrapper[3976]: E0320 08:34:27.566601 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:27.667330 master-0 kubenswrapper[3976]: E0320 08:34:27.667101 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:27.767419 master-0 kubenswrapper[3976]: E0320 08:34:27.767301 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:27.867602 master-0 kubenswrapper[3976]: E0320 08:34:27.867529 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:27.967868 master-0 kubenswrapper[3976]: E0320 08:34:27.967701 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:28.068899 master-0 kubenswrapper[3976]: E0320 08:34:28.068819 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:28.169954 master-0 kubenswrapper[3976]: E0320 08:34:28.169844 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:28.270547 master-0 kubenswrapper[3976]: E0320 08:34:28.270337 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:28.371283 master-0 kubenswrapper[3976]: E0320 08:34:28.371167 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:28.472169 master-0 kubenswrapper[3976]: E0320 08:34:28.472053 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:28.572497 master-0 kubenswrapper[3976]: E0320 08:34:28.572409 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:28.672781 master-0 kubenswrapper[3976]: E0320 08:34:28.672654 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:28.773415 master-0 kubenswrapper[3976]: E0320 08:34:28.773294 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:28.874226 master-0 kubenswrapper[3976]: E0320 08:34:28.873967 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:28.974375 master-0 kubenswrapper[3976]: E0320 08:34:28.974285 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:29.075388 master-0 kubenswrapper[3976]: E0320 08:34:29.075270 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:29.176703 master-0 kubenswrapper[3976]: E0320 08:34:29.176489 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:29.277331 master-0 kubenswrapper[3976]: E0320 08:34:29.277271 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:29.378527 master-0 kubenswrapper[3976]: E0320 08:34:29.378440 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:29.478866 master-0 kubenswrapper[3976]: E0320 08:34:29.478707 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:29.579606 master-0 kubenswrapper[3976]: E0320 08:34:29.579530 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:29.615261 master-0 kubenswrapper[3976]: I0320 08:34:29.615140 3976 csr.go:261] certificate signing request csr-z7jb9 is approved, waiting to be issued
Mar 20 08:34:29.625871 master-0 kubenswrapper[3976]: I0320 08:34:29.625780 3976 csr.go:257] certificate signing request csr-z7jb9 is issued
Mar 20 08:34:29.668616 master-0 kubenswrapper[3976]: E0320 08:34:29.668562 3976 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 20 08:34:29.680172 master-0 kubenswrapper[3976]: E0320 08:34:29.680109 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:29.781437 master-0 kubenswrapper[3976]: E0320 08:34:29.781238 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:29.881534 master-0 kubenswrapper[3976]: E0320 08:34:29.881443 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:29.982127 master-0 kubenswrapper[3976]: E0320 08:34:29.981994 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:30.082884 master-0 kubenswrapper[3976]: E0320 08:34:30.082721 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:30.183385 master-0 kubenswrapper[3976]: E0320 08:34:30.183279 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:30.283589 master-0 kubenswrapper[3976]: E0320 08:34:30.283429 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:30.383802 master-0 kubenswrapper[3976]: E0320 08:34:30.383598 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:30.484892 master-0 kubenswrapper[3976]: E0320 08:34:30.484769 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:30.585752 master-0 kubenswrapper[3976]: E0320 08:34:30.585645 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not
found" Mar 20 08:34:30.627386 master-0 kubenswrapper[3976]: I0320 08:34:30.627282 3976 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-21 08:25:35 +0000 UTC, rotation deadline is 2026-03-21 03:30:05.303105121 +0000 UTC Mar 20 08:34:30.627386 master-0 kubenswrapper[3976]: I0320 08:34:30.627349 3976 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h55m34.675761478s for next certificate rotation Mar 20 08:34:30.686522 master-0 kubenswrapper[3976]: E0320 08:34:30.686313 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:30.786840 master-0 kubenswrapper[3976]: E0320 08:34:30.786733 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:30.887577 master-0 kubenswrapper[3976]: E0320 08:34:30.887471 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:30.988840 master-0 kubenswrapper[3976]: E0320 08:34:30.988640 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:31.089562 master-0 kubenswrapper[3976]: E0320 08:34:31.089495 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:31.190688 master-0 kubenswrapper[3976]: E0320 08:34:31.190553 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:31.291516 master-0 kubenswrapper[3976]: E0320 08:34:31.291258 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:31.391672 master-0 kubenswrapper[3976]: E0320 08:34:31.391541 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:31.492353 master-0 
kubenswrapper[3976]: E0320 08:34:31.492232 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:31.593427 master-0 kubenswrapper[3976]: E0320 08:34:31.593308 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:31.628063 master-0 kubenswrapper[3976]: I0320 08:34:31.627947 3976 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-21 08:25:35 +0000 UTC, rotation deadline is 2026-03-21 02:04:11.198263939 +0000 UTC Mar 20 08:34:31.628063 master-0 kubenswrapper[3976]: I0320 08:34:31.628017 3976 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h29m39.570254345s for next certificate rotation Mar 20 08:34:31.694233 master-0 kubenswrapper[3976]: E0320 08:34:31.694132 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:31.795290 master-0 kubenswrapper[3976]: E0320 08:34:31.795149 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:31.896151 master-0 kubenswrapper[3976]: E0320 08:34:31.895933 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:31.996492 master-0 kubenswrapper[3976]: E0320 08:34:31.996363 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:32.096704 master-0 kubenswrapper[3976]: E0320 08:34:32.096587 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:32.197572 master-0 kubenswrapper[3976]: E0320 08:34:32.197375 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:32.298668 master-0 kubenswrapper[3976]: E0320 08:34:32.298512 3976 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:32.399372 master-0 kubenswrapper[3976]: E0320 08:34:32.399264 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:32.500072 master-0 kubenswrapper[3976]: E0320 08:34:32.499823 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:32.600810 master-0 kubenswrapper[3976]: E0320 08:34:32.600710 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:32.701919 master-0 kubenswrapper[3976]: E0320 08:34:32.701842 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:32.803039 master-0 kubenswrapper[3976]: E0320 08:34:32.802907 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:32.904086 master-0 kubenswrapper[3976]: E0320 08:34:32.903976 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:33.004774 master-0 kubenswrapper[3976]: E0320 08:34:33.004624 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:33.105512 master-0 kubenswrapper[3976]: E0320 08:34:33.105339 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:33.205579 master-0 kubenswrapper[3976]: E0320 08:34:33.205511 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:33.306774 master-0 kubenswrapper[3976]: E0320 08:34:33.306508 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:33.407554 
master-0 kubenswrapper[3976]: E0320 08:34:33.407301 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:33.507638 master-0 kubenswrapper[3976]: E0320 08:34:33.507480 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:33.567375 master-0 kubenswrapper[3976]: E0320 08:34:33.567249 3976 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Mar 20 08:34:33.608645 master-0 kubenswrapper[3976]: E0320 08:34:33.608524 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:33.709933 master-0 kubenswrapper[3976]: E0320 08:34:33.709714 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:33.810617 master-0 kubenswrapper[3976]: E0320 08:34:33.810495 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:33.911836 master-0 kubenswrapper[3976]: E0320 08:34:33.911726 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:34.012919 master-0 kubenswrapper[3976]: E0320 08:34:34.012659 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:34.113944 master-0 kubenswrapper[3976]: E0320 08:34:34.113829 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:34.214623 master-0 kubenswrapper[3976]: E0320 08:34:34.214512 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:34.315642 master-0 kubenswrapper[3976]: E0320 08:34:34.315535 3976 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:34.416383 master-0 kubenswrapper[3976]: E0320 08:34:34.416260 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:34.517110 master-0 kubenswrapper[3976]: E0320 08:34:34.516898 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:34.617960 master-0 kubenswrapper[3976]: E0320 08:34:34.617749 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:34.718464 master-0 kubenswrapper[3976]: E0320 08:34:34.718338 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:34.818635 master-0 kubenswrapper[3976]: E0320 08:34:34.818505 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:34.919112 master-0 kubenswrapper[3976]: E0320 08:34:34.918939 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:35.019543 master-0 kubenswrapper[3976]: E0320 08:34:35.019454 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:35.119789 master-0 kubenswrapper[3976]: E0320 08:34:35.119690 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:35.220833 master-0 kubenswrapper[3976]: E0320 08:34:35.220589 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:35.321737 master-0 kubenswrapper[3976]: E0320 08:34:35.321632 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:35.421984 master-0 kubenswrapper[3976]: E0320 
08:34:35.421893 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:35.522883 master-0 kubenswrapper[3976]: E0320 08:34:35.522630 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:35.623884 master-0 kubenswrapper[3976]: E0320 08:34:35.623741 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:35.724389 master-0 kubenswrapper[3976]: E0320 08:34:35.724292 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:35.824718 master-0 kubenswrapper[3976]: E0320 08:34:35.824657 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:35.926542 master-0 kubenswrapper[3976]: E0320 08:34:35.926438 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:36.027632 master-0 kubenswrapper[3976]: E0320 08:34:36.027514 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:36.128854 master-0 kubenswrapper[3976]: E0320 08:34:36.128637 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:36.229819 master-0 kubenswrapper[3976]: E0320 08:34:36.229707 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:36.330802 master-0 kubenswrapper[3976]: E0320 08:34:36.330700 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:36.431935 master-0 kubenswrapper[3976]: E0320 08:34:36.431720 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" 
Mar 20 08:34:36.532513 master-0 kubenswrapper[3976]: E0320 08:34:36.532442 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:36.632765 master-0 kubenswrapper[3976]: E0320 08:34:36.632680 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:36.733490 master-0 kubenswrapper[3976]: E0320 08:34:36.733252 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:36.834388 master-0 kubenswrapper[3976]: E0320 08:34:36.834314 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:36.935361 master-0 kubenswrapper[3976]: E0320 08:34:36.935258 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:37.035808 master-0 kubenswrapper[3976]: E0320 08:34:37.035625 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:37.136529 master-0 kubenswrapper[3976]: E0320 08:34:37.136450 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:37.237464 master-0 kubenswrapper[3976]: E0320 08:34:37.237381 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:37.338456 master-0 kubenswrapper[3976]: E0320 08:34:37.338366 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:37.439589 master-0 kubenswrapper[3976]: E0320 08:34:37.439507 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:37.539740 master-0 kubenswrapper[3976]: E0320 08:34:37.539649 3976 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"master-0\" not found" Mar 20 08:34:37.593566 master-0 kubenswrapper[3976]: I0320 08:34:37.593351 3976 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 08:34:37.640351 master-0 kubenswrapper[3976]: E0320 08:34:37.640242 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:37.642783 master-0 kubenswrapper[3976]: I0320 08:34:37.642738 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:37.644614 master-0 kubenswrapper[3976]: I0320 08:34:37.644562 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:37.644697 master-0 kubenswrapper[3976]: I0320 08:34:37.644625 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:37.644697 master-0 kubenswrapper[3976]: I0320 08:34:37.644645 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:37.645313 master-0 kubenswrapper[3976]: I0320 08:34:37.645261 3976 scope.go:117] "RemoveContainer" containerID="96ff83659a940b2334d29c8de766dce2f56c0eddae2926cc1d3dd2e347430a94" Mar 20 08:34:37.741600 master-0 kubenswrapper[3976]: E0320 08:34:37.741258 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:37.842467 master-0 kubenswrapper[3976]: E0320 08:34:37.842404 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:37.942932 master-0 kubenswrapper[3976]: E0320 08:34:37.942838 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:38.043948 master-0 kubenswrapper[3976]: E0320 08:34:38.043834 
3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:38.144638 master-0 kubenswrapper[3976]: E0320 08:34:38.144426 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:38.245148 master-0 kubenswrapper[3976]: E0320 08:34:38.245029 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:38.346236 master-0 kubenswrapper[3976]: E0320 08:34:38.346105 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:38.447639 master-0 kubenswrapper[3976]: E0320 08:34:38.447413 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:38.548363 master-0 kubenswrapper[3976]: E0320 08:34:38.548152 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:38.648953 master-0 kubenswrapper[3976]: E0320 08:34:38.648846 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:38.750038 master-0 kubenswrapper[3976]: E0320 08:34:38.749781 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:38.837976 master-0 kubenswrapper[3976]: I0320 08:34:38.837877 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 20 08:34:38.838818 master-0 kubenswrapper[3976]: I0320 08:34:38.838755 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"879adddb20c3ea4126b46482343a718dc4153b404d31f5e2d5d624d657e93169"} Mar 20 08:34:38.838970 master-0 kubenswrapper[3976]: I0320 08:34:38.838923 3976 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:34:38.840160 master-0 kubenswrapper[3976]: I0320 08:34:38.840104 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:34:38.840160 master-0 kubenswrapper[3976]: I0320 08:34:38.840156 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:34:38.840361 master-0 kubenswrapper[3976]: I0320 08:34:38.840174 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:34:38.850935 master-0 kubenswrapper[3976]: E0320 08:34:38.850893 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:38.951438 master-0 kubenswrapper[3976]: E0320 08:34:38.951381 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:39.052457 master-0 kubenswrapper[3976]: E0320 08:34:39.052394 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:39.153519 master-0 kubenswrapper[3976]: E0320 08:34:39.153462 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:39.254430 master-0 kubenswrapper[3976]: E0320 08:34:39.254346 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:39.355609 master-0 kubenswrapper[3976]: E0320 08:34:39.355461 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"master-0\" not found" Mar 20 08:34:39.455939 master-0 kubenswrapper[3976]: E0320 08:34:39.455841 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:39.556807 master-0 kubenswrapper[3976]: E0320 08:34:39.556763 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:39.657840 master-0 kubenswrapper[3976]: E0320 08:34:39.657617 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:39.669203 master-0 kubenswrapper[3976]: E0320 08:34:39.669107 3976 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 20 08:34:39.758713 master-0 kubenswrapper[3976]: E0320 08:34:39.758559 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:39.859274 master-0 kubenswrapper[3976]: E0320 08:34:39.859193 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:39.959607 master-0 kubenswrapper[3976]: E0320 08:34:39.959431 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:40.060222 master-0 kubenswrapper[3976]: E0320 08:34:40.060102 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:40.161375 master-0 kubenswrapper[3976]: E0320 08:34:40.161281 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:40.262479 master-0 kubenswrapper[3976]: E0320 08:34:40.262205 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:40.363250 master-0 kubenswrapper[3976]: E0320 08:34:40.363147 3976 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:40.463941 master-0 kubenswrapper[3976]: E0320 08:34:40.463849 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:40.564513 master-0 kubenswrapper[3976]: E0320 08:34:40.564402 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:40.665067 master-0 kubenswrapper[3976]: E0320 08:34:40.664951 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:40.766236 master-0 kubenswrapper[3976]: E0320 08:34:40.766121 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:40.867694 master-0 kubenswrapper[3976]: E0320 08:34:40.867502 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:40.968162 master-0 kubenswrapper[3976]: E0320 08:34:40.968046 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:41.069075 master-0 kubenswrapper[3976]: E0320 08:34:41.068926 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:41.169926 master-0 kubenswrapper[3976]: E0320 08:34:41.169683 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:41.269940 master-0 kubenswrapper[3976]: E0320 08:34:41.269827 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:41.370870 master-0 kubenswrapper[3976]: E0320 08:34:41.370738 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:41.471591 
master-0 kubenswrapper[3976]: E0320 08:34:41.471415 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:41.572738 master-0 kubenswrapper[3976]: E0320 08:34:41.572625 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:41.673084 master-0 kubenswrapper[3976]: E0320 08:34:41.672973 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:41.774239 master-0 kubenswrapper[3976]: E0320 08:34:41.773985 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:41.874748 master-0 kubenswrapper[3976]: E0320 08:34:41.874642 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:41.975412 master-0 kubenswrapper[3976]: E0320 08:34:41.975303 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:42.076223 master-0 kubenswrapper[3976]: E0320 08:34:42.076119 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:42.177311 master-0 kubenswrapper[3976]: E0320 08:34:42.177162 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:42.278410 master-0 kubenswrapper[3976]: E0320 08:34:42.278300 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:42.379498 master-0 kubenswrapper[3976]: E0320 08:34:42.379295 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:42.480503 master-0 kubenswrapper[3976]: E0320 08:34:42.480375 3976 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"master-0\" not found" Mar 20 08:34:42.581478 master-0 kubenswrapper[3976]: E0320 08:34:42.581372 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:42.682265 master-0 kubenswrapper[3976]: E0320 08:34:42.682039 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:42.783241 master-0 kubenswrapper[3976]: E0320 08:34:42.783146 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:42.883798 master-0 kubenswrapper[3976]: E0320 08:34:42.883678 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:42.983983 master-0 kubenswrapper[3976]: E0320 08:34:42.983784 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:43.084035 master-0 kubenswrapper[3976]: E0320 08:34:43.083912 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:43.184231 master-0 kubenswrapper[3976]: E0320 08:34:43.184103 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:43.285301 master-0 kubenswrapper[3976]: E0320 08:34:43.284981 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:43.385733 master-0 kubenswrapper[3976]: E0320 08:34:43.385621 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:43.486087 master-0 kubenswrapper[3976]: E0320 08:34:43.485980 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 20 08:34:43.587098 master-0 kubenswrapper[3976]: E0320 08:34:43.586993 3976 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:43.664178 master-0 kubenswrapper[3976]: E0320 08:34:43.664087 3976 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found"
Mar 20 08:34:43.688343 master-0 kubenswrapper[3976]: E0320 08:34:43.688277 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:43.789559 master-0 kubenswrapper[3976]: E0320 08:34:43.789414 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:43.890145 master-0 kubenswrapper[3976]: E0320 08:34:43.889952 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:43.990386 master-0 kubenswrapper[3976]: E0320 08:34:43.990255 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:44.091167 master-0 kubenswrapper[3976]: E0320 08:34:44.091013 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:44.192224 master-0 kubenswrapper[3976]: E0320 08:34:44.191966 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:44.292796 master-0 kubenswrapper[3976]: E0320 08:34:44.292650 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:44.393390 master-0 kubenswrapper[3976]: E0320 08:34:44.393226 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:44.494624 master-0 kubenswrapper[3976]: E0320 08:34:44.494365 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:44.594706 master-0 kubenswrapper[3976]: E0320 08:34:44.594558 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:44.695781 master-0 kubenswrapper[3976]: E0320 08:34:44.695677 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:44.796220 master-0 kubenswrapper[3976]: E0320 08:34:44.796111 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:44.897116 master-0 kubenswrapper[3976]: E0320 08:34:44.896978 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:44.997984 master-0 kubenswrapper[3976]: E0320 08:34:44.997880 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:45.098932 master-0 kubenswrapper[3976]: E0320 08:34:45.098679 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:45.199401 master-0 kubenswrapper[3976]: E0320 08:34:45.199297 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:45.300255 master-0 kubenswrapper[3976]: E0320 08:34:45.300105 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:45.401386 master-0 kubenswrapper[3976]: E0320 08:34:45.401019 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:45.501648 master-0 kubenswrapper[3976]: E0320 08:34:45.501521 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:45.958997 master-0 kubenswrapper[3976]: E0320 08:34:45.602281 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:45.958997 master-0 kubenswrapper[3976]: E0320 08:34:45.703271 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:45.958997 master-0 kubenswrapper[3976]: E0320 08:34:45.804177 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:45.958997 master-0 kubenswrapper[3976]: E0320 08:34:45.904799 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:46.006100 master-0 kubenswrapper[3976]: E0320 08:34:46.005996 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:46.106866 master-0 kubenswrapper[3976]: E0320 08:34:46.106736 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:46.207348 master-0 kubenswrapper[3976]: E0320 08:34:46.207217 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:46.308249 master-0 kubenswrapper[3976]: E0320 08:34:46.308132 3976 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:34:46.403304 master-0 kubenswrapper[3976]: I0320 08:34:46.403172 3976 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 20 08:34:46.486017 master-0 kubenswrapper[3976]: I0320 08:34:46.485894 3976 apiserver.go:52] "Watching apiserver"
Mar 20 08:34:46.491846 master-0 kubenswrapper[3976]: I0320 08:34:46.491723 3976 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 20 08:34:46.492155 master-0 kubenswrapper[3976]: I0320 08:34:46.492076 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-w2zwp","openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg","openshift-network-operator/network-operator-7bd846bfc4-mt454"]
Mar 20 08:34:46.492716 master-0 kubenswrapper[3976]: I0320 08:34:46.492650 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.492716 master-0 kubenswrapper[3976]: I0320 08:34:46.492694 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.492889 master-0 kubenswrapper[3976]: I0320 08:34:46.492750 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:34:46.495795 master-0 kubenswrapper[3976]: I0320 08:34:46.495699 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt"
Mar 20 08:34:46.496603 master-0 kubenswrapper[3976]: I0320 08:34:46.496534 3976 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret"
Mar 20 08:34:46.497068 master-0 kubenswrapper[3976]: I0320 08:34:46.496760 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 20 08:34:46.497068 master-0 kubenswrapper[3976]: I0320 08:34:46.496960 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config"
Mar 20 08:34:46.497216 master-0 kubenswrapper[3976]: I0320 08:34:46.497092 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 20 08:34:46.497471 master-0 kubenswrapper[3976]: I0320 08:34:46.497424 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 20 08:34:46.497535 master-0 kubenswrapper[3976]: I0320 08:34:46.497502 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt"
Mar 20 08:34:46.497576 master-0 kubenswrapper[3976]: I0320 08:34:46.497531 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 20 08:34:46.499513 master-0 kubenswrapper[3976]: I0320 08:34:46.497925 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 20 08:34:46.499934 master-0 kubenswrapper[3976]: I0320 08:34:46.499904 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 20 08:34:46.577481 master-0 kubenswrapper[3976]: I0320 08:34:46.577272 3976 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 20 08:34:46.614428 master-0 kubenswrapper[3976]: I0320 08:34:46.614292 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-resolv-conf\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.614428 master-0 kubenswrapper[3976]: I0320 08:34:46.614374 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab97c35-fe60-4134-8715-a7c6dd085fb3-service-ca\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.614428 master-0 kubenswrapper[3976]: I0320 08:34:46.614420 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-var-run-resolv-conf\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.614920 master-0 kubenswrapper[3976]: I0320 08:34:46.614467 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6mb5\" (UniqueName: \"kubernetes.io/projected/ad692349-5089-4afc-85b2-9b6e7997567c-kube-api-access-h6mb5\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:34:46.614920 master-0 kubenswrapper[3976]: I0320 08:34:46.614500 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.614920 master-0 kubenswrapper[3976]: I0320 08:34:46.614537 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.614920 master-0 kubenswrapper[3976]: I0320 08:34:46.614575 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ad692349-5089-4afc-85b2-9b6e7997567c-metrics-tls\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:34:46.614920 master-0 kubenswrapper[3976]: I0320 08:34:46.614616 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r25hc\" (UniqueName: \"kubernetes.io/projected/fdfdabb8-83d6-4b38-a709-9e354062ba1a-kube-api-access-r25hc\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.614920 master-0 kubenswrapper[3976]: I0320 08:34:46.614722 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.614920 master-0 kubenswrapper[3976]: I0320 08:34:46.614795 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dab97c35-fe60-4134-8715-a7c6dd085fb3-kube-api-access\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.614920 master-0 kubenswrapper[3976]: I0320 08:34:46.614834 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-sno-bootstrap-files\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.614920 master-0 kubenswrapper[3976]: I0320 08:34:46.614868 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ad692349-5089-4afc-85b2-9b6e7997567c-host-etc-kube\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:34:46.614920 master-0 kubenswrapper[3976]: I0320 08:34:46.614932 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-ca-bundle\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.715788 master-0 kubenswrapper[3976]: I0320 08:34:46.715660 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r25hc\" (UniqueName: \"kubernetes.io/projected/fdfdabb8-83d6-4b38-a709-9e354062ba1a-kube-api-access-r25hc\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.716136 master-0 kubenswrapper[3976]: I0320 08:34:46.715966 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ad692349-5089-4afc-85b2-9b6e7997567c-metrics-tls\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:34:46.716136 master-0 kubenswrapper[3976]: I0320 08:34:46.716030 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.716136 master-0 kubenswrapper[3976]: I0320 08:34:46.716070 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dab97c35-fe60-4134-8715-a7c6dd085fb3-kube-api-access\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.716136 master-0 kubenswrapper[3976]: I0320 08:34:46.716111 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-ca-bundle\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.716503 master-0 kubenswrapper[3976]: I0320 08:34:46.716145 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-sno-bootstrap-files\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.716503 master-0 kubenswrapper[3976]: I0320 08:34:46.716176 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ad692349-5089-4afc-85b2-9b6e7997567c-host-etc-kube\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:34:46.716503 master-0 kubenswrapper[3976]: I0320 08:34:46.716243 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-resolv-conf\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.716503 master-0 kubenswrapper[3976]: I0320 08:34:46.716277 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab97c35-fe60-4134-8715-a7c6dd085fb3-service-ca\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.716503 master-0 kubenswrapper[3976]: I0320 08:34:46.716315 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-var-run-resolv-conf\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.716503 master-0 kubenswrapper[3976]: I0320 08:34:46.716337 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-ca-bundle\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.716503 master-0 kubenswrapper[3976]: I0320 08:34:46.716346 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6mb5\" (UniqueName: \"kubernetes.io/projected/ad692349-5089-4afc-85b2-9b6e7997567c-kube-api-access-h6mb5\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:34:46.716503 master-0 kubenswrapper[3976]: I0320 08:34:46.716439 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.716503 master-0 kubenswrapper[3976]: I0320 08:34:46.716481 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.717297 master-0 kubenswrapper[3976]: I0320 08:34:46.716582 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.717297 master-0 kubenswrapper[3976]: I0320 08:34:46.716800 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-var-run-resolv-conf\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.717297 master-0 kubenswrapper[3976]: E0320 08:34:46.716887 3976 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 20 08:34:46.717297 master-0 kubenswrapper[3976]: E0320 08:34:46.717017 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert podName:dab97c35-fe60-4134-8715-a7c6dd085fb3 nodeName:}" failed. No retries permitted until 2026-03-20 08:34:47.216983634 +0000 UTC m=+58.485806931 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert") pod "cluster-version-operator-56d8475767-9gfkg" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3") : secret "cluster-version-operator-serving-cert" not found
Mar 20 08:34:46.717297 master-0 kubenswrapper[3976]: I0320 08:34:46.717290 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-sno-bootstrap-files\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.717736 master-0 kubenswrapper[3976]: I0320 08:34:46.717365 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ad692349-5089-4afc-85b2-9b6e7997567c-host-etc-kube\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:34:46.717736 master-0 kubenswrapper[3976]: I0320 08:34:46.717403 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-resolv-conf\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.717736 master-0 kubenswrapper[3976]: I0320 08:34:46.717479 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.718286 master-0 kubenswrapper[3976]: I0320 08:34:46.718225 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab97c35-fe60-4134-8715-a7c6dd085fb3-service-ca\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.718435 master-0 kubenswrapper[3976]: I0320 08:34:46.718328 3976 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 20 08:34:46.728725 master-0 kubenswrapper[3976]: I0320 08:34:46.728526 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ad692349-5089-4afc-85b2-9b6e7997567c-metrics-tls\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:34:46.750615 master-0 kubenswrapper[3976]: I0320 08:34:46.750509 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6mb5\" (UniqueName: \"kubernetes.io/projected/ad692349-5089-4afc-85b2-9b6e7997567c-kube-api-access-h6mb5\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:34:46.752018 master-0 kubenswrapper[3976]: I0320 08:34:46.751926 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r25hc\" (UniqueName: \"kubernetes.io/projected/fdfdabb8-83d6-4b38-a709-9e354062ba1a-kube-api-access-r25hc\") pod \"assisted-installer-controller-w2zwp\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") " pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.752365 master-0 kubenswrapper[3976]: I0320 08:34:46.752307 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dab97c35-fe60-4134-8715-a7c6dd085fb3-kube-api-access\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:46.839257 master-0 kubenswrapper[3976]: I0320 08:34:46.838941 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:46.853151 master-0 kubenswrapper[3976]: I0320 08:34:46.853101 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:34:46.856816 master-0 kubenswrapper[3976]: W0320 08:34:46.856737 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdfdabb8_83d6_4b38_a709_9e354062ba1a.slice/crio-1ffb3901c3fd86821b7075a50ea55db7d24b5dcb1f13fa6241a8b0a2754ace0c WatchSource:0}: Error finding container 1ffb3901c3fd86821b7075a50ea55db7d24b5dcb1f13fa6241a8b0a2754ace0c: Status 404 returned error can't find the container with id 1ffb3901c3fd86821b7075a50ea55db7d24b5dcb1f13fa6241a8b0a2754ace0c
Mar 20 08:34:46.864022 master-0 kubenswrapper[3976]: I0320 08:34:46.863950 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-w2zwp" event={"ID":"fdfdabb8-83d6-4b38-a709-9e354062ba1a","Type":"ContainerStarted","Data":"1ffb3901c3fd86821b7075a50ea55db7d24b5dcb1f13fa6241a8b0a2754ace0c"}
Mar 20 08:34:46.868614 master-0 kubenswrapper[3976]: W0320 08:34:46.868513 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad692349_5089_4afc_85b2_9b6e7997567c.slice/crio-972f78b16f9b47d7c2f794578bdc02049da4958a2a754da51e80ca31806a5573 WatchSource:0}: Error finding container 972f78b16f9b47d7c2f794578bdc02049da4958a2a754da51e80ca31806a5573: Status 404 returned error can't find the container with id 972f78b16f9b47d7c2f794578bdc02049da4958a2a754da51e80ca31806a5573
Mar 20 08:34:47.222022 master-0 kubenswrapper[3976]: I0320 08:34:47.221807 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:47.222338 master-0 kubenswrapper[3976]: E0320 08:34:47.222026 3976 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 20 08:34:47.222338 master-0 kubenswrapper[3976]: E0320 08:34:47.222125 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert podName:dab97c35-fe60-4134-8715-a7c6dd085fb3 nodeName:}" failed. No retries permitted until 2026-03-20 08:34:48.222096519 +0000 UTC m=+59.490919836 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert") pod "cluster-version-operator-56d8475767-9gfkg" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3") : secret "cluster-version-operator-serving-cert" not found
Mar 20 08:34:47.868172 master-0 kubenswrapper[3976]: I0320 08:34:47.868106 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" event={"ID":"ad692349-5089-4afc-85b2-9b6e7997567c","Type":"ContainerStarted","Data":"972f78b16f9b47d7c2f794578bdc02049da4958a2a754da51e80ca31806a5573"}
Mar 20 08:34:48.230289 master-0 kubenswrapper[3976]: I0320 08:34:48.229644 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:48.230289 master-0 kubenswrapper[3976]: E0320 08:34:48.229902 3976 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 20 08:34:48.230289 master-0 kubenswrapper[3976]: E0320 08:34:48.230061 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert podName:dab97c35-fe60-4134-8715-a7c6dd085fb3 nodeName:}" failed. No retries permitted until 2026-03-20 08:34:50.230020389 +0000 UTC m=+61.498843696 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert") pod "cluster-version-operator-56d8475767-9gfkg" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3") : secret "cluster-version-operator-serving-cert" not found
Mar 20 08:34:50.251162 master-0 kubenswrapper[3976]: I0320 08:34:50.251077 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:50.251778 master-0 kubenswrapper[3976]: E0320 08:34:50.251356 3976 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 20 08:34:50.251778 master-0 kubenswrapper[3976]: E0320 08:34:50.251505 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert podName:dab97c35-fe60-4134-8715-a7c6dd085fb3 nodeName:}" failed. No retries permitted until 2026-03-20 08:34:54.251470227 +0000 UTC m=+65.520293544 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert") pod "cluster-version-operator-56d8475767-9gfkg" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3") : secret "cluster-version-operator-serving-cert" not found
Mar 20 08:34:53.487331 master-0 kubenswrapper[3976]: I0320 08:34:53.486886 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config"
Mar 20 08:34:53.498097 master-0 kubenswrapper[3976]: I0320 08:34:53.498055 3976 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret"
Mar 20 08:34:54.065546 master-0 kubenswrapper[3976]: I0320 08:34:54.065480 3976 generic.go:334] "Generic (PLEG): container finished" podID="fdfdabb8-83d6-4b38-a709-9e354062ba1a" containerID="ef7d3c19081b3942ae839231125bb3d9ed41e1148d63c694dd308a85f91f661c" exitCode=0
Mar 20 08:34:54.065788 master-0 kubenswrapper[3976]: I0320 08:34:54.065607 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-w2zwp" event={"ID":"fdfdabb8-83d6-4b38-a709-9e354062ba1a","Type":"ContainerDied","Data":"ef7d3c19081b3942ae839231125bb3d9ed41e1148d63c694dd308a85f91f661c"}
Mar 20 08:34:54.067769 master-0 kubenswrapper[3976]: I0320 08:34:54.067721 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" event={"ID":"ad692349-5089-4afc-85b2-9b6e7997567c","Type":"ContainerStarted","Data":"871b2216305e9883e9345351545dfa140768d0b1a9975612361ac0d841fda34c"}
Mar 20 08:34:54.119219 master-0 kubenswrapper[3976]: I0320 08:34:54.117047 3976 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" podStartSLOduration=24.507532146 podStartE2EDuration="31.117009735s" podCreationTimestamp="2026-03-20 08:34:23 +0000 UTC" firstStartedPulling="2026-03-20 08:34:46.871168858 +0000 UTC m=+58.139992175" lastFinishedPulling="2026-03-20 08:34:53.480646467 +0000 UTC m=+64.749469764" observedRunningTime="2026-03-20 08:34:54.116495889 +0000 UTC m=+65.385319217" watchObservedRunningTime="2026-03-20 08:34:54.117009735 +0000 UTC m=+65.385833062"
Mar 20 08:34:54.281357 master-0 kubenswrapper[3976]: I0320 08:34:54.281313 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:34:54.281669 master-0 kubenswrapper[3976]: E0320 08:34:54.281627 3976 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 20 08:34:54.281806 master-0 kubenswrapper[3976]: E0320 08:34:54.281795 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert podName:dab97c35-fe60-4134-8715-a7c6dd085fb3 nodeName:}" failed. No retries permitted until 2026-03-20 08:35:02.281757028 +0000 UTC m=+73.550580305 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert") pod "cluster-version-operator-56d8475767-9gfkg" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3") : secret "cluster-version-operator-serving-cert" not found
Mar 20 08:34:55.085775 master-0 kubenswrapper[3976]: I0320 08:34:55.085748 3976 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-w2zwp"
Mar 20 08:34:55.187471 master-0 kubenswrapper[3976]: I0320 08:34:55.187327 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r25hc\" (UniqueName: \"kubernetes.io/projected/fdfdabb8-83d6-4b38-a709-9e354062ba1a-kube-api-access-r25hc\") pod \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") "
Mar 20 08:34:55.187768 master-0 kubenswrapper[3976]: I0320 08:34:55.187490 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-ca-bundle\") pod \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") "
Mar 20 08:34:55.187768 master-0 kubenswrapper[3976]: I0320 08:34:55.187679 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-sno-bootstrap-files\") pod \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") "
Mar 20 08:34:55.187768 master-0 kubenswrapper[3976]: I0320 08:34:55.187751 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-resolv-conf\") pod \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") "
Mar 20 08:34:55.187891 master-0 kubenswrapper[3976]: I0320 08:34:55.187776 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-var-run-resolv-conf\") pod \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\" (UID: \"fdfdabb8-83d6-4b38-a709-9e354062ba1a\") "
Mar 20 08:34:55.187891 master-0
kubenswrapper[3976]: I0320 08:34:55.187679 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "fdfdabb8-83d6-4b38-a709-9e354062ba1a" (UID: "fdfdabb8-83d6-4b38-a709-9e354062ba1a"). InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:34:55.187891 master-0 kubenswrapper[3976]: I0320 08:34:55.187720 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "fdfdabb8-83d6-4b38-a709-9e354062ba1a" (UID: "fdfdabb8-83d6-4b38-a709-9e354062ba1a"). InnerVolumeSpecName "sno-bootstrap-files". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:34:55.187891 master-0 kubenswrapper[3976]: I0320 08:34:55.187882 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "fdfdabb8-83d6-4b38-a709-9e354062ba1a" (UID: "fdfdabb8-83d6-4b38-a709-9e354062ba1a"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:34:55.188047 master-0 kubenswrapper[3976]: I0320 08:34:55.187914 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "fdfdabb8-83d6-4b38-a709-9e354062ba1a" (UID: "fdfdabb8-83d6-4b38-a709-9e354062ba1a"). InnerVolumeSpecName "host-resolv-conf". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:34:55.194595 master-0 kubenswrapper[3976]: I0320 08:34:55.194498 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdfdabb8-83d6-4b38-a709-9e354062ba1a-kube-api-access-r25hc" (OuterVolumeSpecName: "kube-api-access-r25hc") pod "fdfdabb8-83d6-4b38-a709-9e354062ba1a" (UID: "fdfdabb8-83d6-4b38-a709-9e354062ba1a"). InnerVolumeSpecName "kube-api-access-r25hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:34:55.288960 master-0 kubenswrapper[3976]: I0320 08:34:55.288878 3976 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 20 08:34:55.289224 master-0 kubenswrapper[3976]: I0320 08:34:55.288979 3976 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 08:34:55.289224 master-0 kubenswrapper[3976]: I0320 08:34:55.289005 3976 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r25hc\" (UniqueName: \"kubernetes.io/projected/fdfdabb8-83d6-4b38-a709-9e354062ba1a-kube-api-access-r25hc\") on node \"master-0\" DevicePath \"\"" Mar 20 08:34:55.289224 master-0 kubenswrapper[3976]: I0320 08:34:55.289072 3976 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\"" Mar 20 08:34:55.289224 master-0 kubenswrapper[3976]: I0320 08:34:55.289101 3976 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fdfdabb8-83d6-4b38-a709-9e354062ba1a-host-resolv-conf\") on node \"master-0\" DevicePath 
\"\"" Mar 20 08:34:56.077159 master-0 kubenswrapper[3976]: I0320 08:34:56.077044 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-w2zwp" event={"ID":"fdfdabb8-83d6-4b38-a709-9e354062ba1a","Type":"ContainerDied","Data":"1ffb3901c3fd86821b7075a50ea55db7d24b5dcb1f13fa6241a8b0a2754ace0c"} Mar 20 08:34:56.077159 master-0 kubenswrapper[3976]: I0320 08:34:56.077127 3976 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ffb3901c3fd86821b7075a50ea55db7d24b5dcb1f13fa6241a8b0a2754ace0c" Mar 20 08:34:56.077159 master-0 kubenswrapper[3976]: I0320 08:34:56.077147 3976 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-w2zwp" Mar 20 08:34:56.553997 master-0 kubenswrapper[3976]: I0320 08:34:56.553934 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-fwvpv"] Mar 20 08:34:56.554623 master-0 kubenswrapper[3976]: E0320 08:34:56.554075 3976 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdfdabb8-83d6-4b38-a709-9e354062ba1a" containerName="assisted-installer-controller" Mar 20 08:34:56.554623 master-0 kubenswrapper[3976]: I0320 08:34:56.554094 3976 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfdabb8-83d6-4b38-a709-9e354062ba1a" containerName="assisted-installer-controller" Mar 20 08:34:56.554623 master-0 kubenswrapper[3976]: I0320 08:34:56.554134 3976 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdfdabb8-83d6-4b38-a709-9e354062ba1a" containerName="assisted-installer-controller" Mar 20 08:34:56.554623 master-0 kubenswrapper[3976]: I0320 08:34:56.554411 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-fwvpv" Mar 20 08:34:56.600135 master-0 kubenswrapper[3976]: I0320 08:34:56.600093 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcp9k\" (UniqueName: \"kubernetes.io/projected/412becc8-c1a7-422c-94d1-dd1849070ef1-kube-api-access-wcp9k\") pod \"mtu-prober-fwvpv\" (UID: \"412becc8-c1a7-422c-94d1-dd1849070ef1\") " pod="openshift-network-operator/mtu-prober-fwvpv" Mar 20 08:34:56.701045 master-0 kubenswrapper[3976]: I0320 08:34:56.700931 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcp9k\" (UniqueName: \"kubernetes.io/projected/412becc8-c1a7-422c-94d1-dd1849070ef1-kube-api-access-wcp9k\") pod \"mtu-prober-fwvpv\" (UID: \"412becc8-c1a7-422c-94d1-dd1849070ef1\") " pod="openshift-network-operator/mtu-prober-fwvpv" Mar 20 08:34:56.733341 master-0 kubenswrapper[3976]: I0320 08:34:56.733244 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcp9k\" (UniqueName: \"kubernetes.io/projected/412becc8-c1a7-422c-94d1-dd1849070ef1-kube-api-access-wcp9k\") pod \"mtu-prober-fwvpv\" (UID: \"412becc8-c1a7-422c-94d1-dd1849070ef1\") " pod="openshift-network-operator/mtu-prober-fwvpv" Mar 20 08:34:56.868902 master-0 kubenswrapper[3976]: I0320 08:34:56.868731 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-fwvpv" Mar 20 08:34:56.884822 master-0 kubenswrapper[3976]: W0320 08:34:56.884738 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod412becc8_c1a7_422c_94d1_dd1849070ef1.slice/crio-bb6de0a9ae1218db5c085d7b11d80bbaa7dc058173d109d69d1996de9307a7d5 WatchSource:0}: Error finding container bb6de0a9ae1218db5c085d7b11d80bbaa7dc058173d109d69d1996de9307a7d5: Status 404 returned error can't find the container with id bb6de0a9ae1218db5c085d7b11d80bbaa7dc058173d109d69d1996de9307a7d5 Mar 20 08:34:57.081008 master-0 kubenswrapper[3976]: I0320 08:34:57.080955 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-fwvpv" event={"ID":"412becc8-c1a7-422c-94d1-dd1849070ef1","Type":"ContainerStarted","Data":"bb6de0a9ae1218db5c085d7b11d80bbaa7dc058173d109d69d1996de9307a7d5"} Mar 20 08:34:58.086634 master-0 kubenswrapper[3976]: I0320 08:34:58.086533 3976 generic.go:334] "Generic (PLEG): container finished" podID="412becc8-c1a7-422c-94d1-dd1849070ef1" containerID="21b9803fda84668208544ea6b68c3d3a859b684d4b97f36df7e3a02f81f34399" exitCode=0 Mar 20 08:34:58.086634 master-0 kubenswrapper[3976]: I0320 08:34:58.086614 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-fwvpv" event={"ID":"412becc8-c1a7-422c-94d1-dd1849070ef1","Type":"ContainerDied","Data":"21b9803fda84668208544ea6b68c3d3a859b684d4b97f36df7e3a02f81f34399"} Mar 20 08:34:59.109625 master-0 kubenswrapper[3976]: I0320 08:34:59.109554 3976 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-fwvpv" Mar 20 08:34:59.120110 master-0 kubenswrapper[3976]: I0320 08:34:59.120037 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcp9k\" (UniqueName: \"kubernetes.io/projected/412becc8-c1a7-422c-94d1-dd1849070ef1-kube-api-access-wcp9k\") pod \"412becc8-c1a7-422c-94d1-dd1849070ef1\" (UID: \"412becc8-c1a7-422c-94d1-dd1849070ef1\") " Mar 20 08:34:59.123993 master-0 kubenswrapper[3976]: I0320 08:34:59.123921 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412becc8-c1a7-422c-94d1-dd1849070ef1-kube-api-access-wcp9k" (OuterVolumeSpecName: "kube-api-access-wcp9k") pod "412becc8-c1a7-422c-94d1-dd1849070ef1" (UID: "412becc8-c1a7-422c-94d1-dd1849070ef1"). InnerVolumeSpecName "kube-api-access-wcp9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:34:59.220788 master-0 kubenswrapper[3976]: I0320 08:34:59.220717 3976 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcp9k\" (UniqueName: \"kubernetes.io/projected/412becc8-c1a7-422c-94d1-dd1849070ef1-kube-api-access-wcp9k\") on node \"master-0\" DevicePath \"\"" Mar 20 08:35:00.094734 master-0 kubenswrapper[3976]: I0320 08:35:00.094621 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-fwvpv" event={"ID":"412becc8-c1a7-422c-94d1-dd1849070ef1","Type":"ContainerDied","Data":"bb6de0a9ae1218db5c085d7b11d80bbaa7dc058173d109d69d1996de9307a7d5"} Mar 20 08:35:00.094734 master-0 kubenswrapper[3976]: I0320 08:35:00.094685 3976 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-fwvpv" Mar 20 08:35:00.094734 master-0 kubenswrapper[3976]: I0320 08:35:00.094714 3976 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb6de0a9ae1218db5c085d7b11d80bbaa7dc058173d109d69d1996de9307a7d5" Mar 20 08:35:01.561150 master-0 kubenswrapper[3976]: I0320 08:35:01.561056 3976 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-fwvpv"] Mar 20 08:35:01.563671 master-0 kubenswrapper[3976]: I0320 08:35:01.563608 3976 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-fwvpv"] Mar 20 08:35:01.649551 master-0 kubenswrapper[3976]: I0320 08:35:01.649434 3976 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="412becc8-c1a7-422c-94d1-dd1849070ef1" path="/var/lib/kubelet/pods/412becc8-c1a7-422c-94d1-dd1849070ef1/volumes" Mar 20 08:35:02.350297 master-0 kubenswrapper[3976]: I0320 08:35:02.350175 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:35:02.350605 master-0 kubenswrapper[3976]: E0320 08:35:02.350459 3976 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 20 08:35:02.350699 master-0 kubenswrapper[3976]: E0320 08:35:02.350629 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert podName:dab97c35-fe60-4134-8715-a7c6dd085fb3 nodeName:}" failed. No retries permitted until 2026-03-20 08:35:18.350600182 +0000 UTC m=+89.619423469 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert") pod "cluster-version-operator-56d8475767-9gfkg" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3") : secret "cluster-version-operator-serving-cert" not found Mar 20 08:35:06.421725 master-0 kubenswrapper[3976]: I0320 08:35:06.421613 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-2fp4b"] Mar 20 08:35:06.421725 master-0 kubenswrapper[3976]: E0320 08:35:06.421730 3976 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412becc8-c1a7-422c-94d1-dd1849070ef1" containerName="prober" Mar 20 08:35:06.422929 master-0 kubenswrapper[3976]: I0320 08:35:06.421755 3976 state_mem.go:107] "Deleted CPUSet assignment" podUID="412becc8-c1a7-422c-94d1-dd1849070ef1" containerName="prober" Mar 20 08:35:06.422929 master-0 kubenswrapper[3976]: I0320 08:35:06.421808 3976 memory_manager.go:354] "RemoveStaleState removing state" podUID="412becc8-c1a7-422c-94d1-dd1849070ef1" containerName="prober" Mar 20 08:35:06.422929 master-0 kubenswrapper[3976]: I0320 08:35:06.422162 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.425515 master-0 kubenswrapper[3976]: I0320 08:35:06.425444 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 08:35:06.428902 master-0 kubenswrapper[3976]: I0320 08:35:06.428829 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 08:35:06.429270 master-0 kubenswrapper[3976]: I0320 08:35:06.429179 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 08:35:06.429383 master-0 kubenswrapper[3976]: I0320 08:35:06.429286 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.478568 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-kubelet\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.478640 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-etc-kubernetes\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.478686 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " 
pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.478709 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-os-release\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.478736 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cni-binary-copy\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.478759 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-conf-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.478832 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxg4z\" (UniqueName: \"kubernetes.io/projected/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-kube-api-access-fxg4z\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.478902 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-socket-dir-parent\") pod \"multus-2fp4b\" (UID: 
\"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.478942 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cnibin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.478992 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-multus-certs\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.479041 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-system-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.479090 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-daemon-config\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.479138 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-k8s-cni-cncf-io\") pod \"multus-2fp4b\" 
(UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.479217 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-multus\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.479270 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-hostroot\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.479342 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-netns\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.479591 master-0 kubenswrapper[3976]: I0320 08:35:06.479397 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-bin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.580322 master-0 kubenswrapper[3976]: I0320 08:35:06.580248 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-kubelet\") pod \"multus-2fp4b\" (UID: 
\"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.580322 master-0 kubenswrapper[3976]: I0320 08:35:06.580327 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-etc-kubernetes\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.580577 master-0 kubenswrapper[3976]: I0320 08:35:06.580391 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-etc-kubernetes\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.580577 master-0 kubenswrapper[3976]: I0320 08:35:06.580445 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-kubelet\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.580577 master-0 kubenswrapper[3976]: I0320 08:35:06.580495 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.580727 master-0 kubenswrapper[3976]: I0320 08:35:06.580690 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-os-release\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.580781 master-0 
kubenswrapper[3976]: I0320 08:35:06.580737 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cni-binary-copy\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.580781 master-0 kubenswrapper[3976]: I0320 08:35:06.580765 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-conf-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.580857 master-0 kubenswrapper[3976]: I0320 08:35:06.580790 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxg4z\" (UniqueName: \"kubernetes.io/projected/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-kube-api-access-fxg4z\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.580857 master-0 kubenswrapper[3976]: I0320 08:35:06.580815 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-socket-dir-parent\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.580857 master-0 kubenswrapper[3976]: I0320 08:35:06.580838 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-os-release\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.580857 master-0 kubenswrapper[3976]: I0320 08:35:06.580843 3976 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cnibin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581000 master-0 kubenswrapper[3976]: I0320 08:35:06.580883 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-multus-certs\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581000 master-0 kubenswrapper[3976]: I0320 08:35:06.580897 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cnibin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581000 master-0 kubenswrapper[3976]: I0320 08:35:06.580922 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-system-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581000 master-0 kubenswrapper[3976]: I0320 08:35:06.580941 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-conf-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581000 master-0 kubenswrapper[3976]: I0320 08:35:06.580957 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-daemon-config\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581168 master-0 kubenswrapper[3976]: I0320 08:35:06.580991 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581168 master-0 kubenswrapper[3976]: I0320 08:35:06.581022 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-multus-certs\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581168 master-0 kubenswrapper[3976]: I0320 08:35:06.581011 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-k8s-cni-cncf-io\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581168 master-0 kubenswrapper[3976]: I0320 08:35:06.581046 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-k8s-cni-cncf-io\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581168 master-0 kubenswrapper[3976]: I0320 08:35:06.581128 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-socket-dir-parent\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581168 master-0 kubenswrapper[3976]: I0320 08:35:06.581148 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-multus\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581433 master-0 kubenswrapper[3976]: I0320 08:35:06.581221 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-hostroot\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581433 master-0 kubenswrapper[3976]: I0320 08:35:06.581262 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-netns\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581433 master-0 kubenswrapper[3976]: I0320 08:35:06.581287 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-multus\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581433 master-0 kubenswrapper[3976]: I0320 08:35:06.581302 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-bin\") 
pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581433 master-0 kubenswrapper[3976]: I0320 08:35:06.581336 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-hostroot\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581433 master-0 kubenswrapper[3976]: I0320 08:35:06.581292 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-system-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581433 master-0 kubenswrapper[3976]: I0320 08:35:06.581399 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-bin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.581433 master-0 kubenswrapper[3976]: I0320 08:35:06.581401 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-netns\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.582375 master-0 kubenswrapper[3976]: I0320 08:35:06.582342 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cni-binary-copy\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.582418 master-0 
kubenswrapper[3976]: I0320 08:35:06.582343 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-daemon-config\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.608365 master-0 kubenswrapper[3976]: I0320 08:35:06.608313 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-rpbcn"] Mar 20 08:35:06.608908 master-0 kubenswrapper[3976]: I0320 08:35:06.608886 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.611904 master-0 kubenswrapper[3976]: I0320 08:35:06.611741 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config" Mar 20 08:35:06.612354 master-0 kubenswrapper[3976]: I0320 08:35:06.612280 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 08:35:06.622239 master-0 kubenswrapper[3976]: I0320 08:35:06.622068 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxg4z\" (UniqueName: \"kubernetes.io/projected/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-kube-api-access-fxg4z\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.682022 master-0 kubenswrapper[3976]: I0320 08:35:06.681849 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-os-release\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.682022 master-0 
kubenswrapper[3976]: I0320 08:35:06.681896 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.682022 master-0 kubenswrapper[3976]: I0320 08:35:06.681923 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.682022 master-0 kubenswrapper[3976]: I0320 08:35:06.681943 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-system-cni-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.682022 master-0 kubenswrapper[3976]: I0320 08:35:06.681958 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.682022 master-0 kubenswrapper[3976]: I0320 08:35:06.682002 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-cnibin\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.682630 master-0 kubenswrapper[3976]: I0320 08:35:06.682229 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-binary-copy\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.682630 master-0 kubenswrapper[3976]: I0320 08:35:06.682325 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c5ch\" (UniqueName: \"kubernetes.io/projected/b98b4efc-6117-487f-9cfc-38ce66dd9570-kube-api-access-6c5ch\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.749038 master-0 kubenswrapper[3976]: I0320 08:35:06.748965 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-2fp4b" Mar 20 08:35:06.783839 master-0 kubenswrapper[3976]: I0320 08:35:06.783264 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-system-cni-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.784005 master-0 kubenswrapper[3976]: I0320 08:35:06.783867 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.784005 master-0 kubenswrapper[3976]: I0320 08:35:06.783913 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-cnibin\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.784235 master-0 kubenswrapper[3976]: I0320 08:35:06.784146 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-cnibin\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.784325 master-0 kubenswrapper[3976]: I0320 08:35:06.783466 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-system-cni-dir\") pod 
\"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.784467 master-0 kubenswrapper[3976]: I0320 08:35:06.784377 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-binary-copy\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.784543 master-0 kubenswrapper[3976]: I0320 08:35:06.784480 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c5ch\" (UniqueName: \"kubernetes.io/projected/b98b4efc-6117-487f-9cfc-38ce66dd9570-kube-api-access-6c5ch\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.784805 master-0 kubenswrapper[3976]: I0320 08:35:06.784733 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-os-release\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.785126 master-0 kubenswrapper[3976]: I0320 08:35:06.785078 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-os-release\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.785456 master-0 kubenswrapper[3976]: I0320 08:35:06.785382 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.785550 master-0 kubenswrapper[3976]: I0320 08:35:06.785403 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.785550 master-0 kubenswrapper[3976]: I0320 08:35:06.785511 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.786604 master-0 kubenswrapper[3976]: I0320 08:35:06.786537 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-binary-copy\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.786699 master-0 kubenswrapper[3976]: I0320 08:35:06.785709 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 
08:35:06.787117 master-0 kubenswrapper[3976]: I0320 08:35:06.787033 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.814072 master-0 kubenswrapper[3976]: I0320 08:35:06.813925 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c5ch\" (UniqueName: \"kubernetes.io/projected/b98b4efc-6117-487f-9cfc-38ce66dd9570-kube-api-access-6c5ch\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.922716 master-0 kubenswrapper[3976]: I0320 08:35:06.922593 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:35:06.940034 master-0 kubenswrapper[3976]: W0320 08:35:06.939936 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb98b4efc_6117_487f_9cfc_38ce66dd9570.slice/crio-a49d9b3d48753f2f078ac0130c6af5ca5c251ec6fd9f68a980d833a929f8987e WatchSource:0}: Error finding container a49d9b3d48753f2f078ac0130c6af5ca5c251ec6fd9f68a980d833a929f8987e: Status 404 returned error can't find the container with id a49d9b3d48753f2f078ac0130c6af5ca5c251ec6fd9f68a980d833a929f8987e Mar 20 08:35:07.113935 master-0 kubenswrapper[3976]: I0320 08:35:07.113713 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerStarted","Data":"a49d9b3d48753f2f078ac0130c6af5ca5c251ec6fd9f68a980d833a929f8987e"} Mar 20 08:35:07.115920 
master-0 kubenswrapper[3976]: I0320 08:35:07.115833 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2fp4b" event={"ID":"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7","Type":"ContainerStarted","Data":"1cdce851761fa32d1bf31f749f94743d1cc0bca1a250ad08c114ccc4c2f77b9b"} Mar 20 08:35:07.399825 master-0 kubenswrapper[3976]: I0320 08:35:07.399734 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-srdjm"] Mar 20 08:35:07.403079 master-0 kubenswrapper[3976]: I0320 08:35:07.403022 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:07.403230 master-0 kubenswrapper[3976]: E0320 08:35:07.403140 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:07.492047 master-0 kubenswrapper[3976]: I0320 08:35:07.491970 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njjkt\" (UniqueName: \"kubernetes.io/projected/813f91c2-2b37-4681-968d-4217e286e22f-kube-api-access-njjkt\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:07.492047 master-0 kubenswrapper[3976]: I0320 08:35:07.492055 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:07.592478 master-0 kubenswrapper[3976]: I0320 08:35:07.592385 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njjkt\" (UniqueName: \"kubernetes.io/projected/813f91c2-2b37-4681-968d-4217e286e22f-kube-api-access-njjkt\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:07.592478 master-0 kubenswrapper[3976]: I0320 08:35:07.592488 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:07.592947 master-0 kubenswrapper[3976]: E0320 08:35:07.592665 3976 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Mar 20 08:35:07.592947 master-0 kubenswrapper[3976]: E0320 08:35:07.592738 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" failed. No retries permitted until 2026-03-20 08:35:08.092715285 +0000 UTC m=+79.361538602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:35:07.854290 master-0 kubenswrapper[3976]: I0320 08:35:07.854211 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njjkt\" (UniqueName: \"kubernetes.io/projected/813f91c2-2b37-4681-968d-4217e286e22f-kube-api-access-njjkt\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:08.096201 master-0 kubenswrapper[3976]: I0320 08:35:08.096120 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:08.096471 master-0 kubenswrapper[3976]: E0320 08:35:08.096337 3976 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:35:08.096471 master-0 kubenswrapper[3976]: E0320 08:35:08.096418 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" 
failed. No retries permitted until 2026-03-20 08:35:09.096399101 +0000 UTC m=+80.365222398 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:35:09.106495 master-0 kubenswrapper[3976]: I0320 08:35:09.106420 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:09.107379 master-0 kubenswrapper[3976]: E0320 08:35:09.106658 3976 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:35:09.107379 master-0 kubenswrapper[3976]: E0320 08:35:09.106739 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" failed. No retries permitted until 2026-03-20 08:35:11.106711972 +0000 UTC m=+82.375535259 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:35:09.642771 master-0 kubenswrapper[3976]: I0320 08:35:09.642707 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:09.643219 master-0 kubenswrapper[3976]: E0320 08:35:09.643137 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:10.125486 master-0 kubenswrapper[3976]: I0320 08:35:10.125350 3976 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="5772594b3f3e6aae19a5e357ad1c9bc0dade5e494667c07e21d51c8697d24253" exitCode=0 Mar 20 08:35:10.125486 master-0 kubenswrapper[3976]: I0320 08:35:10.125412 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerDied","Data":"5772594b3f3e6aae19a5e357ad1c9bc0dade5e494667c07e21d51c8697d24253"} Mar 20 08:35:11.143065 master-0 kubenswrapper[3976]: I0320 08:35:11.142906 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:11.144227 master-0 kubenswrapper[3976]: E0320 08:35:11.143110 3976 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:35:11.144227 master-0 kubenswrapper[3976]: E0320 08:35:11.143215 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs 
podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" failed. No retries permitted until 2026-03-20 08:35:15.143173195 +0000 UTC m=+86.411996572 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:35:11.644128 master-0 kubenswrapper[3976]: I0320 08:35:11.643702 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:11.644128 master-0 kubenswrapper[3976]: E0320 08:35:11.643894 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:13.682423 master-0 kubenswrapper[3976]: I0320 08:35:13.682369 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:13.683403 master-0 kubenswrapper[3976]: E0320 08:35:13.683369 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:14.658542 master-0 kubenswrapper[3976]: W0320 08:35:14.658345 3976 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 20 08:35:14.659245 master-0 kubenswrapper[3976]: I0320 08:35:14.659130 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 20 08:35:15.192995 master-0 kubenswrapper[3976]: I0320 08:35:15.192933 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:15.193669 master-0 kubenswrapper[3976]: E0320 08:35:15.193209 3976 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:35:15.193669 master-0 kubenswrapper[3976]: E0320 08:35:15.193385 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs 
podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" failed. No retries permitted until 2026-03-20 08:35:23.193327248 +0000 UTC m=+94.462150535 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:35:15.642588 master-0 kubenswrapper[3976]: I0320 08:35:15.642526 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:15.642965 master-0 kubenswrapper[3976]: E0320 08:35:15.642742 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:17.642803 master-0 kubenswrapper[3976]: I0320 08:35:17.642727 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:17.643452 master-0 kubenswrapper[3976]: E0320 08:35:17.642974 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:18.422121 master-0 kubenswrapper[3976]: I0320 08:35:18.422047 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:35:18.422387 master-0 kubenswrapper[3976]: E0320 08:35:18.422270 3976 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 20 08:35:18.422387 master-0 kubenswrapper[3976]: E0320 08:35:18.422338 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert podName:dab97c35-fe60-4134-8715-a7c6dd085fb3 nodeName:}" failed. No retries permitted until 2026-03-20 08:35:50.422316265 +0000 UTC m=+121.691139552 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert") pod "cluster-version-operator-56d8475767-9gfkg" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3") : secret "cluster-version-operator-serving-cert" not found Mar 20 08:35:18.812666 master-0 kubenswrapper[3976]: I0320 08:35:18.809794 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj"] Mar 20 08:35:18.812666 master-0 kubenswrapper[3976]: I0320 08:35:18.810219 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:18.814898 master-0 kubenswrapper[3976]: I0320 08:35:18.814253 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 08:35:18.814898 master-0 kubenswrapper[3976]: I0320 08:35:18.814405 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 08:35:18.814898 master-0 kubenswrapper[3976]: I0320 08:35:18.814512 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 08:35:18.814898 master-0 kubenswrapper[3976]: I0320 08:35:18.814653 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 08:35:18.814898 master-0 kubenswrapper[3976]: I0320 08:35:18.814656 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 08:35:18.849660 master-0 kubenswrapper[3976]: I0320 08:35:18.849310 3976 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=4.849289758 podStartE2EDuration="4.849289758s" podCreationTimestamp="2026-03-20 08:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:35:18.849166844 +0000 UTC m=+90.117990131" watchObservedRunningTime="2026-03-20 08:35:18.849289758 +0000 UTC m=+90.118113045" Mar 20 08:35:18.931229 master-0 kubenswrapper[3976]: I0320 08:35:18.927975 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: 
\"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:18.931229 master-0 kubenswrapper[3976]: I0320 08:35:18.928036 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:18.931229 master-0 kubenswrapper[3976]: I0320 08:35:18.928062 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd6vv\" (UniqueName: \"kubernetes.io/projected/5e3ddf9e-eeb5-4266-b675-092fd4e27623-kube-api-access-xd6vv\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:18.931229 master-0 kubenswrapper[3976]: I0320 08:35:18.928229 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-env-overrides\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:19.020289 master-0 kubenswrapper[3976]: I0320 08:35:19.019206 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nsmzc"] Mar 20 08:35:19.020708 master-0 kubenswrapper[3976]: I0320 08:35:19.020684 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.023943 master-0 kubenswrapper[3976]: I0320 08:35:19.023835 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 08:35:19.024920 master-0 kubenswrapper[3976]: I0320 08:35:19.024890 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 08:35:19.029128 master-0 kubenswrapper[3976]: I0320 08:35:19.029090 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-kubelet\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029128 master-0 kubenswrapper[3976]: I0320 08:35:19.029123 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-cni-bin\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029240 master-0 kubenswrapper[3976]: I0320 08:35:19.029156 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:19.029240 master-0 kubenswrapper[3976]: I0320 08:35:19.029175 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-systemd-units\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029240 master-0 kubenswrapper[3976]: I0320 08:35:19.029201 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-systemd\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029240 master-0 kubenswrapper[3976]: I0320 08:35:19.029220 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:19.029240 master-0 kubenswrapper[3976]: I0320 08:35:19.029235 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-etc-openvswitch\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029406 master-0 kubenswrapper[3976]: I0320 08:35:19.029254 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr5s8\" (UniqueName: \"kubernetes.io/projected/d3dc1be4-f742-47cc-95b6-82e0bc34a716-kube-api-access-xr5s8\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029406 master-0 kubenswrapper[3976]: I0320 
08:35:19.029271 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-run-netns\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029406 master-0 kubenswrapper[3976]: I0320 08:35:19.029286 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovnkube-config\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029406 master-0 kubenswrapper[3976]: I0320 08:35:19.029299 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-ovn\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029406 master-0 kubenswrapper[3976]: I0320 08:35:19.029318 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-run-ovn-kubernetes\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029406 master-0 kubenswrapper[3976]: I0320 08:35:19.029334 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd6vv\" (UniqueName: \"kubernetes.io/projected/5e3ddf9e-eeb5-4266-b675-092fd4e27623-kube-api-access-xd6vv\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:19.029406 master-0 kubenswrapper[3976]: I0320 08:35:19.029350 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-env-overrides\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029406 master-0 kubenswrapper[3976]: I0320 08:35:19.029370 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-slash\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029406 master-0 kubenswrapper[3976]: I0320 08:35:19.029390 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-env-overrides\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:19.029406 master-0 kubenswrapper[3976]: I0320 08:35:19.029408 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-cni-netd\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029670 master-0 kubenswrapper[3976]: I0320 08:35:19.029427 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029670 master-0 kubenswrapper[3976]: I0320 08:35:19.029446 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-node-log\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029670 master-0 kubenswrapper[3976]: I0320 08:35:19.029464 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovnkube-script-lib\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029670 master-0 kubenswrapper[3976]: I0320 08:35:19.029482 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-openvswitch\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029670 master-0 kubenswrapper[3976]: I0320 08:35:19.029498 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-log-socket\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029670 master-0 kubenswrapper[3976]: I0320 08:35:19.029516 3976 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovn-node-metrics-cert\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.029670 master-0 kubenswrapper[3976]: I0320 08:35:19.029533 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-var-lib-openvswitch\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.030207 master-0 kubenswrapper[3976]: I0320 08:35:19.030158 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:19.031591 master-0 kubenswrapper[3976]: I0320 08:35:19.031557 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-env-overrides\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:19.046770 master-0 kubenswrapper[3976]: I0320 08:35:19.041930 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:19.055166 master-0 kubenswrapper[3976]: I0320 08:35:19.055110 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd6vv\" (UniqueName: \"kubernetes.io/projected/5e3ddf9e-eeb5-4266-b675-092fd4e27623-kube-api-access-xd6vv\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:19.129861 master-0 kubenswrapper[3976]: I0320 08:35:19.129771 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-cni-bin\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.129861 master-0 kubenswrapper[3976]: I0320 08:35:19.129822 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-systemd-units\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.129861 master-0 kubenswrapper[3976]: I0320 08:35:19.129842 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-systemd\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130061 master-0 kubenswrapper[3976]: I0320 08:35:19.129888 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-systemd\") pod \"ovnkube-node-nsmzc\" (UID: 
\"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130061 master-0 kubenswrapper[3976]: I0320 08:35:19.129897 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-systemd-units\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130061 master-0 kubenswrapper[3976]: I0320 08:35:19.129919 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-cni-bin\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130061 master-0 kubenswrapper[3976]: I0320 08:35:19.129935 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-etc-openvswitch\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130061 master-0 kubenswrapper[3976]: I0320 08:35:19.129955 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr5s8\" (UniqueName: \"kubernetes.io/projected/d3dc1be4-f742-47cc-95b6-82e0bc34a716-kube-api-access-xr5s8\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130061 master-0 kubenswrapper[3976]: I0320 08:35:19.129971 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-run-netns\") pod \"ovnkube-node-nsmzc\" (UID: 
\"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130061 master-0 kubenswrapper[3976]: I0320 08:35:19.129987 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovnkube-config\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130061 master-0 kubenswrapper[3976]: I0320 08:35:19.130001 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-ovn\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130061 master-0 kubenswrapper[3976]: I0320 08:35:19.130019 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-run-ovn-kubernetes\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130061 master-0 kubenswrapper[3976]: I0320 08:35:19.130036 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-env-overrides\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130061 master-0 kubenswrapper[3976]: I0320 08:35:19.130061 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-slash\") pod \"ovnkube-node-nsmzc\" (UID: 
\"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130078 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-cni-netd\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130094 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130115 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-node-log\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130132 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovnkube-script-lib\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130151 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-openvswitch\") pod 
\"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130167 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-log-socket\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130214 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovn-node-metrics-cert\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130231 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-var-lib-openvswitch\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130254 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-slash\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130262 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-var-lib-openvswitch\") pod 
\"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130281 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-cni-netd\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130292 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-run-netns\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130309 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130442 master-0 kubenswrapper[3976]: I0320 08:35:19.130333 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-node-log\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130790 master-0 kubenswrapper[3976]: I0320 08:35:19.130531 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-etc-openvswitch\") pod 
\"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130790 master-0 kubenswrapper[3976]: I0320 08:35:19.130578 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-log-socket\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130790 master-0 kubenswrapper[3976]: I0320 08:35:19.130683 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-kubelet\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.130790 master-0 kubenswrapper[3976]: I0320 08:35:19.130764 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-kubelet\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.131016 master-0 kubenswrapper[3976]: I0320 08:35:19.130887 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-run-ovn-kubernetes\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.131016 master-0 kubenswrapper[3976]: I0320 08:35:19.130923 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-openvswitch\") pod \"ovnkube-node-nsmzc\" (UID: 
\"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.131016 master-0 kubenswrapper[3976]: I0320 08:35:19.130938 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovnkube-script-lib\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.131016 master-0 kubenswrapper[3976]: I0320 08:35:19.130972 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-ovn\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.131301 master-0 kubenswrapper[3976]: I0320 08:35:19.131263 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-env-overrides\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.131491 master-0 kubenswrapper[3976]: I0320 08:35:19.131464 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovnkube-config\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.137545 master-0 kubenswrapper[3976]: I0320 08:35:19.137512 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovn-node-metrics-cert\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.143525 master-0 kubenswrapper[3976]: I0320 08:35:19.143503 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:35:19.154320 master-0 kubenswrapper[3976]: I0320 08:35:19.154292 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr5s8\" (UniqueName: \"kubernetes.io/projected/d3dc1be4-f742-47cc-95b6-82e0bc34a716-kube-api-access-xr5s8\") pod \"ovnkube-node-nsmzc\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.336038 master-0 kubenswrapper[3976]: I0320 08:35:19.335998 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:19.645082 master-0 kubenswrapper[3976]: I0320 08:35:19.644972 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:19.645082 master-0 kubenswrapper[3976]: E0320 08:35:19.645094 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:19.657127 master-0 kubenswrapper[3976]: I0320 08:35:19.657066 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 20 08:35:21.643576 master-0 kubenswrapper[3976]: I0320 08:35:21.643500 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:21.644473 master-0 kubenswrapper[3976]: E0320 08:35:21.643736 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:21.656825 master-0 kubenswrapper[3976]: I0320 08:35:21.656753 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 20 08:35:22.000447 master-0 kubenswrapper[3976]: I0320 08:35:21.999804 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xnrw6"] Mar 20 08:35:22.000447 master-0 kubenswrapper[3976]: I0320 08:35:22.000203 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:22.000447 master-0 kubenswrapper[3976]: E0320 08:35:22.000263 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:22.030796 master-0 kubenswrapper[3976]: I0320 08:35:22.029722 3976 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=3.029694399 podStartE2EDuration="3.029694399s" podCreationTimestamp="2026-03-20 08:35:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:35:22.029635167 +0000 UTC m=+93.298458454" watchObservedRunningTime="2026-03-20 08:35:22.029694399 +0000 UTC m=+93.298517696" Mar 20 08:35:22.030796 master-0 kubenswrapper[3976]: I0320 08:35:22.030444 3976 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=1.030436791 podStartE2EDuration="1.030436791s" podCreationTimestamp="2026-03-20 08:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:35:22.013904102 +0000 UTC m=+93.282727389" watchObservedRunningTime="2026-03-20 08:35:22.030436791 +0000 UTC m=+93.299260078" Mar 20 08:35:22.155292 master-0 kubenswrapper[3976]: I0320 08:35:22.155229 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2zzd\" (UniqueName: \"kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd\") pod \"network-check-target-xnrw6\" (UID: \"c0142d4e-9fd4-4375-a773-bb89b38af654\") " pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:22.256501 master-0 kubenswrapper[3976]: I0320 08:35:22.256340 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2zzd\" (UniqueName: 
\"kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd\") pod \"network-check-target-xnrw6\" (UID: \"c0142d4e-9fd4-4375-a773-bb89b38af654\") " pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:22.269347 master-0 kubenswrapper[3976]: E0320 08:35:22.269246 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:35:22.269347 master-0 kubenswrapper[3976]: E0320 08:35:22.269291 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:35:22.269347 master-0 kubenswrapper[3976]: E0320 08:35:22.269306 3976 projected.go:194] Error preparing data for projected volume kube-api-access-w2zzd for pod openshift-network-diagnostics/network-check-target-xnrw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:22.269347 master-0 kubenswrapper[3976]: E0320 08:35:22.269379 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd podName:c0142d4e-9fd4-4375-a773-bb89b38af654 nodeName:}" failed. No retries permitted until 2026-03-20 08:35:22.769357501 +0000 UTC m=+94.038180788 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w2zzd" (UniqueName: "kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd") pod "network-check-target-xnrw6" (UID: "c0142d4e-9fd4-4375-a773-bb89b38af654") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:22.862010 master-0 kubenswrapper[3976]: I0320 08:35:22.861931 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2zzd\" (UniqueName: \"kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd\") pod \"network-check-target-xnrw6\" (UID: \"c0142d4e-9fd4-4375-a773-bb89b38af654\") " pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:22.862595 master-0 kubenswrapper[3976]: E0320 08:35:22.862129 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:35:22.862595 master-0 kubenswrapper[3976]: E0320 08:35:22.862145 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:35:22.862595 master-0 kubenswrapper[3976]: E0320 08:35:22.862157 3976 projected.go:194] Error preparing data for projected volume kube-api-access-w2zzd for pod openshift-network-diagnostics/network-check-target-xnrw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:22.862595 master-0 kubenswrapper[3976]: E0320 08:35:22.862236 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd podName:c0142d4e-9fd4-4375-a773-bb89b38af654 nodeName:}" 
failed. No retries permitted until 2026-03-20 08:35:23.862216753 +0000 UTC m=+95.131040040 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-w2zzd" (UniqueName: "kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd") pod "network-check-target-xnrw6" (UID: "c0142d4e-9fd4-4375-a773-bb89b38af654") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:23.040209 master-0 kubenswrapper[3976]: W0320 08:35:23.040049 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3dc1be4_f742_47cc_95b6_82e0bc34a716.slice/crio-156eb04894ecec25bfbcd06dc0b87578738f576b411e8c7c85f2a4cc0e48d6f0 WatchSource:0}: Error finding container 156eb04894ecec25bfbcd06dc0b87578738f576b411e8c7c85f2a4cc0e48d6f0: Status 404 returned error can't find the container with id 156eb04894ecec25bfbcd06dc0b87578738f576b411e8c7c85f2a4cc0e48d6f0 Mar 20 08:35:23.157097 master-0 kubenswrapper[3976]: I0320 08:35:23.157031 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" event={"ID":"5e3ddf9e-eeb5-4266-b675-092fd4e27623","Type":"ContainerStarted","Data":"f63327bd315d50a90d1b9877abdb96105e01d6fdb0e17116424c13ecbe62dbc3"} Mar 20 08:35:23.158167 master-0 kubenswrapper[3976]: I0320 08:35:23.158126 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerStarted","Data":"156eb04894ecec25bfbcd06dc0b87578738f576b411e8c7c85f2a4cc0e48d6f0"} Mar 20 08:35:23.264935 master-0 kubenswrapper[3976]: I0320 08:35:23.264371 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:23.264935 master-0 kubenswrapper[3976]: E0320 08:35:23.264615 3976 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:35:23.265105 master-0 kubenswrapper[3976]: E0320 08:35:23.265020 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" failed. No retries permitted until 2026-03-20 08:35:39.264989291 +0000 UTC m=+110.533812578 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:35:23.643146 master-0 kubenswrapper[3976]: I0320 08:35:23.642964 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:23.643398 master-0 kubenswrapper[3976]: I0320 08:35:23.642969 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:23.643398 master-0 kubenswrapper[3976]: E0320 08:35:23.643164 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:23.643398 master-0 kubenswrapper[3976]: E0320 08:35:23.643340 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:23.870435 master-0 kubenswrapper[3976]: I0320 08:35:23.870340 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2zzd\" (UniqueName: \"kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd\") pod \"network-check-target-xnrw6\" (UID: \"c0142d4e-9fd4-4375-a773-bb89b38af654\") " pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:23.871175 master-0 kubenswrapper[3976]: E0320 08:35:23.870709 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:35:23.871175 master-0 kubenswrapper[3976]: E0320 08:35:23.870772 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:35:23.871175 master-0 kubenswrapper[3976]: E0320 08:35:23.870793 3976 projected.go:194] Error preparing data for projected volume kube-api-access-w2zzd for pod openshift-network-diagnostics/network-check-target-xnrw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:23.871175 master-0 kubenswrapper[3976]: E0320 08:35:23.870909 3976 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd podName:c0142d4e-9fd4-4375-a773-bb89b38af654 nodeName:}" failed. No retries permitted until 2026-03-20 08:35:25.870878455 +0000 UTC m=+97.139701962 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-w2zzd" (UniqueName: "kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd") pod "network-check-target-xnrw6" (UID: "c0142d4e-9fd4-4375-a773-bb89b38af654") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:24.165612 master-0 kubenswrapper[3976]: I0320 08:35:24.165549 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" event={"ID":"5e3ddf9e-eeb5-4266-b675-092fd4e27623","Type":"ContainerStarted","Data":"a134abc184f79415563956f2eeb439b259ce0571570a2fb953199c779754242d"} Mar 20 08:35:24.167982 master-0 kubenswrapper[3976]: I0320 08:35:24.167932 3976 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="2f1eef10f4235bf6943bb1062fc964d69fc5c901795041a7ddca120ef33de66d" exitCode=0 Mar 20 08:35:24.168044 master-0 kubenswrapper[3976]: I0320 08:35:24.167949 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerDied","Data":"2f1eef10f4235bf6943bb1062fc964d69fc5c901795041a7ddca120ef33de66d"} Mar 20 08:35:24.171427 master-0 kubenswrapper[3976]: I0320 08:35:24.171375 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2fp4b" event={"ID":"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7","Type":"ContainerStarted","Data":"2e3f8fb15f65cb56f636062e77511d2b7c7ac1c5b96ff94db9a664613cc3a72a"} Mar 
20 08:35:24.247088 master-0 kubenswrapper[3976]: I0320 08:35:24.246666 3976 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2fp4b" podStartSLOduration=1.920647716 podStartE2EDuration="18.246640651s" podCreationTimestamp="2026-03-20 08:35:06 +0000 UTC" firstStartedPulling="2026-03-20 08:35:06.774275942 +0000 UTC m=+78.043099269" lastFinishedPulling="2026-03-20 08:35:23.100268917 +0000 UTC m=+94.369092204" observedRunningTime="2026-03-20 08:35:24.246519317 +0000 UTC m=+95.515342604" watchObservedRunningTime="2026-03-20 08:35:24.246640651 +0000 UTC m=+95.515463938" Mar 20 08:35:24.843656 master-0 kubenswrapper[3976]: I0320 08:35:24.843601 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:24.844010 master-0 kubenswrapper[3976]: E0320 08:35:24.843739 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:24.844986 master-0 kubenswrapper[3976]: I0320 08:35:24.844855 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-6t5vb"] Mar 20 08:35:24.845902 master-0 kubenswrapper[3976]: I0320 08:35:24.845873 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:24.856288 master-0 kubenswrapper[3976]: I0320 08:35:24.852943 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 08:35:24.856288 master-0 kubenswrapper[3976]: I0320 08:35:24.853015 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 08:35:24.856288 master-0 kubenswrapper[3976]: I0320 08:35:24.852961 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 08:35:24.856288 master-0 kubenswrapper[3976]: I0320 08:35:24.856098 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 08:35:24.858621 master-0 kubenswrapper[3976]: I0320 08:35:24.858416 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 08:35:25.045772 master-0 kubenswrapper[3976]: I0320 08:35:25.045693 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-env-overrides\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:25.046366 master-0 kubenswrapper[3976]: I0320 08:35:25.045829 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb0fc10f-5796-4cd5-b8f5-72d678054c24-webhook-cert\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:25.046366 master-0 
kubenswrapper[3976]: I0320 08:35:25.046029 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-ovnkube-identity-cm\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:25.046366 master-0 kubenswrapper[3976]: I0320 08:35:25.046128 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh47j\" (UniqueName: \"kubernetes.io/projected/fb0fc10f-5796-4cd5-b8f5-72d678054c24-kube-api-access-lh47j\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:25.147117 master-0 kubenswrapper[3976]: I0320 08:35:25.146972 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-ovnkube-identity-cm\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:25.147117 master-0 kubenswrapper[3976]: I0320 08:35:25.147042 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh47j\" (UniqueName: \"kubernetes.io/projected/fb0fc10f-5796-4cd5-b8f5-72d678054c24-kube-api-access-lh47j\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:25.147363 master-0 kubenswrapper[3976]: I0320 08:35:25.147319 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-env-overrides\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:25.147663 master-0 kubenswrapper[3976]: I0320 08:35:25.147612 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb0fc10f-5796-4cd5-b8f5-72d678054c24-webhook-cert\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:25.148259 master-0 kubenswrapper[3976]: I0320 08:35:25.148205 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-env-overrides\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:25.149667 master-0 kubenswrapper[3976]: I0320 08:35:25.149636 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-ovnkube-identity-cm\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:25.154094 master-0 kubenswrapper[3976]: I0320 08:35:25.154052 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb0fc10f-5796-4cd5-b8f5-72d678054c24-webhook-cert\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:25.166657 master-0 kubenswrapper[3976]: I0320 08:35:25.166616 3976 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lh47j\" (UniqueName: \"kubernetes.io/projected/fb0fc10f-5796-4cd5-b8f5-72d678054c24-kube-api-access-lh47j\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:25.169916 master-0 kubenswrapper[3976]: I0320 08:35:25.169322 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:35:25.180760 master-0 kubenswrapper[3976]: W0320 08:35:25.180708 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb0fc10f_5796_4cd5_b8f5_72d678054c24.slice/crio-37ba8dc9671f8fe4b5333f2dc9ab294ad8d004712a5f4bddaf2f6742452a4b3c WatchSource:0}: Error finding container 37ba8dc9671f8fe4b5333f2dc9ab294ad8d004712a5f4bddaf2f6742452a4b3c: Status 404 returned error can't find the container with id 37ba8dc9671f8fe4b5333f2dc9ab294ad8d004712a5f4bddaf2f6742452a4b3c Mar 20 08:35:25.642500 master-0 kubenswrapper[3976]: I0320 08:35:25.642426 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:25.642816 master-0 kubenswrapper[3976]: E0320 08:35:25.642781 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:25.659441 master-0 kubenswrapper[3976]: I0320 08:35:25.659389 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 20 08:35:25.956155 master-0 kubenswrapper[3976]: I0320 08:35:25.955596 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2zzd\" (UniqueName: \"kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd\") pod \"network-check-target-xnrw6\" (UID: \"c0142d4e-9fd4-4375-a773-bb89b38af654\") " pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:25.956155 master-0 kubenswrapper[3976]: E0320 08:35:25.955632 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:35:25.956155 master-0 kubenswrapper[3976]: E0320 08:35:25.955680 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:35:25.956155 master-0 kubenswrapper[3976]: E0320 08:35:25.955693 3976 projected.go:194] Error preparing data for projected volume kube-api-access-w2zzd for pod openshift-network-diagnostics/network-check-target-xnrw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:25.956155 master-0 kubenswrapper[3976]: E0320 08:35:25.955748 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd podName:c0142d4e-9fd4-4375-a773-bb89b38af654 nodeName:}" failed. No retries permitted until 2026-03-20 08:35:29.955731578 +0000 UTC m=+101.224554865 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w2zzd" (UniqueName: "kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd") pod "network-check-target-xnrw6" (UID: "c0142d4e-9fd4-4375-a773-bb89b38af654") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:26.181810 master-0 kubenswrapper[3976]: I0320 08:35:26.181744 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-6t5vb" event={"ID":"fb0fc10f-5796-4cd5-b8f5-72d678054c24","Type":"ContainerStarted","Data":"37ba8dc9671f8fe4b5333f2dc9ab294ad8d004712a5f4bddaf2f6742452a4b3c"} Mar 20 08:35:26.643907 master-0 kubenswrapper[3976]: I0320 08:35:26.643019 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:26.643907 master-0 kubenswrapper[3976]: E0320 08:35:26.643345 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:27.189019 master-0 kubenswrapper[3976]: I0320 08:35:27.188872 3976 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="d7a9f93548b4324f9218b5fb15026983da36f57336679426ecdeef802c274095" exitCode=0 Mar 20 08:35:27.189019 master-0 kubenswrapper[3976]: I0320 08:35:27.188957 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerDied","Data":"d7a9f93548b4324f9218b5fb15026983da36f57336679426ecdeef802c274095"} Mar 20 08:35:27.232629 master-0 kubenswrapper[3976]: I0320 08:35:27.232455 3976 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=2.232411495 podStartE2EDuration="2.232411495s" podCreationTimestamp="2026-03-20 08:35:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:35:27.232084244 +0000 UTC m=+98.500907531" watchObservedRunningTime="2026-03-20 08:35:27.232411495 +0000 UTC m=+98.501234782" Mar 20 08:35:27.644800 master-0 kubenswrapper[3976]: I0320 08:35:27.644744 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:27.645015 master-0 kubenswrapper[3976]: E0320 08:35:27.644881 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:28.642621 master-0 kubenswrapper[3976]: I0320 08:35:28.642553 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:28.643215 master-0 kubenswrapper[3976]: E0320 08:35:28.642675 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:29.642598 master-0 kubenswrapper[3976]: I0320 08:35:29.642560 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:29.644466 master-0 kubenswrapper[3976]: E0320 08:35:29.644237 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:30.007421 master-0 kubenswrapper[3976]: I0320 08:35:30.007278 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2zzd\" (UniqueName: \"kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd\") pod \"network-check-target-xnrw6\" (UID: \"c0142d4e-9fd4-4375-a773-bb89b38af654\") " pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:30.007724 master-0 kubenswrapper[3976]: E0320 08:35:30.007529 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:35:30.007724 master-0 kubenswrapper[3976]: E0320 08:35:30.007570 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:35:30.007724 master-0 kubenswrapper[3976]: E0320 08:35:30.007587 3976 projected.go:194] Error preparing data for projected volume kube-api-access-w2zzd for pod openshift-network-diagnostics/network-check-target-xnrw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:30.007724 master-0 kubenswrapper[3976]: E0320 08:35:30.007663 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd podName:c0142d4e-9fd4-4375-a773-bb89b38af654 nodeName:}" failed. No retries permitted until 2026-03-20 08:35:38.007640624 +0000 UTC m=+109.276463911 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w2zzd" (UniqueName: "kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd") pod "network-check-target-xnrw6" (UID: "c0142d4e-9fd4-4375-a773-bb89b38af654") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:30.201266 master-0 kubenswrapper[3976]: I0320 08:35:30.201172 3976 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="ca1d7ca00a56b55ea93c4440e9e959ff93d3c3b08431ba60809fba320b9496a7" exitCode=0 Mar 20 08:35:30.201266 master-0 kubenswrapper[3976]: I0320 08:35:30.201244 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerDied","Data":"ca1d7ca00a56b55ea93c4440e9e959ff93d3c3b08431ba60809fba320b9496a7"} Mar 20 08:35:30.642734 master-0 kubenswrapper[3976]: I0320 08:35:30.642645 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:30.643002 master-0 kubenswrapper[3976]: E0320 08:35:30.642808 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:31.642748 master-0 kubenswrapper[3976]: I0320 08:35:31.642633 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:31.643537 master-0 kubenswrapper[3976]: E0320 08:35:31.642932 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:32.643012 master-0 kubenswrapper[3976]: I0320 08:35:32.642944 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:32.643757 master-0 kubenswrapper[3976]: E0320 08:35:32.643134 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:33.643046 master-0 kubenswrapper[3976]: I0320 08:35:33.642866 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:33.643661 master-0 kubenswrapper[3976]: E0320 08:35:33.643297 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:34.642323 master-0 kubenswrapper[3976]: I0320 08:35:34.642255 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:34.642589 master-0 kubenswrapper[3976]: E0320 08:35:34.642418 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:35.655263 master-0 kubenswrapper[3976]: I0320 08:35:35.655207 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:35.655914 master-0 kubenswrapper[3976]: E0320 08:35:35.655565 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:36.642463 master-0 kubenswrapper[3976]: I0320 08:35:36.642371 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:36.642709 master-0 kubenswrapper[3976]: E0320 08:35:36.642571 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:37.644637 master-0 kubenswrapper[3976]: I0320 08:35:37.644565 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:37.645326 master-0 kubenswrapper[3976]: E0320 08:35:37.644789 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:38.074475 master-0 kubenswrapper[3976]: I0320 08:35:38.074428 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2zzd\" (UniqueName: \"kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd\") pod \"network-check-target-xnrw6\" (UID: \"c0142d4e-9fd4-4375-a773-bb89b38af654\") " pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:38.074837 master-0 kubenswrapper[3976]: E0320 08:35:38.074673 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:35:38.074837 master-0 kubenswrapper[3976]: E0320 08:35:38.074721 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:35:38.074837 master-0 kubenswrapper[3976]: E0320 08:35:38.074736 3976 projected.go:194] Error preparing data for projected volume kube-api-access-w2zzd for pod openshift-network-diagnostics/network-check-target-xnrw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:38.074837 master-0 kubenswrapper[3976]: E0320 08:35:38.074810 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd podName:c0142d4e-9fd4-4375-a773-bb89b38af654 nodeName:}" failed. No retries permitted until 2026-03-20 08:35:54.07478845 +0000 UTC m=+125.343611737 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w2zzd" (UniqueName: "kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd") pod "network-check-target-xnrw6" (UID: "c0142d4e-9fd4-4375-a773-bb89b38af654") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:38.642811 master-0 kubenswrapper[3976]: I0320 08:35:38.642747 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:38.643151 master-0 kubenswrapper[3976]: E0320 08:35:38.642903 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:39.305817 master-0 kubenswrapper[3976]: I0320 08:35:39.305752 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:39.306444 master-0 kubenswrapper[3976]: E0320 08:35:39.305916 3976 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:35:39.306444 master-0 kubenswrapper[3976]: E0320 08:35:39.305984 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" failed. 
No retries permitted until 2026-03-20 08:36:11.305962636 +0000 UTC m=+142.574785923 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:35:39.642738 master-0 kubenswrapper[3976]: I0320 08:35:39.642587 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:39.643589 master-0 kubenswrapper[3976]: E0320 08:35:39.643536 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:40.642853 master-0 kubenswrapper[3976]: I0320 08:35:40.642744 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:40.644165 master-0 kubenswrapper[3976]: E0320 08:35:40.642949 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:41.642661 master-0 kubenswrapper[3976]: I0320 08:35:41.642604 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:41.642942 master-0 kubenswrapper[3976]: E0320 08:35:41.642748 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:42.641936 master-0 kubenswrapper[3976]: I0320 08:35:42.641887 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:42.642270 master-0 kubenswrapper[3976]: E0320 08:35:42.642001 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:43.642841 master-0 kubenswrapper[3976]: I0320 08:35:43.642760 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:43.643546 master-0 kubenswrapper[3976]: E0320 08:35:43.642931 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:44.642746 master-0 kubenswrapper[3976]: I0320 08:35:44.642672 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:44.643032 master-0 kubenswrapper[3976]: E0320 08:35:44.642850 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:44.787590 master-0 kubenswrapper[3976]: I0320 08:35:44.787523 3976 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nsmzc"] Mar 20 08:35:45.251424 master-0 kubenswrapper[3976]: I0320 08:35:45.251352 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-6t5vb" event={"ID":"fb0fc10f-5796-4cd5-b8f5-72d678054c24","Type":"ContainerStarted","Data":"11fb9901a3c95280d4ada671324191ca15e473462e8fdb0e196a3a680e2d44a1"} Mar 20 08:35:45.251424 master-0 kubenswrapper[3976]: I0320 08:35:45.251433 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-6t5vb" event={"ID":"fb0fc10f-5796-4cd5-b8f5-72d678054c24","Type":"ContainerStarted","Data":"2530070030abd272dac9151bcbdcdd74c4a2472ddf19cb97a57dadd8614ece94"} Mar 20 08:35:45.256392 master-0 kubenswrapper[3976]: I0320 08:35:45.256351 3976 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="0562124cc868051528c8c76baabb685e9f641cfd32418a6cbc0b305b7b8b1525" exitCode=0 Mar 20 08:35:45.256519 master-0 kubenswrapper[3976]: I0320 
08:35:45.256457 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerDied","Data":"0562124cc868051528c8c76baabb685e9f641cfd32418a6cbc0b305b7b8b1525"} Mar 20 08:35:45.260263 master-0 kubenswrapper[3976]: I0320 08:35:45.260243 3976 generic.go:334] "Generic (PLEG): container finished" podID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerID="4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53" exitCode=0 Mar 20 08:35:45.260411 master-0 kubenswrapper[3976]: I0320 08:35:45.260293 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerDied","Data":"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53"} Mar 20 08:35:45.265835 master-0 kubenswrapper[3976]: I0320 08:35:45.265776 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" event={"ID":"5e3ddf9e-eeb5-4266-b675-092fd4e27623","Type":"ContainerStarted","Data":"c4a834368b75816e5bf327a50499cbf160883d81fc9ea89519da8bf5870c95aa"} Mar 20 08:35:45.286258 master-0 kubenswrapper[3976]: I0320 08:35:45.281922 3976 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-6t5vb" podStartSLOduration=2.015297683 podStartE2EDuration="21.281881291s" podCreationTimestamp="2026-03-20 08:35:24 +0000 UTC" firstStartedPulling="2026-03-20 08:35:25.183024625 +0000 UTC m=+96.451847912" lastFinishedPulling="2026-03-20 08:35:44.449608193 +0000 UTC m=+115.718431520" observedRunningTime="2026-03-20 08:35:45.277128725 +0000 UTC m=+116.545952072" watchObservedRunningTime="2026-03-20 08:35:45.281881291 +0000 UTC m=+116.550704588" Mar 20 08:35:45.339552 master-0 kubenswrapper[3976]: I0320 08:35:45.337965 3976 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" podStartSLOduration=6.161637252 podStartE2EDuration="27.337940538s" podCreationTimestamp="2026-03-20 08:35:18 +0000 UTC" firstStartedPulling="2026-03-20 08:35:23.222386688 +0000 UTC m=+94.491209965" lastFinishedPulling="2026-03-20 08:35:44.398689944 +0000 UTC m=+115.667513251" observedRunningTime="2026-03-20 08:35:45.302432794 +0000 UTC m=+116.571256161" watchObservedRunningTime="2026-03-20 08:35:45.337940538 +0000 UTC m=+116.606763825" Mar 20 08:35:45.645862 master-0 kubenswrapper[3976]: I0320 08:35:45.645807 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:45.646699 master-0 kubenswrapper[3976]: E0320 08:35:45.646026 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:46.277177 master-0 kubenswrapper[3976]: I0320 08:35:46.276572 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerStarted","Data":"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8"} Mar 20 08:35:46.277177 master-0 kubenswrapper[3976]: I0320 08:35:46.277038 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerStarted","Data":"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4"} Mar 20 08:35:46.277177 master-0 kubenswrapper[3976]: I0320 08:35:46.277058 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerStarted","Data":"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212"} Mar 20 08:35:46.277177 master-0 kubenswrapper[3976]: I0320 08:35:46.277070 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerStarted","Data":"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4"} Mar 20 08:35:46.277177 master-0 kubenswrapper[3976]: I0320 08:35:46.277080 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerStarted","Data":"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7"} Mar 20 08:35:46.277177 master-0 kubenswrapper[3976]: I0320 08:35:46.277093 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" 
event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerStarted","Data":"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3"} Mar 20 08:35:46.281839 master-0 kubenswrapper[3976]: I0320 08:35:46.281731 3976 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="09df5a13ce7374304f28bc120919f2392b8b1eedb768ae74aa71f1f46b1260f3" exitCode=0 Mar 20 08:35:46.281839 master-0 kubenswrapper[3976]: I0320 08:35:46.281812 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerDied","Data":"09df5a13ce7374304f28bc120919f2392b8b1eedb768ae74aa71f1f46b1260f3"} Mar 20 08:35:46.642062 master-0 kubenswrapper[3976]: I0320 08:35:46.641973 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:46.642476 master-0 kubenswrapper[3976]: E0320 08:35:46.642179 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:47.291987 master-0 kubenswrapper[3976]: I0320 08:35:47.291851 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerStarted","Data":"e0bba107e6b49f693f3963a5c0c601999ff2c2d961d6645822b30cd922e252a1"} Mar 20 08:35:47.322146 master-0 kubenswrapper[3976]: I0320 08:35:47.322003 3976 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" podStartSLOduration=3.93833454 podStartE2EDuration="41.321961585s" podCreationTimestamp="2026-03-20 08:35:06 +0000 UTC" firstStartedPulling="2026-03-20 08:35:06.943460344 +0000 UTC m=+78.212283641" lastFinishedPulling="2026-03-20 08:35:44.327087399 +0000 UTC m=+115.595910686" observedRunningTime="2026-03-20 08:35:47.321816 +0000 UTC m=+118.590639287" watchObservedRunningTime="2026-03-20 08:35:47.321961585 +0000 UTC m=+118.590784912" Mar 20 08:35:47.643431 master-0 kubenswrapper[3976]: I0320 08:35:47.643347 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:47.644066 master-0 kubenswrapper[3976]: E0320 08:35:47.644023 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:48.642312 master-0 kubenswrapper[3976]: I0320 08:35:48.642246 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:48.643001 master-0 kubenswrapper[3976]: E0320 08:35:48.642485 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:49.302720 master-0 kubenswrapper[3976]: I0320 08:35:49.302642 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerStarted","Data":"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1"} Mar 20 08:35:49.547970 master-0 kubenswrapper[3976]: E0320 08:35:49.547883 3976 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 08:35:49.643279 master-0 kubenswrapper[3976]: I0320 08:35:49.643038 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:49.644634 master-0 kubenswrapper[3976]: E0320 08:35:49.644484 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:50.066953 master-0 kubenswrapper[3976]: E0320 08:35:50.066804 3976 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 08:35:50.509561 master-0 kubenswrapper[3976]: I0320 08:35:50.509028 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:35:50.509737 master-0 kubenswrapper[3976]: E0320 08:35:50.509267 3976 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 20 08:35:50.509737 master-0 kubenswrapper[3976]: E0320 08:35:50.509671 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert podName:dab97c35-fe60-4134-8715-a7c6dd085fb3 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:54.509648548 +0000 UTC m=+185.778471845 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert") pod "cluster-version-operator-56d8475767-9gfkg" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3") : secret "cluster-version-operator-serving-cert" not found Mar 20 08:35:50.644807 master-0 kubenswrapper[3976]: I0320 08:35:50.642279 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:50.644807 master-0 kubenswrapper[3976]: E0320 08:35:50.642437 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:51.318321 master-0 kubenswrapper[3976]: I0320 08:35:51.318227 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerStarted","Data":"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e"} Mar 20 08:35:51.318816 master-0 kubenswrapper[3976]: I0320 08:35:51.318546 3976 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="ovn-controller" containerID="cri-o://bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3" gracePeriod=30 Mar 20 08:35:51.318816 master-0 kubenswrapper[3976]: I0320 08:35:51.318799 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:51.318979 master-0 kubenswrapper[3976]: I0320 08:35:51.318837 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:51.318979 master-0 kubenswrapper[3976]: I0320 08:35:51.318901 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:51.319311 master-0 kubenswrapper[3976]: I0320 08:35:51.319259 3976 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="sbdb" containerID="cri-o://1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1" gracePeriod=30 Mar 20 08:35:51.319311 master-0 kubenswrapper[3976]: I0320 08:35:51.319313 3976 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="nbdb" containerID="cri-o://0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8" gracePeriod=30 Mar 20 08:35:51.319534 master-0 kubenswrapper[3976]: I0320 08:35:51.319352 3976 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="northd" containerID="cri-o://71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4" gracePeriod=30 Mar 20 08:35:51.319534 master-0 kubenswrapper[3976]: I0320 08:35:51.319394 3976 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212" gracePeriod=30 Mar 20 08:35:51.319534 master-0 kubenswrapper[3976]: I0320 08:35:51.319434 3976 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="kube-rbac-proxy-node" containerID="cri-o://b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4" gracePeriod=30 Mar 20 08:35:51.319848 master-0 kubenswrapper[3976]: I0320 08:35:51.319540 3976 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" 
containerName="ovn-acl-logging" containerID="cri-o://2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7" gracePeriod=30 Mar 20 08:35:51.326272 master-0 kubenswrapper[3976]: E0320 08:35:51.326214 3976 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 20 08:35:51.329420 master-0 kubenswrapper[3976]: E0320 08:35:51.329369 3976 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 20 08:35:51.337162 master-0 kubenswrapper[3976]: E0320 08:35:51.336980 3976 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 20 08:35:51.340268 master-0 kubenswrapper[3976]: E0320 08:35:51.340121 3976 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 20 08:35:51.342573 master-0 kubenswrapper[3976]: E0320 08:35:51.342488 3976 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 20 08:35:51.342704 master-0 kubenswrapper[3976]: E0320 08:35:51.342604 3976 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="nbdb" Mar 20 08:35:51.358894 master-0 kubenswrapper[3976]: E0320 08:35:51.358789 3976 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 20 08:35:51.359087 master-0 kubenswrapper[3976]: E0320 08:35:51.358926 3976 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="sbdb" Mar 20 08:35:51.372330 master-0 kubenswrapper[3976]: I0320 08:35:51.372142 3976 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="ovnkube-controller" containerID="cri-o://016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e" gracePeriod=30 Mar 20 08:35:51.643505 master-0 kubenswrapper[3976]: I0320 08:35:51.643430 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:51.643785 master-0 kubenswrapper[3976]: E0320 08:35:51.643668 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:52.126816 master-0 kubenswrapper[3976]: I0320 08:35:52.126742 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsmzc_d3dc1be4-f742-47cc-95b6-82e0bc34a716/ovnkube-controller/0.log" Mar 20 08:35:52.129561 master-0 kubenswrapper[3976]: I0320 08:35:52.129503 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsmzc_d3dc1be4-f742-47cc-95b6-82e0bc34a716/kube-rbac-proxy-ovn-metrics/0.log" Mar 20 08:35:52.130075 master-0 kubenswrapper[3976]: I0320 08:35:52.130027 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsmzc_d3dc1be4-f742-47cc-95b6-82e0bc34a716/kube-rbac-proxy-node/0.log" Mar 20 08:35:52.130928 master-0 kubenswrapper[3976]: I0320 08:35:52.130882 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsmzc_d3dc1be4-f742-47cc-95b6-82e0bc34a716/ovn-acl-logging/0.log" Mar 20 08:35:52.131740 master-0 kubenswrapper[3976]: I0320 08:35:52.131689 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsmzc_d3dc1be4-f742-47cc-95b6-82e0bc34a716/ovn-controller/0.log" Mar 20 08:35:52.132342 master-0 kubenswrapper[3976]: I0320 08:35:52.132300 3976 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" Mar 20 08:35:52.204385 master-0 kubenswrapper[3976]: I0320 08:35:52.204276 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rxdwp"] Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: E0320 08:35:52.204431 3976 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="northd" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: I0320 08:35:52.204454 3976 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="northd" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: E0320 08:35:52.204470 3976 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="ovn-controller" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: I0320 08:35:52.204483 3976 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="ovn-controller" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: E0320 08:35:52.204521 3976 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="sbdb" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: I0320 08:35:52.204535 3976 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="sbdb" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: E0320 08:35:52.204551 3976 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="kube-rbac-proxy-node" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: I0320 08:35:52.204564 3976 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="kube-rbac-proxy-node" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: E0320 08:35:52.204578 3976 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="nbdb" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: I0320 08:35:52.204591 3976 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="nbdb" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: E0320 08:35:52.204605 3976 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="kubecfg-setup" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: I0320 08:35:52.204619 3976 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="kubecfg-setup" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: E0320 08:35:52.204634 3976 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="ovn-acl-logging" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: I0320 08:35:52.204647 3976 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="ovn-acl-logging" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: E0320 08:35:52.204663 3976 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 08:35:52.204667 master-0 kubenswrapper[3976]: I0320 08:35:52.204678 3976 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 08:35:52.205050 master-0 kubenswrapper[3976]: E0320 08:35:52.204695 3976 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="ovnkube-controller" Mar 20 08:35:52.205050 master-0 kubenswrapper[3976]: I0320 08:35:52.204709 3976 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" 
containerName="ovnkube-controller" Mar 20 08:35:52.205050 master-0 kubenswrapper[3976]: I0320 08:35:52.204772 3976 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="kube-rbac-proxy-node" Mar 20 08:35:52.205050 master-0 kubenswrapper[3976]: I0320 08:35:52.204786 3976 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="ovn-controller" Mar 20 08:35:52.205050 master-0 kubenswrapper[3976]: I0320 08:35:52.204801 3976 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="northd" Mar 20 08:35:52.205050 master-0 kubenswrapper[3976]: I0320 08:35:52.204816 3976 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="sbdb" Mar 20 08:35:52.205050 master-0 kubenswrapper[3976]: I0320 08:35:52.204829 3976 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 08:35:52.205050 master-0 kubenswrapper[3976]: I0320 08:35:52.204843 3976 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="ovnkube-controller" Mar 20 08:35:52.205050 master-0 kubenswrapper[3976]: I0320 08:35:52.204856 3976 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="ovn-acl-logging" Mar 20 08:35:52.205050 master-0 kubenswrapper[3976]: I0320 08:35:52.204871 3976 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerName="nbdb" Mar 20 08:35:52.205987 master-0 kubenswrapper[3976]: I0320 08:35:52.205939 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.323282 master-0 kubenswrapper[3976]: I0320 08:35:52.323224 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-node-log\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.323282 master-0 kubenswrapper[3976]: I0320 08:35:52.323262 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-slash\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.323282 master-0 kubenswrapper[3976]: I0320 08:35:52.323283 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-openvswitch\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.323282 master-0 kubenswrapper[3976]: I0320 08:35:52.323301 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.323689 master-0 kubenswrapper[3976]: I0320 08:35:52.323331 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-env-overrides\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.323689 master-0 kubenswrapper[3976]: I0320 
08:35:52.323351 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr5s8\" (UniqueName: \"kubernetes.io/projected/d3dc1be4-f742-47cc-95b6-82e0bc34a716-kube-api-access-xr5s8\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.323689 master-0 kubenswrapper[3976]: I0320 08:35:52.323369 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-run-netns\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.323689 master-0 kubenswrapper[3976]: I0320 08:35:52.323392 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-var-lib-openvswitch\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.323689 master-0 kubenswrapper[3976]: I0320 08:35:52.323392 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-node-log" (OuterVolumeSpecName: "node-log") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:35:52.323689 master-0 kubenswrapper[3976]: I0320 08:35:52.323413 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovn-node-metrics-cert\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.323689 master-0 kubenswrapper[3976]: I0320 08:35:52.323428 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-systemd\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.323689 master-0 kubenswrapper[3976]: I0320 08:35:52.323442 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:35:52.323689 master-0 kubenswrapper[3976]: I0320 08:35:52.323513 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:35:52.323689 master-0 kubenswrapper[3976]: I0320 08:35:52.323628 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:35:52.323689 master-0 kubenswrapper[3976]: I0320 08:35:52.323584 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:35:52.323973 master-0 kubenswrapper[3976]: I0320 08:35:52.323785 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsmzc_d3dc1be4-f742-47cc-95b6-82e0bc34a716/ovnkube-controller/0.log" Mar 20 08:35:52.324008 master-0 kubenswrapper[3976]: I0320 08:35:52.323989 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-ovn\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.324039 master-0 kubenswrapper[3976]: I0320 08:35:52.324021 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:35:52.324072 master-0 kubenswrapper[3976]: I0320 08:35:52.324055 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovnkube-config\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.324100 master-0 kubenswrapper[3976]: I0320 08:35:52.324088 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-log-socket\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.324133 master-0 kubenswrapper[3976]: I0320 08:35:52.324072 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:35:52.324133 master-0 kubenswrapper[3976]: I0320 08:35:52.324120 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovnkube-script-lib\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.324199 master-0 kubenswrapper[3976]: I0320 08:35:52.324151 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-cni-netd\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.324237 master-0 kubenswrapper[3976]: I0320 08:35:52.324208 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-etc-openvswitch\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.324269 master-0 kubenswrapper[3976]: I0320 08:35:52.324242 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-systemd-units\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.324299 master-0 kubenswrapper[3976]: I0320 08:35:52.324272 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-run-ovn-kubernetes\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.324327 master-0 kubenswrapper[3976]: I0320 08:35:52.324305 3976 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-cni-bin\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.324356 master-0 kubenswrapper[3976]: I0320 08:35:52.324341 3976 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-kubelet\") pod \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\" (UID: \"d3dc1be4-f742-47cc-95b6-82e0bc34a716\") " Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324447 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-script-lib\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324517 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-bin\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324553 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-slash\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324449 3976 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324576 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324489 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-log-socket" (OuterVolumeSpecName: "log-socket") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324537 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324556 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324512 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324576 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324674 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324769 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-node-log\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324813 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324824 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-ovn\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324869 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtc2p\" (UniqueName: \"kubernetes.io/projected/248a3d2f-3be4-46bf-959c-79d28736c0d6-kube-api-access-mtc2p\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.325141 master-0 kubenswrapper[3976]: I0320 08:35:52.324925 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-systemd-units\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.325661 master-0 kubenswrapper[3976]: I0320 08:35:52.324953 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-netd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.325661 master-0 kubenswrapper[3976]: I0320 08:35:52.324980 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.325661 master-0 kubenswrapper[3976]: I0320 08:35:52.325053 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-env-overrides\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.325661 master-0 kubenswrapper[3976]: I0320 08:35:52.325124 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-slash" (OuterVolumeSpecName: "host-slash") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:35:52.325661 master-0 kubenswrapper[3976]: I0320 08:35:52.325153 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-systemd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.325661 master-0 kubenswrapper[3976]: I0320 08:35:52.325214 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovn-node-metrics-cert\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.325661 master-0 kubenswrapper[3976]: I0320 08:35:52.325256 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-etc-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.325661 master-0 kubenswrapper[3976]: I0320 08:35:52.325365 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:35:52.325661 master-0 kubenswrapper[3976]: I0320 08:35:52.325410 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.325661 master-0 kubenswrapper[3976]: I0320 08:35:52.325529 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-config\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.325661 master-0 kubenswrapper[3976]: I0320 08:35:52.325585 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-netns\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.325661 master-0 kubenswrapper[3976]: I0320 08:35:52.325634 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-log-socket\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.325661 master-0 kubenswrapper[3976]: I0320 08:35:52.325680 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-kubelet\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.325719 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-var-lib-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.325820 3976 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovnkube-config\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.325844 3976 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-log-socket\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.325865 3976 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.325887 3976 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-cni-netd\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.325908 3976 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-etc-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.325928 3976 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-cni-bin\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.325947 3976 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-systemd-units\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.325965 3976 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.325986 3976 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-kubelet\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.326005 3976 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-slash\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.326023 3976 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-node-log\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.326041 3976 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.326063 3976 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.326083 3976 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d3dc1be4-f742-47cc-95b6-82e0bc34a716-env-overrides\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326077 master-0 kubenswrapper[3976]: I0320 08:35:52.326105 3976 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-host-run-netns\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326602 master-0 kubenswrapper[3976]: I0320 08:35:52.326127 3976 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326602 master-0 kubenswrapper[3976]: I0320 08:35:52.326149 3976 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-ovn\") on node \"master-0\" DevicePath \"\""
Mar 20 08:35:52.326602 master-0 kubenswrapper[3976]: I0320 08:35:52.326568 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsmzc_d3dc1be4-f742-47cc-95b6-82e0bc34a716/kube-rbac-proxy-ovn-metrics/0.log"
Mar 20 08:35:52.327216 master-0 kubenswrapper[3976]: I0320 08:35:52.327163 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsmzc_d3dc1be4-f742-47cc-95b6-82e0bc34a716/kube-rbac-proxy-node/0.log"
Mar 20 08:35:52.327806 master-0 kubenswrapper[3976]: I0320 08:35:52.327765 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsmzc_d3dc1be4-f742-47cc-95b6-82e0bc34a716/ovn-acl-logging/0.log"
Mar 20 08:35:52.328545 master-0 kubenswrapper[3976]: I0320 08:35:52.328347 3976 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nsmzc_d3dc1be4-f742-47cc-95b6-82e0bc34a716/ovn-controller/0.log"
Mar 20 08:35:52.329016 master-0 kubenswrapper[3976]: I0320 08:35:52.328976 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:35:52.329554 master-0 kubenswrapper[3976]: I0320 08:35:52.329504 3976 generic.go:334] "Generic (PLEG): container finished" podID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerID="016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e" exitCode=1
Mar 20 08:35:52.329610 master-0 kubenswrapper[3976]: I0320 08:35:52.329558 3976 generic.go:334] "Generic (PLEG): container finished" podID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerID="1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1" exitCode=0
Mar 20 08:35:52.329610 master-0 kubenswrapper[3976]: I0320 08:35:52.329603 3976 generic.go:334] "Generic (PLEG): container finished" podID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerID="0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8" exitCode=0
Mar 20 08:35:52.329716 master-0 kubenswrapper[3976]: I0320 08:35:52.329625 3976 generic.go:334] "Generic (PLEG): container finished" podID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerID="71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4" exitCode=0
Mar 20 08:35:52.329716 master-0 kubenswrapper[3976]: I0320 08:35:52.329657 3976 generic.go:334] "Generic (PLEG): container finished" podID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerID="452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212" exitCode=143
Mar 20 08:35:52.329716 master-0 kubenswrapper[3976]: I0320 08:35:52.329697 3976 generic.go:334] "Generic (PLEG): container finished" podID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerID="b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4" exitCode=143
Mar 20 08:35:52.329716 master-0 kubenswrapper[3976]: I0320 08:35:52.329713 3976 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc"
Mar 20 08:35:52.329869 master-0 kubenswrapper[3976]: I0320 08:35:52.329724 3976 generic.go:334] "Generic (PLEG): container finished" podID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerID="2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7" exitCode=143
Mar 20 08:35:52.329869 master-0 kubenswrapper[3976]: I0320 08:35:52.329755 3976 generic.go:334] "Generic (PLEG): container finished" podID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" containerID="bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3" exitCode=143
Mar 20 08:35:52.329869 master-0 kubenswrapper[3976]: I0320 08:35:52.329608 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerDied","Data":"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e"}
Mar 20 08:35:52.329869 master-0 kubenswrapper[3976]: I0320 08:35:52.329837 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerDied","Data":"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1"}
Mar 20 08:35:52.330008 master-0 kubenswrapper[3976]: I0320 08:35:52.329876 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerDied","Data":"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8"}
Mar 20 08:35:52.330008 master-0 kubenswrapper[3976]: I0320 08:35:52.329910 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerDied","Data":"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4"}
Mar 20 08:35:52.330008 master-0 kubenswrapper[3976]: I0320 08:35:52.329939 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerDied","Data":"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212"}
Mar 20 08:35:52.330008 master-0 kubenswrapper[3976]: I0320 08:35:52.329965 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerDied","Data":"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4"}
Mar 20 08:35:52.330205 master-0 kubenswrapper[3976]: I0320 08:35:52.329997 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7"}
Mar 20 08:35:52.330273 master-0 kubenswrapper[3976]: I0320 08:35:52.330228 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3"}
Mar 20 08:35:52.330273 master-0 kubenswrapper[3976]: I0320 08:35:52.330249 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53"}
Mar 20 08:35:52.330273 master-0 kubenswrapper[3976]: I0320 08:35:52.330267 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerDied","Data":"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7"}
Mar 20 08:35:52.330362 master-0 kubenswrapper[3976]: I0320 08:35:52.330289 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e"}
Mar 20 08:35:52.330362 master-0 kubenswrapper[3976]: I0320 08:35:52.330304 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1"}
Mar 20 08:35:52.330362 master-0 kubenswrapper[3976]: I0320 08:35:52.330318 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8"}
Mar 20 08:35:52.330362 master-0 kubenswrapper[3976]: I0320 08:35:52.330329 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4"}
Mar 20 08:35:52.330362 master-0 kubenswrapper[3976]: I0320 08:35:52.330340 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212"}
Mar 20 08:35:52.330362 master-0 kubenswrapper[3976]: I0320 08:35:52.330352 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4"}
Mar 20 08:35:52.330362 master-0 kubenswrapper[3976]: I0320 08:35:52.330363 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7"}
Mar 20 08:35:52.330536 master-0 kubenswrapper[3976]: I0320 08:35:52.330376 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3"}
Mar 20 08:35:52.330536 master-0 kubenswrapper[3976]: I0320 08:35:52.330388 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53"}
Mar 20 08:35:52.330536 master-0 kubenswrapper[3976]: I0320 08:35:52.330404 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerDied","Data":"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3"}
Mar 20 08:35:52.330536 master-0 kubenswrapper[3976]: I0320 08:35:52.330419 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e"}
Mar 20 08:35:52.330536 master-0 kubenswrapper[3976]: I0320 08:35:52.330432 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1"}
Mar 20 08:35:52.330536 master-0 kubenswrapper[3976]: I0320 08:35:52.330444 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8"}
Mar 20 08:35:52.330536 master-0 kubenswrapper[3976]: I0320 08:35:52.330449 3976 scope.go:117] "RemoveContainer" containerID="016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e"
Mar 20 08:35:52.330743 master-0 kubenswrapper[3976]: I0320 08:35:52.330455 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4"}
Mar 20 08:35:52.330743 master-0 kubenswrapper[3976]: I0320 08:35:52.330592 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212"}
Mar 20 08:35:52.330743 master-0 kubenswrapper[3976]: I0320 08:35:52.330618 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4"}
Mar 20 08:35:52.330743 master-0 kubenswrapper[3976]: I0320 08:35:52.330627 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7"}
Mar 20 08:35:52.330743 master-0 kubenswrapper[3976]: I0320 08:35:52.330636 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3"}
Mar 20 08:35:52.330743 master-0 kubenswrapper[3976]: I0320 08:35:52.330645 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53"}
Mar 20 08:35:52.330743 master-0 kubenswrapper[3976]: I0320 08:35:52.330636 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3dc1be4-f742-47cc-95b6-82e0bc34a716-kube-api-access-xr5s8" (OuterVolumeSpecName: "kube-api-access-xr5s8") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "kube-api-access-xr5s8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:35:52.330743 master-0 kubenswrapper[3976]: I0320 08:35:52.330668 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nsmzc" event={"ID":"d3dc1be4-f742-47cc-95b6-82e0bc34a716","Type":"ContainerDied","Data":"156eb04894ecec25bfbcd06dc0b87578738f576b411e8c7c85f2a4cc0e48d6f0"}
Mar 20 08:35:52.330743 master-0 kubenswrapper[3976]: I0320 08:35:52.330739 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e"}
Mar 20 08:35:52.331006 master-0 kubenswrapper[3976]: I0320 08:35:52.330750 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1"}
Mar 20 08:35:52.331006 master-0 kubenswrapper[3976]: I0320 08:35:52.330757 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8"}
Mar 20 08:35:52.331006 master-0 kubenswrapper[3976]: I0320 08:35:52.330766 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4"}
Mar 20 08:35:52.331006 master-0 kubenswrapper[3976]: I0320 08:35:52.330773 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212"}
Mar 20 08:35:52.331006 master-0 kubenswrapper[3976]: I0320 08:35:52.330779 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4"}
Mar 20 08:35:52.331006 master-0 kubenswrapper[3976]: I0320 08:35:52.330784 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7"}
Mar 20 08:35:52.331006 master-0 kubenswrapper[3976]: I0320 08:35:52.330791 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3"}
Mar 20 08:35:52.331006 master-0 kubenswrapper[3976]: I0320 08:35:52.330796 3976 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53"}
Mar 20 08:35:52.332168 master-0 kubenswrapper[3976]: I0320 08:35:52.332143 3976 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d3dc1be4-f742-47cc-95b6-82e0bc34a716" (UID: "d3dc1be4-f742-47cc-95b6-82e0bc34a716"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:35:52.348755 master-0 kubenswrapper[3976]: I0320 08:35:52.348714 3976 scope.go:117] "RemoveContainer" containerID="1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1"
Mar 20 08:35:52.361299 master-0 kubenswrapper[3976]: I0320 08:35:52.361276 3976 scope.go:117] "RemoveContainer" containerID="0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8"
Mar 20 08:35:52.371726 master-0 kubenswrapper[3976]: I0320 08:35:52.371705 3976 scope.go:117] "RemoveContainer" containerID="71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4"
Mar 20 08:35:52.382141 master-0 kubenswrapper[3976]: I0320 08:35:52.382098 3976 scope.go:117] "RemoveContainer" containerID="452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212"
Mar 20 08:35:52.395040 master-0 kubenswrapper[3976]: I0320 08:35:52.394987 3976 scope.go:117] "RemoveContainer" containerID="b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4"
Mar 20 08:35:52.406468 master-0 kubenswrapper[3976]: I0320 08:35:52.406426 3976 scope.go:117] "RemoveContainer" containerID="2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7"
Mar 20 08:35:52.418569 master-0 kubenswrapper[3976]: I0320 08:35:52.418086 3976 scope.go:117] "RemoveContainer" containerID="bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3"
Mar 20 08:35:52.426906 master-0 kubenswrapper[3976]: I0320 08:35:52.426860 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-etc-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427025 master-0 kubenswrapper[3976]: I0320 08:35:52.427002 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-etc-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427025 master-0 kubenswrapper[3976]: I0320 08:35:52.427072 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427219 master-0 kubenswrapper[3976]: I0320 08:35:52.427106 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-config\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427219 master-0 kubenswrapper[3976]: I0320 08:35:52.427133 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-netns\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427219 master-0 kubenswrapper[3976]: I0320 08:35:52.427159 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-log-socket\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427219 master-0 kubenswrapper[3976]: I0320 08:35:52.427200 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-kubelet\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427400 master-0 kubenswrapper[3976]: I0320 08:35:52.427223 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-var-lib-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427400 master-0 kubenswrapper[3976]: I0320 08:35:52.427329 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-log-socket\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427400 master-0 kubenswrapper[3976]: I0320 08:35:52.427390 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427525 master-0 kubenswrapper[3976]: I0320 08:35:52.427454 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-var-lib-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427525 master-0 kubenswrapper[3976]: I0320 08:35:52.427516 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-kubelet\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427583 master-0 kubenswrapper[3976]: I0320 08:35:52.427556 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-netns\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427701 master-0 kubenswrapper[3976]: I0320 08:35:52.427673 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-script-lib\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427779 master-0 kubenswrapper[3976]: I0320 08:35:52.427749 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-bin\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427818 master-0 kubenswrapper[3976]: I0320 08:35:52.427779 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-slash\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427864 master-0 kubenswrapper[3976]: I0320 08:35:52.427821 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427864 master-0 kubenswrapper[3976]: I0320 08:35:52.427853 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-node-log\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.427946 master-0 kubenswrapper[3976]: I0320 08:35:52.427916 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-node-log\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.428023 master-0 kubenswrapper[3976]: I0320 08:35:52.427969 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-bin\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.428023 master-0 kubenswrapper[3976]: I0320 08:35:52.427980 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.428118 master-0 kubenswrapper[3976]: I0320 08:35:52.428083 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-ovn\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.428158 master-0 kubenswrapper[3976]: I0320 08:35:52.428132 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-ovn\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.428232 master-0 kubenswrapper[3976]: I0320 08:35:52.428170 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtc2p\" (UniqueName: \"kubernetes.io/projected/248a3d2f-3be4-46bf-959c-79d28736c0d6-kube-api-access-mtc2p\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.428293 master-0 kubenswrapper[3976]: I0320 08:35:52.428272 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-netd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.428346 master-0 kubenswrapper[3976]: I0320 08:35:52.428311 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.428380 master-0 kubenswrapper[3976]: I0320 08:35:52.428359 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-env-overrides\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.428429 master-0 kubenswrapper[3976]: I0320 08:35:52.428377 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-netd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.428429 master-0 kubenswrapper[3976]: I0320 08:35:52.428405 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-systemd-units\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.428525 master-0 kubenswrapper[3976]: I0320 08:35:52.428497 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-systemd-units\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.428576 master-0 kubenswrapper[3976]: I0320 08:35:52.428511 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:35:52.428659 master-0 kubenswrapper[3976]: I0320 08:35:52.428609 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-systemd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.428699 master-0 kubenswrapper[3976]: I0320 08:35:52.428658 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-systemd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.428746 master-0 kubenswrapper[3976]: I0320 08:35:52.428690 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovn-node-metrics-cert\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.428901 master-0 kubenswrapper[3976]: I0320 08:35:52.428863 3976 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d3dc1be4-f742-47cc-95b6-82e0bc34a716-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:35:52.428941 master-0 kubenswrapper[3976]: I0320 08:35:52.428919 3976 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d3dc1be4-f742-47cc-95b6-82e0bc34a716-run-systemd\") on node \"master-0\" DevicePath \"\"" Mar 20 08:35:52.428991 master-0 kubenswrapper[3976]: I0320 08:35:52.428945 3976 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr5s8\" (UniqueName: \"kubernetes.io/projected/d3dc1be4-f742-47cc-95b6-82e0bc34a716-kube-api-access-xr5s8\") on node \"master-0\" DevicePath \"\"" Mar 20 08:35:52.429310 master-0 kubenswrapper[3976]: I0320 08:35:52.427992 3976 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-slash\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.429529 master-0 kubenswrapper[3976]: I0320 08:35:52.429475 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-env-overrides\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.429579 master-0 kubenswrapper[3976]: I0320 08:35:52.429525 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-script-lib\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.430058 master-0 kubenswrapper[3976]: I0320 08:35:52.430022 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-config\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.432178 master-0 kubenswrapper[3976]: I0320 08:35:52.432127 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovn-node-metrics-cert\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.438566 master-0 kubenswrapper[3976]: I0320 08:35:52.438514 3976 scope.go:117] "RemoveContainer" 
containerID="4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53" Mar 20 08:35:52.449039 master-0 kubenswrapper[3976]: I0320 08:35:52.448998 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtc2p\" (UniqueName: \"kubernetes.io/projected/248a3d2f-3be4-46bf-959c-79d28736c0d6-kube-api-access-mtc2p\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.451199 master-0 kubenswrapper[3976]: I0320 08:35:52.451165 3976 scope.go:117] "RemoveContainer" containerID="016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e" Mar 20 08:35:52.451757 master-0 kubenswrapper[3976]: E0320 08:35:52.451725 3976 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e\": container with ID starting with 016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e not found: ID does not exist" containerID="016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e" Mar 20 08:35:52.451796 master-0 kubenswrapper[3976]: I0320 08:35:52.451761 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e"} err="failed to get container status \"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e\": rpc error: code = NotFound desc = could not find container \"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e\": container with ID starting with 016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e not found: ID does not exist" Mar 20 08:35:52.451796 master-0 kubenswrapper[3976]: I0320 08:35:52.451792 3976 scope.go:117] "RemoveContainer" containerID="1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1" Mar 20 08:35:52.452121 master-0 kubenswrapper[3976]: 
E0320 08:35:52.452089 3976 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1\": container with ID starting with 1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1 not found: ID does not exist" containerID="1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1" Mar 20 08:35:52.452156 master-0 kubenswrapper[3976]: I0320 08:35:52.452122 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1"} err="failed to get container status \"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1\": rpc error: code = NotFound desc = could not find container \"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1\": container with ID starting with 1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1 not found: ID does not exist" Mar 20 08:35:52.452199 master-0 kubenswrapper[3976]: I0320 08:35:52.452154 3976 scope.go:117] "RemoveContainer" containerID="0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8" Mar 20 08:35:52.452722 master-0 kubenswrapper[3976]: E0320 08:35:52.452674 3976 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8\": container with ID starting with 0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8 not found: ID does not exist" containerID="0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8" Mar 20 08:35:52.452766 master-0 kubenswrapper[3976]: I0320 08:35:52.452718 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8"} err="failed to get container status 
\"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8\": rpc error: code = NotFound desc = could not find container \"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8\": container with ID starting with 0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8 not found: ID does not exist" Mar 20 08:35:52.452766 master-0 kubenswrapper[3976]: I0320 08:35:52.452733 3976 scope.go:117] "RemoveContainer" containerID="71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4" Mar 20 08:35:52.453019 master-0 kubenswrapper[3976]: E0320 08:35:52.452992 3976 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4\": container with ID starting with 71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4 not found: ID does not exist" containerID="71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4" Mar 20 08:35:52.453054 master-0 kubenswrapper[3976]: I0320 08:35:52.453019 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4"} err="failed to get container status \"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4\": rpc error: code = NotFound desc = could not find container \"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4\": container with ID starting with 71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4 not found: ID does not exist" Mar 20 08:35:52.453054 master-0 kubenswrapper[3976]: I0320 08:35:52.453037 3976 scope.go:117] "RemoveContainer" containerID="452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212" Mar 20 08:35:52.453717 master-0 kubenswrapper[3976]: E0320 08:35:52.453686 3976 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212\": container with ID starting with 452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212 not found: ID does not exist" containerID="452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212" Mar 20 08:35:52.453773 master-0 kubenswrapper[3976]: I0320 08:35:52.453713 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212"} err="failed to get container status \"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212\": rpc error: code = NotFound desc = could not find container \"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212\": container with ID starting with 452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212 not found: ID does not exist" Mar 20 08:35:52.453773 master-0 kubenswrapper[3976]: I0320 08:35:52.453727 3976 scope.go:117] "RemoveContainer" containerID="b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4" Mar 20 08:35:52.454169 master-0 kubenswrapper[3976]: E0320 08:35:52.454065 3976 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4\": container with ID starting with b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4 not found: ID does not exist" containerID="b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4" Mar 20 08:35:52.454290 master-0 kubenswrapper[3976]: I0320 08:35:52.454183 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4"} err="failed to get container status \"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4\": rpc error: code = NotFound desc = could not find container 
\"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4\": container with ID starting with b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4 not found: ID does not exist" Mar 20 08:35:52.454336 master-0 kubenswrapper[3976]: I0320 08:35:52.454300 3976 scope.go:117] "RemoveContainer" containerID="2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7" Mar 20 08:35:52.454787 master-0 kubenswrapper[3976]: E0320 08:35:52.454759 3976 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7\": container with ID starting with 2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7 not found: ID does not exist" containerID="2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7" Mar 20 08:35:52.454827 master-0 kubenswrapper[3976]: I0320 08:35:52.454787 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7"} err="failed to get container status \"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7\": rpc error: code = NotFound desc = could not find container \"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7\": container with ID starting with 2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7 not found: ID does not exist" Mar 20 08:35:52.454827 master-0 kubenswrapper[3976]: I0320 08:35:52.454803 3976 scope.go:117] "RemoveContainer" containerID="bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3" Mar 20 08:35:52.455165 master-0 kubenswrapper[3976]: E0320 08:35:52.455141 3976 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3\": container with ID starting with 
bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3 not found: ID does not exist" containerID="bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3" Mar 20 08:35:52.455217 master-0 kubenswrapper[3976]: I0320 08:35:52.455165 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3"} err="failed to get container status \"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3\": rpc error: code = NotFound desc = could not find container \"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3\": container with ID starting with bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3 not found: ID does not exist" Mar 20 08:35:52.455217 master-0 kubenswrapper[3976]: I0320 08:35:52.455203 3976 scope.go:117] "RemoveContainer" containerID="4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53" Mar 20 08:35:52.455892 master-0 kubenswrapper[3976]: E0320 08:35:52.455836 3976 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53\": container with ID starting with 4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53 not found: ID does not exist" containerID="4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53" Mar 20 08:35:52.455957 master-0 kubenswrapper[3976]: I0320 08:35:52.455905 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53"} err="failed to get container status \"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53\": rpc error: code = NotFound desc = could not find container \"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53\": container with ID starting with 
4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53 not found: ID does not exist" Mar 20 08:35:52.455993 master-0 kubenswrapper[3976]: I0320 08:35:52.455959 3976 scope.go:117] "RemoveContainer" containerID="016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e" Mar 20 08:35:52.456490 master-0 kubenswrapper[3976]: I0320 08:35:52.456423 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e"} err="failed to get container status \"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e\": rpc error: code = NotFound desc = could not find container \"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e\": container with ID starting with 016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e not found: ID does not exist" Mar 20 08:35:52.456539 master-0 kubenswrapper[3976]: I0320 08:35:52.456486 3976 scope.go:117] "RemoveContainer" containerID="1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1" Mar 20 08:35:52.456838 master-0 kubenswrapper[3976]: I0320 08:35:52.456807 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1"} err="failed to get container status \"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1\": rpc error: code = NotFound desc = could not find container \"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1\": container with ID starting with 1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1 not found: ID does not exist" Mar 20 08:35:52.456870 master-0 kubenswrapper[3976]: I0320 08:35:52.456835 3976 scope.go:117] "RemoveContainer" containerID="0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8" Mar 20 08:35:52.457250 master-0 kubenswrapper[3976]: I0320 08:35:52.457227 3976 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8"} err="failed to get container status \"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8\": rpc error: code = NotFound desc = could not find container \"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8\": container with ID starting with 0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8 not found: ID does not exist" Mar 20 08:35:52.457250 master-0 kubenswrapper[3976]: I0320 08:35:52.457247 3976 scope.go:117] "RemoveContainer" containerID="71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4" Mar 20 08:35:52.457675 master-0 kubenswrapper[3976]: I0320 08:35:52.457596 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4"} err="failed to get container status \"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4\": rpc error: code = NotFound desc = could not find container \"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4\": container with ID starting with 71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4 not found: ID does not exist" Mar 20 08:35:52.457675 master-0 kubenswrapper[3976]: I0320 08:35:52.457663 3976 scope.go:117] "RemoveContainer" containerID="452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212" Mar 20 08:35:52.458017 master-0 kubenswrapper[3976]: I0320 08:35:52.457981 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212"} err="failed to get container status \"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212\": rpc error: code = NotFound desc = could not find container \"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212\": container with ID starting with 
452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212 not found: ID does not exist" Mar 20 08:35:52.458017 master-0 kubenswrapper[3976]: I0320 08:35:52.458006 3976 scope.go:117] "RemoveContainer" containerID="b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4" Mar 20 08:35:52.458487 master-0 kubenswrapper[3976]: I0320 08:35:52.458460 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4"} err="failed to get container status \"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4\": rpc error: code = NotFound desc = could not find container \"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4\": container with ID starting with b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4 not found: ID does not exist" Mar 20 08:35:52.458487 master-0 kubenswrapper[3976]: I0320 08:35:52.458482 3976 scope.go:117] "RemoveContainer" containerID="2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7" Mar 20 08:35:52.458952 master-0 kubenswrapper[3976]: I0320 08:35:52.458917 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7"} err="failed to get container status \"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7\": rpc error: code = NotFound desc = could not find container \"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7\": container with ID starting with 2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7 not found: ID does not exist" Mar 20 08:35:52.458952 master-0 kubenswrapper[3976]: I0320 08:35:52.458939 3976 scope.go:117] "RemoveContainer" containerID="bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3" Mar 20 08:35:52.459369 master-0 kubenswrapper[3976]: I0320 08:35:52.459334 3976 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3"} err="failed to get container status \"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3\": rpc error: code = NotFound desc = could not find container \"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3\": container with ID starting with bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3 not found: ID does not exist" Mar 20 08:35:52.459369 master-0 kubenswrapper[3976]: I0320 08:35:52.459361 3976 scope.go:117] "RemoveContainer" containerID="4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53" Mar 20 08:35:52.459795 master-0 kubenswrapper[3976]: I0320 08:35:52.459725 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53"} err="failed to get container status \"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53\": rpc error: code = NotFound desc = could not find container \"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53\": container with ID starting with 4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53 not found: ID does not exist" Mar 20 08:35:52.459795 master-0 kubenswrapper[3976]: I0320 08:35:52.459787 3976 scope.go:117] "RemoveContainer" containerID="016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e" Mar 20 08:35:52.460239 master-0 kubenswrapper[3976]: I0320 08:35:52.460209 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e"} err="failed to get container status \"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e\": rpc error: code = NotFound desc = could not find container \"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e\": container with ID starting with 
016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e not found: ID does not exist" Mar 20 08:35:52.460239 master-0 kubenswrapper[3976]: I0320 08:35:52.460233 3976 scope.go:117] "RemoveContainer" containerID="1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1" Mar 20 08:35:52.460550 master-0 kubenswrapper[3976]: I0320 08:35:52.460503 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1"} err="failed to get container status \"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1\": rpc error: code = NotFound desc = could not find container \"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1\": container with ID starting with 1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1 not found: ID does not exist" Mar 20 08:35:52.460550 master-0 kubenswrapper[3976]: I0320 08:35:52.460525 3976 scope.go:117] "RemoveContainer" containerID="0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8" Mar 20 08:35:52.460804 master-0 kubenswrapper[3976]: I0320 08:35:52.460773 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8"} err="failed to get container status \"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8\": rpc error: code = NotFound desc = could not find container \"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8\": container with ID starting with 0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8 not found: ID does not exist" Mar 20 08:35:52.460804 master-0 kubenswrapper[3976]: I0320 08:35:52.460793 3976 scope.go:117] "RemoveContainer" containerID="71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4" Mar 20 08:35:52.461226 master-0 kubenswrapper[3976]: I0320 08:35:52.461164 3976 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4"} err="failed to get container status \"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4\": rpc error: code = NotFound desc = could not find container \"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4\": container with ID starting with 71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4 not found: ID does not exist" Mar 20 08:35:52.461226 master-0 kubenswrapper[3976]: I0320 08:35:52.461197 3976 scope.go:117] "RemoveContainer" containerID="452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212" Mar 20 08:35:52.461500 master-0 kubenswrapper[3976]: I0320 08:35:52.461467 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212"} err="failed to get container status \"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212\": rpc error: code = NotFound desc = could not find container \"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212\": container with ID starting with 452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212 not found: ID does not exist" Mar 20 08:35:52.461500 master-0 kubenswrapper[3976]: I0320 08:35:52.461489 3976 scope.go:117] "RemoveContainer" containerID="b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4" Mar 20 08:35:52.461878 master-0 kubenswrapper[3976]: I0320 08:35:52.461841 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4"} err="failed to get container status \"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4\": rpc error: code = NotFound desc = could not find container \"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4\": container with ID starting with 
b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4 not found: ID does not exist" Mar 20 08:35:52.461878 master-0 kubenswrapper[3976]: I0320 08:35:52.461861 3976 scope.go:117] "RemoveContainer" containerID="2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7" Mar 20 08:35:52.462255 master-0 kubenswrapper[3976]: I0320 08:35:52.462228 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7"} err="failed to get container status \"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7\": rpc error: code = NotFound desc = could not find container \"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7\": container with ID starting with 2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7 not found: ID does not exist" Mar 20 08:35:52.462255 master-0 kubenswrapper[3976]: I0320 08:35:52.462249 3976 scope.go:117] "RemoveContainer" containerID="bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3" Mar 20 08:35:52.462500 master-0 kubenswrapper[3976]: I0320 08:35:52.462478 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3"} err="failed to get container status \"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3\": rpc error: code = NotFound desc = could not find container \"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3\": container with ID starting with bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3 not found: ID does not exist" Mar 20 08:35:52.462500 master-0 kubenswrapper[3976]: I0320 08:35:52.462495 3976 scope.go:117] "RemoveContainer" containerID="4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53" Mar 20 08:35:52.462731 master-0 kubenswrapper[3976]: I0320 08:35:52.462706 3976 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53"} err="failed to get container status \"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53\": rpc error: code = NotFound desc = could not find container \"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53\": container with ID starting with 4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53 not found: ID does not exist" Mar 20 08:35:52.462731 master-0 kubenswrapper[3976]: I0320 08:35:52.462726 3976 scope.go:117] "RemoveContainer" containerID="016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e" Mar 20 08:35:52.462951 master-0 kubenswrapper[3976]: I0320 08:35:52.462928 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e"} err="failed to get container status \"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e\": rpc error: code = NotFound desc = could not find container \"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e\": container with ID starting with 016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e not found: ID does not exist" Mar 20 08:35:52.462951 master-0 kubenswrapper[3976]: I0320 08:35:52.462947 3976 scope.go:117] "RemoveContainer" containerID="1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1" Mar 20 08:35:52.463274 master-0 kubenswrapper[3976]: I0320 08:35:52.463253 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1"} err="failed to get container status \"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1\": rpc error: code = NotFound desc = could not find container \"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1\": container with ID starting with 
1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1 not found: ID does not exist" Mar 20 08:35:52.463319 master-0 kubenswrapper[3976]: I0320 08:35:52.463276 3976 scope.go:117] "RemoveContainer" containerID="0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8" Mar 20 08:35:52.463721 master-0 kubenswrapper[3976]: I0320 08:35:52.463653 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8"} err="failed to get container status \"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8\": rpc error: code = NotFound desc = could not find container \"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8\": container with ID starting with 0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8 not found: ID does not exist" Mar 20 08:35:52.463759 master-0 kubenswrapper[3976]: I0320 08:35:52.463720 3976 scope.go:117] "RemoveContainer" containerID="71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4" Mar 20 08:35:52.464089 master-0 kubenswrapper[3976]: I0320 08:35:52.464062 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4"} err="failed to get container status \"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4\": rpc error: code = NotFound desc = could not find container \"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4\": container with ID starting with 71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4 not found: ID does not exist" Mar 20 08:35:52.464119 master-0 kubenswrapper[3976]: I0320 08:35:52.464085 3976 scope.go:117] "RemoveContainer" containerID="452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212" Mar 20 08:35:52.464504 master-0 kubenswrapper[3976]: I0320 08:35:52.464475 3976 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212"} err="failed to get container status \"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212\": rpc error: code = NotFound desc = could not find container \"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212\": container with ID starting with 452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212 not found: ID does not exist" Mar 20 08:35:52.464504 master-0 kubenswrapper[3976]: I0320 08:35:52.464495 3976 scope.go:117] "RemoveContainer" containerID="b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4" Mar 20 08:35:52.464788 master-0 kubenswrapper[3976]: I0320 08:35:52.464764 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4"} err="failed to get container status \"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4\": rpc error: code = NotFound desc = could not find container \"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4\": container with ID starting with b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4 not found: ID does not exist" Mar 20 08:35:52.464788 master-0 kubenswrapper[3976]: I0320 08:35:52.464783 3976 scope.go:117] "RemoveContainer" containerID="2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7" Mar 20 08:35:52.465088 master-0 kubenswrapper[3976]: I0320 08:35:52.465041 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7"} err="failed to get container status \"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7\": rpc error: code = NotFound desc = could not find container \"2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7\": container with ID starting with 
2a805fb129587d5d2ae4e5cf80f85d5d379e439aa21895434050bc7acdb160b7 not found: ID does not exist" Mar 20 08:35:52.465088 master-0 kubenswrapper[3976]: I0320 08:35:52.465077 3976 scope.go:117] "RemoveContainer" containerID="bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3" Mar 20 08:35:52.465406 master-0 kubenswrapper[3976]: I0320 08:35:52.465377 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3"} err="failed to get container status \"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3\": rpc error: code = NotFound desc = could not find container \"bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3\": container with ID starting with bd10ebe1e8d30df5ab1ad5f5ef688be9d313c632ab347467743d04c5f1a7f4f3 not found: ID does not exist" Mar 20 08:35:52.465406 master-0 kubenswrapper[3976]: I0320 08:35:52.465401 3976 scope.go:117] "RemoveContainer" containerID="4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53" Mar 20 08:35:52.465775 master-0 kubenswrapper[3976]: I0320 08:35:52.465715 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53"} err="failed to get container status \"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53\": rpc error: code = NotFound desc = could not find container \"4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53\": container with ID starting with 4048076b591094430e6d171f467ea4ac36821dfb9e1cbdaee3361f080d438d53 not found: ID does not exist" Mar 20 08:35:52.465810 master-0 kubenswrapper[3976]: I0320 08:35:52.465772 3976 scope.go:117] "RemoveContainer" containerID="016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e" Mar 20 08:35:52.466330 master-0 kubenswrapper[3976]: I0320 08:35:52.466298 3976 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e"} err="failed to get container status \"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e\": rpc error: code = NotFound desc = could not find container \"016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e\": container with ID starting with 016961c51b4d9676bf0782a3f83ce7f61662a1e083740b6755da14686d4bd64e not found: ID does not exist" Mar 20 08:35:52.466368 master-0 kubenswrapper[3976]: I0320 08:35:52.466326 3976 scope.go:117] "RemoveContainer" containerID="1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1" Mar 20 08:35:52.466658 master-0 kubenswrapper[3976]: I0320 08:35:52.466634 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1"} err="failed to get container status \"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1\": rpc error: code = NotFound desc = could not find container \"1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1\": container with ID starting with 1cb8b49ef76efaf316d3b60c2d0c15b5580bbdd3e918961959484497c832e9b1 not found: ID does not exist" Mar 20 08:35:52.466658 master-0 kubenswrapper[3976]: I0320 08:35:52.466653 3976 scope.go:117] "RemoveContainer" containerID="0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8" Mar 20 08:35:52.467306 master-0 kubenswrapper[3976]: I0320 08:35:52.467237 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8"} err="failed to get container status \"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8\": rpc error: code = NotFound desc = could not find container \"0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8\": container with ID starting with 
0c633ef3368ba5eca5752e7486b9fc9d7812eeca66f4ec6486d958619fd22eb8 not found: ID does not exist" Mar 20 08:35:52.467348 master-0 kubenswrapper[3976]: I0320 08:35:52.467305 3976 scope.go:117] "RemoveContainer" containerID="71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4" Mar 20 08:35:52.467888 master-0 kubenswrapper[3976]: I0320 08:35:52.467833 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4"} err="failed to get container status \"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4\": rpc error: code = NotFound desc = could not find container \"71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4\": container with ID starting with 71e36f445146c329f210f8725ac7b45ebd3a724216e64fd1261f86f5bd2a45b4 not found: ID does not exist" Mar 20 08:35:52.467888 master-0 kubenswrapper[3976]: I0320 08:35:52.467880 3976 scope.go:117] "RemoveContainer" containerID="452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212" Mar 20 08:35:52.468405 master-0 kubenswrapper[3976]: I0320 08:35:52.468334 3976 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212"} err="failed to get container status \"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212\": rpc error: code = NotFound desc = could not find container \"452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212\": container with ID starting with 452594c023f71672e7e3126570125257f0bf3ef31609339cd76ed40051c6a212 not found: ID does not exist" Mar 20 08:35:52.468453 master-0 kubenswrapper[3976]: I0320 08:35:52.468408 3976 scope.go:117] "RemoveContainer" containerID="b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4" Mar 20 08:35:52.469024 master-0 kubenswrapper[3976]: I0320 08:35:52.468986 3976 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4"} err="failed to get container status \"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4\": rpc error: code = NotFound desc = could not find container \"b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4\": container with ID starting with b9f05df495a2544daa8e3a81110cb885a85787162489f04ad7f0358d572461c4 not found: ID does not exist" Mar 20 08:35:52.523523 master-0 kubenswrapper[3976]: I0320 08:35:52.523451 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:52.642889 master-0 kubenswrapper[3976]: I0320 08:35:52.642834 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:52.643060 master-0 kubenswrapper[3976]: E0320 08:35:52.643018 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:52.742179 master-0 kubenswrapper[3976]: I0320 08:35:52.741912 3976 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nsmzc"] Mar 20 08:35:52.749324 master-0 kubenswrapper[3976]: I0320 08:35:52.746542 3976 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nsmzc"] Mar 20 08:35:53.337574 master-0 kubenswrapper[3976]: I0320 08:35:53.337472 3976 generic.go:334] "Generic (PLEG): container finished" podID="248a3d2f-3be4-46bf-959c-79d28736c0d6" containerID="f4a74ff585c6a7d1deca8c58f38e8ca10a816620bf09146c1f9ff9a31d89c1a7" exitCode=0 Mar 20 08:35:53.337574 master-0 kubenswrapper[3976]: I0320 08:35:53.337540 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerDied","Data":"f4a74ff585c6a7d1deca8c58f38e8ca10a816620bf09146c1f9ff9a31d89c1a7"} Mar 20 08:35:53.337574 master-0 kubenswrapper[3976]: I0320 08:35:53.337584 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"ef2664ab7688b2c180ba8801467801f25c2ef381f1cb268907fa44549876a809"} Mar 20 08:35:53.647256 master-0 kubenswrapper[3976]: I0320 08:35:53.646959 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:53.647256 master-0 kubenswrapper[3976]: E0320 08:35:53.647150 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:53.656537 master-0 kubenswrapper[3976]: I0320 08:35:53.654540 3976 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3dc1be4-f742-47cc-95b6-82e0bc34a716" path="/var/lib/kubelet/pods/d3dc1be4-f742-47cc-95b6-82e0bc34a716/volumes" Mar 20 08:35:54.167268 master-0 kubenswrapper[3976]: I0320 08:35:54.166841 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2zzd\" (UniqueName: \"kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd\") pod \"network-check-target-xnrw6\" (UID: \"c0142d4e-9fd4-4375-a773-bb89b38af654\") " pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:54.167268 master-0 kubenswrapper[3976]: E0320 08:35:54.166990 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:35:54.167268 master-0 kubenswrapper[3976]: E0320 08:35:54.167035 3976 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:35:54.167268 master-0 kubenswrapper[3976]: E0320 08:35:54.167056 3976 projected.go:194] Error preparing data for projected volume kube-api-access-w2zzd for pod openshift-network-diagnostics/network-check-target-xnrw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:54.167268 master-0 kubenswrapper[3976]: E0320 08:35:54.167144 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd podName:c0142d4e-9fd4-4375-a773-bb89b38af654 nodeName:}" failed. 
No retries permitted until 2026-03-20 08:36:26.167119756 +0000 UTC m=+157.435943083 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-w2zzd" (UniqueName: "kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd") pod "network-check-target-xnrw6" (UID: "c0142d4e-9fd4-4375-a773-bb89b38af654") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:35:54.347061 master-0 kubenswrapper[3976]: I0320 08:35:54.346973 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"c852ab295eb4b8ecb5f260bea21018b1e68a8d8bb5cfc3cbdbcada9f4439cadb"} Mar 20 08:35:54.347061 master-0 kubenswrapper[3976]: I0320 08:35:54.347064 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"dae890376a07a525dd48bc3450179c32c083e3bb463f45b933386a53e1383fa6"} Mar 20 08:35:54.347061 master-0 kubenswrapper[3976]: I0320 08:35:54.347086 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"346250461efb9157130994da8b48e7d0351db81bbd48a603a6aff3e21924579d"} Mar 20 08:35:54.348207 master-0 kubenswrapper[3976]: I0320 08:35:54.347104 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"64ff076f67738b1cdfe3015490df851855bf3f60f28119fcbb4633f4e27fd2e7"} Mar 20 08:35:54.348207 master-0 kubenswrapper[3976]: I0320 08:35:54.347122 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"7169d72dd34d5edaa756497d6149ce488f989f970423b41a486bae8df6c73e89"} Mar 20 08:35:54.348207 master-0 kubenswrapper[3976]: I0320 08:35:54.347139 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"f48fa507ff5a87b0b98dbe8df038ac6da6336decb46491d119e2c7e9b5563a25"} Mar 20 08:35:54.642413 master-0 kubenswrapper[3976]: I0320 08:35:54.642315 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:54.642683 master-0 kubenswrapper[3976]: E0320 08:35:54.642571 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:55.068046 master-0 kubenswrapper[3976]: E0320 08:35:55.067937 3976 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 08:35:55.643356 master-0 kubenswrapper[3976]: I0320 08:35:55.643289 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:55.644257 master-0 kubenswrapper[3976]: E0320 08:35:55.643746 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:56.362023 master-0 kubenswrapper[3976]: I0320 08:35:56.361945 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"97323a42971f498c1d4a021f8b56f02bad8b5f835d83cca8811f50e11754376d"} Mar 20 08:35:56.642531 master-0 kubenswrapper[3976]: I0320 08:35:56.642349 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:56.642907 master-0 kubenswrapper[3976]: E0320 08:35:56.642562 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:57.642254 master-0 kubenswrapper[3976]: I0320 08:35:57.642124 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:57.643453 master-0 kubenswrapper[3976]: E0320 08:35:57.642361 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:58.642965 master-0 kubenswrapper[3976]: I0320 08:35:58.642511 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:58.643934 master-0 kubenswrapper[3976]: E0320 08:35:58.643353 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:59.381255 master-0 kubenswrapper[3976]: I0320 08:35:59.381167 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"5bde0c42e4478c0c2b1f9cfccbfe9763429b578e44acd46f68165c02cc1775d9"} Mar 20 08:35:59.381614 master-0 kubenswrapper[3976]: I0320 08:35:59.381521 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:59.381614 master-0 kubenswrapper[3976]: I0320 08:35:59.381581 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:59.410523 master-0 kubenswrapper[3976]: I0320 08:35:59.408076 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:35:59.435663 master-0 kubenswrapper[3976]: I0320 08:35:59.435590 3976 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" podStartSLOduration=7.435565516 podStartE2EDuration="7.435565516s" podCreationTimestamp="2026-03-20 08:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:35:59.408495242 +0000 UTC m=+130.677318519" watchObservedRunningTime="2026-03-20 08:35:59.435565516 +0000 UTC m=+130.704388803" Mar 20 08:35:59.644313 master-0 kubenswrapper[3976]: I0320 08:35:59.642706 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:35:59.644313 master-0 kubenswrapper[3976]: E0320 08:35:59.643630 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:35:59.659304 master-0 kubenswrapper[3976]: I0320 08:35:59.659241 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Mar 20 08:35:59.814478 master-0 kubenswrapper[3976]: I0320 08:35:59.813940 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xnrw6"] Mar 20 08:35:59.814721 master-0 kubenswrapper[3976]: I0320 08:35:59.814676 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:35:59.815900 master-0 kubenswrapper[3976]: E0320 08:35:59.814789 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:35:59.815900 master-0 kubenswrapper[3976]: I0320 08:35:59.815692 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-srdjm"] Mar 20 08:36:00.068678 master-0 kubenswrapper[3976]: E0320 08:36:00.068566 3976 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 08:36:00.384639 master-0 kubenswrapper[3976]: I0320 08:36:00.384568 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:00.384932 master-0 kubenswrapper[3976]: E0320 08:36:00.384695 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:36:00.385563 master-0 kubenswrapper[3976]: I0320 08:36:00.385504 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:00.411732 master-0 kubenswrapper[3976]: I0320 08:36:00.411441 3976 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:01.642297 master-0 kubenswrapper[3976]: I0320 08:36:01.642172 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:36:01.642297 master-0 kubenswrapper[3976]: I0320 08:36:01.642267 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:01.643305 master-0 kubenswrapper[3976]: E0320 08:36:01.642487 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:36:01.643305 master-0 kubenswrapper[3976]: E0320 08:36:01.642568 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:36:03.643213 master-0 kubenswrapper[3976]: I0320 08:36:03.643085 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:36:03.644838 master-0 kubenswrapper[3976]: I0320 08:36:03.643235 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:03.644838 master-0 kubenswrapper[3976]: E0320 08:36:03.643480 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnrw6" podUID="c0142d4e-9fd4-4375-a773-bb89b38af654" Mar 20 08:36:03.644838 master-0 kubenswrapper[3976]: E0320 08:36:03.643664 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srdjm" podUID="813f91c2-2b37-4681-968d-4217e286e22f" Mar 20 08:36:05.643033 master-0 kubenswrapper[3976]: I0320 08:36:05.642942 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:36:05.644112 master-0 kubenswrapper[3976]: I0320 08:36:05.642996 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:05.646915 master-0 kubenswrapper[3976]: I0320 08:36:05.646854 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 08:36:05.647498 master-0 kubenswrapper[3976]: I0320 08:36:05.647461 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 08:36:05.649349 master-0 kubenswrapper[3976]: I0320 08:36:05.647826 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 08:36:06.086613 master-0 kubenswrapper[3976]: I0320 08:36:06.086506 3976 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady" Mar 20 08:36:06.914944 master-0 kubenswrapper[3976]: I0320 08:36:06.914772 3976 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=7.914738895 podStartE2EDuration="7.914738895s" podCreationTimestamp="2026-03-20 08:35:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:36:00.449342446 +0000 UTC m=+131.718165733" watchObservedRunningTime="2026-03-20 08:36:06.914738895 +0000 UTC m=+138.183562212" Mar 20 08:36:06.916747 master-0 kubenswrapper[3976]: I0320 08:36:06.915322 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh"] Mar 20 08:36:06.916747 master-0 kubenswrapper[3976]: I0320 08:36:06.915968 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh" Mar 20 08:36:06.918414 master-0 kubenswrapper[3976]: I0320 08:36:06.918343 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 20 08:36:06.919244 master-0 kubenswrapper[3976]: I0320 08:36:06.919203 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 20 08:36:06.965319 master-0 kubenswrapper[3976]: I0320 08:36:06.954267 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742"] Mar 20 08:36:06.966262 master-0 kubenswrapper[3976]: I0320 08:36:06.966219 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l"] Mar 20 08:36:06.966568 master-0 kubenswrapper[3976]: I0320 08:36:06.966527 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:36:06.967512 master-0 kubenswrapper[3976]: I0320 08:36:06.967493 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj"] Mar 20 08:36:06.967928 master-0 kubenswrapper[3976]: I0320 08:36:06.967723 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"] Mar 20 08:36:06.969102 master-0 kubenswrapper[3976]: I0320 08:36:06.968046 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:06.969102 master-0 kubenswrapper[3976]: I0320 08:36:06.968448 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"] Mar 20 08:36:06.969102 master-0 kubenswrapper[3976]: I0320 08:36:06.968503 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:06.969102 master-0 kubenswrapper[3976]: I0320 08:36:06.968537 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" Mar 20 08:36:06.969297 master-0 kubenswrapper[3976]: I0320 08:36:06.969107 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h"] Mar 20 08:36:06.969398 master-0 kubenswrapper[3976]: I0320 08:36:06.969374 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:06.969682 master-0 kubenswrapper[3976]: I0320 08:36:06.969653 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:06.970549 master-0 kubenswrapper[3976]: I0320 08:36:06.970501 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"] Mar 20 08:36:06.974558 master-0 kubenswrapper[3976]: I0320 08:36:06.971742 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77"] Mar 20 08:36:06.974558 master-0 kubenswrapper[3976]: I0320 08:36:06.972031 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6"] Mar 20 08:36:06.974558 master-0 kubenswrapper[3976]: I0320 08:36:06.972361 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:36:06.974558 master-0 kubenswrapper[3976]: I0320 08:36:06.972719 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:06.974558 master-0 kubenswrapper[3976]: I0320 08:36:06.972967 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:06.975975 master-0 kubenswrapper[3976]: I0320 08:36:06.975943 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 20 08:36:06.976231 master-0 kubenswrapper[3976]: I0320 08:36:06.976207 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 20 08:36:06.976463 master-0 kubenswrapper[3976]: I0320 08:36:06.976411 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 20 08:36:06.976605 master-0 kubenswrapper[3976]: I0320 08:36:06.976575 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 20 08:36:06.976885 master-0 kubenswrapper[3976]: I0320 08:36:06.976869 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 20 08:36:06.977105 master-0 kubenswrapper[3976]: I0320 08:36:06.977092 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 20 08:36:06.977331 master-0 kubenswrapper[3976]: I0320 08:36:06.977318 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 20 08:36:06.978397 master-0 kubenswrapper[3976]: I0320 08:36:06.978350 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"] Mar 20 08:36:06.979028 master-0 kubenswrapper[3976]: I0320 08:36:06.978987 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"] Mar 20 
08:36:06.980596 master-0 kubenswrapper[3976]: I0320 08:36:06.979646 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" Mar 20 08:36:06.980596 master-0 kubenswrapper[3976]: I0320 08:36:06.980258 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:06.981250 master-0 kubenswrapper[3976]: I0320 08:36:06.980877 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5"] Mar 20 08:36:06.981941 master-0 kubenswrapper[3976]: I0320 08:36:06.981341 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"] Mar 20 08:36:06.981941 master-0 kubenswrapper[3976]: I0320 08:36:06.981650 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h"] Mar 20 08:36:06.981941 master-0 kubenswrapper[3976]: I0320 08:36:06.981746 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:36:06.981941 master-0 kubenswrapper[3976]: I0320 08:36:06.981773 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:36:06.982396 master-0 kubenswrapper[3976]: I0320 08:36:06.982328 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" Mar 20 08:36:06.982718 master-0 kubenswrapper[3976]: I0320 08:36:06.982683 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 08:36:06.983883 master-0 kubenswrapper[3976]: I0320 08:36:06.983844 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 08:36:06.984073 master-0 kubenswrapper[3976]: I0320 08:36:06.984049 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 08:36:06.985340 master-0 kubenswrapper[3976]: I0320 08:36:06.985314 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 08:36:06.985507 master-0 kubenswrapper[3976]: I0320 08:36:06.985480 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 08:36:06.985653 master-0 kubenswrapper[3976]: I0320 08:36:06.985630 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 08:36:06.985961 master-0 kubenswrapper[3976]: I0320 08:36:06.985939 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 08:36:06.986261 master-0 kubenswrapper[3976]: I0320 08:36:06.986234 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 08:36:06.986885 master-0 kubenswrapper[3976]: I0320 08:36:06.986831 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 08:36:06.987144 master-0 
kubenswrapper[3976]: I0320 08:36:06.987090 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 08:36:06.987406 master-0 kubenswrapper[3976]: I0320 08:36:06.987347 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 08:36:06.987734 master-0 kubenswrapper[3976]: I0320 08:36:06.987661 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 08:36:06.988177 master-0 kubenswrapper[3976]: I0320 08:36:06.988152 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 08:36:06.995913 master-0 kubenswrapper[3976]: I0320 08:36:06.995845 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 08:36:06.996444 master-0 kubenswrapper[3976]: I0320 08:36:06.996390 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 08:36:06.996499 master-0 kubenswrapper[3976]: I0320 08:36:06.996443 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 08:36:06.996499 master-0 kubenswrapper[3976]: I0320 08:36:06.996475 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 08:36:06.996838 master-0 kubenswrapper[3976]: I0320 08:36:06.996680 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 08:36:06.996838 master-0 kubenswrapper[3976]: I0320 08:36:06.996792 3976 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 20 08:36:06.996838 master-0 kubenswrapper[3976]: I0320 08:36:06.996830 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 08:36:06.997074 master-0 kubenswrapper[3976]: I0320 08:36:06.997051 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 20 08:36:06.997149 master-0 kubenswrapper[3976]: I0320 08:36:06.997051 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 20 08:36:06.997149 master-0 kubenswrapper[3976]: I0320 08:36:06.997137 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 08:36:06.997294 master-0 kubenswrapper[3976]: I0320 08:36:06.997179 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 08:36:06.997399 master-0 kubenswrapper[3976]: I0320 08:36:06.997367 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 08:36:06.997511 master-0 kubenswrapper[3976]: I0320 08:36:06.997459 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 08:36:06.997511 master-0 kubenswrapper[3976]: I0320 08:36:06.997476 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 08:36:06.997821 master-0 kubenswrapper[3976]: I0320 08:36:06.997789 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 20 08:36:06.998159 master-0 kubenswrapper[3976]: I0320 
08:36:06.998101 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"] Mar 20 08:36:06.998302 master-0 kubenswrapper[3976]: I0320 08:36:06.998288 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 08:36:06.998549 master-0 kubenswrapper[3976]: I0320 08:36:06.998536 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 08:36:06.998736 master-0 kubenswrapper[3976]: I0320 08:36:06.998725 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 08:36:06.998948 master-0 kubenswrapper[3976]: I0320 08:36:06.998920 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-r6dm8"] Mar 20 08:36:06.999240 master-0 kubenswrapper[3976]: I0320 08:36:06.999219 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:36:06.999351 master-0 kubenswrapper[3976]: I0320 08:36:06.999334 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa16c3bf-2350-46d1-afa0-9477b3ec8877-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" Mar 20 08:36:06.999447 master-0 kubenswrapper[3976]: I0320 
08:36:06.999431 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:06.999526 master-0 kubenswrapper[3976]: I0320 08:36:06.999514 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:06.999598 master-0 kubenswrapper[3976]: I0320 08:36:06.999583 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:06.999675 master-0 kubenswrapper[3976]: I0320 08:36:06.999661 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:06.999750 master-0 kubenswrapper[3976]: I0320 08:36:06.999737 3976 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpdk5\" (UniqueName: \"kubernetes.io/projected/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-kube-api-access-lpdk5\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:06.999828 master-0 kubenswrapper[3976]: I0320 08:36:06.999813 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68252533-bd64-4fc5-838a-cc350cbe77f0-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:36:06.999898 master-0 kubenswrapper[3976]: I0320 08:36:06.999884 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab175f7e-a5e8-4fda-98c9-6d052a221a83-serving-cert\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:36:06.999966 master-0 kubenswrapper[3976]: I0320 08:36:06.999952 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqdlf\" (UniqueName: \"kubernetes.io/projected/325f0a83-d56d-4b62-977b-088a7d5f0e00-kube-api-access-lqdlf\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" Mar 20 08:36:07.000038 master-0 kubenswrapper[3976]: I0320 08:36:07.000027 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s69rd\" (UniqueName: \"kubernetes.io/projected/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-kube-api-access-s69rd\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:07.000105 master-0 kubenswrapper[3976]: I0320 08:36:07.000092 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" Mar 20 08:36:07.000231 master-0 kubenswrapper[3976]: I0320 08:36:07.000217 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:07.000333 master-0 kubenswrapper[3976]: I0320 08:36:07.000300 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ab175f7e-a5e8-4fda-98c9-6d052a221a83-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:36:07.000402 master-0 kubenswrapper[3976]: I0320 08:36:07.000390 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/86cb5d23-df7f-4f67-8086-1789d8e68544-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:36:07.000471 master-0 kubenswrapper[3976]: I0320 08:36:07.000459 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68252533-bd64-4fc5-838a-cc350cbe77f0-config\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:36:07.000534 master-0 kubenswrapper[3976]: I0320 08:36:07.000523 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/325f0a83-d56d-4b62-977b-088a7d5f0e00-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" Mar 20 08:36:07.000600 master-0 kubenswrapper[3976]: I0320 08:36:07.000589 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27qvw\" (UniqueName: \"kubernetes.io/projected/bbc0b783-28d5-4554-b49d-c66082546f44-kube-api-access-27qvw\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:07.000677 master-0 kubenswrapper[3976]: I0320 08:36:07.000665 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75e3e2cc-aa56-41f3-8859-1c086f419d05-serving-cert\") pod 
\"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:07.000922 master-0 kubenswrapper[3976]: I0320 08:36:07.000910 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btdjt\" (UniqueName: \"kubernetes.io/projected/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-kube-api-access-btdjt\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:36:07.000988 master-0 kubenswrapper[3976]: I0320 08:36:07.000974 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:07.001062 master-0 kubenswrapper[3976]: I0320 08:36:07.001050 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:07.001136 master-0 kubenswrapper[3976]: I0320 08:36:07.001125 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/86cb5d23-df7f-4f67-8086-1789d8e68544-operand-assets\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: 
\"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:36:07.001218 master-0 kubenswrapper[3976]: I0320 08:36:07.001204 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2d4r\" (UniqueName: \"kubernetes.io/projected/f53bc282-5937-49ac-ac98-2ee37ccb268d-kube-api-access-b2d4r\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:07.001298 master-0 kubenswrapper[3976]: I0320 08:36:07.001286 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75r7k\" (UniqueName: \"kubernetes.io/projected/68252533-bd64-4fc5-838a-cc350cbe77f0-kube-api-access-75r7k\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:36:07.012583 master-0 kubenswrapper[3976]: I0320 08:36:07.012310 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7"] Mar 20 08:36:07.013687 master-0 kubenswrapper[3976]: I0320 08:36:07.013651 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:36:07.018984 master-0 kubenswrapper[3976]: I0320 08:36:06.999517 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 08:36:07.019686 master-0 kubenswrapper[3976]: I0320 08:36:07.019577 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"] Mar 20 08:36:07.025382 master-0 kubenswrapper[3976]: I0320 08:36:06.999396 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:07.025382 master-0 kubenswrapper[3976]: I0320 08:36:06.999441 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:07.025382 master-0 kubenswrapper[3976]: I0320 08:36:06.999559 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 08:36:07.025382 master-0 kubenswrapper[3976]: I0320 08:36:07.022440 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:07.025382 master-0 kubenswrapper[3976]: I0320 08:36:06.999599 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 08:36:07.025382 master-0 kubenswrapper[3976]: I0320 08:36:06.999648 3976 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 20 08:36:07.025382 master-0 kubenswrapper[3976]: I0320 08:36:06.999706 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 08:36:07.025382 master-0 kubenswrapper[3976]: I0320 08:36:06.999828 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 08:36:07.025382 master-0 kubenswrapper[3976]: I0320 08:36:06.999866 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 08:36:07.025382 master-0 kubenswrapper[3976]: I0320 08:36:06.999908 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 08:36:07.025382 master-0 kubenswrapper[3976]: I0320 08:36:07.000698 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 08:36:07.025975 master-0 kubenswrapper[3976]: I0320 08:36:07.000767 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 08:36:07.028861 master-0 kubenswrapper[3976]: I0320 08:36:07.028807 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:36:07.033990 master-0 kubenswrapper[3976]: I0320 08:36:07.033916 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:07.034404 master-0 kubenswrapper[3976]: I0320 08:36:07.034370 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4"] Mar 20 08:36:07.043340 master-0 kubenswrapper[3976]: I0320 08:36:07.043285 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"] Mar 20 08:36:07.043940 master-0 kubenswrapper[3976]: I0320 08:36:07.043906 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:07.044238 master-0 kubenswrapper[3976]: I0320 08:36:07.044210 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:36:07.045753 master-0 kubenswrapper[3976]: I0320 08:36:07.042182 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e3e2cc-aa56-41f3-8859-1c086f419d05-config\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:07.045975 master-0 kubenswrapper[3976]: I0320 08:36:07.045940 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clz6w\" (UniqueName: \"kubernetes.io/projected/75e3e2cc-aa56-41f3-8859-1c086f419d05-kube-api-access-clz6w\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:07.046046 master-0 kubenswrapper[3976]: I0320 08:36:07.045982 3976 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"
Mar 20 08:36:07.046138 master-0 kubenswrapper[3976]: I0320 08:36:07.046090 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77"
Mar 20 08:36:07.046243 master-0 kubenswrapper[3976]: I0320 08:36:07.046162 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-images\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:07.046243 master-0 kubenswrapper[3976]: I0320 08:36:07.046229 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvjkr\" (UniqueName: \"kubernetes.io/projected/7c4e7e57-43be-4d31-b523-f7e4d316dce3-kube-api-access-bvjkr\") pod \"csi-snapshot-controller-operator-5f5d689c6b-4dqrh\" (UID: \"7c4e7e57-43be-4d31-b523-f7e4d316dce3\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh"
Mar 20 08:36:07.046358 master-0 kubenswrapper[3976]: I0320 08:36:07.046251 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg6th\" (UniqueName: \"kubernetes.io/projected/7b489385-2c96-4a97-8b31-362162de020e-kube-api-access-pg6th\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l"
Mar 20 08:36:07.046358 master-0 kubenswrapper[3976]: I0320 08:36:07.046304 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:07.046358 master-0 kubenswrapper[3976]: I0320 08:36:07.046323 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-serving-cert\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:36:07.046358 master-0 kubenswrapper[3976]: I0320 08:36:07.046351 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmndg\" (UniqueName: \"kubernetes.io/projected/aa16c3bf-2350-46d1-afa0-9477b3ec8877-kube-api-access-qmndg\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h"
Mar 20 08:36:07.046585 master-0 kubenswrapper[3976]: I0320 08:36:07.046368 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:36:07.046585 master-0 kubenswrapper[3976]: I0320 08:36:07.046395 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7k8m\" (UniqueName: \"kubernetes.io/projected/86cb5d23-df7f-4f67-8086-1789d8e68544-kube-api-access-j7k8m\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742"
Mar 20 08:36:07.046585 master-0 kubenswrapper[3976]: I0320 08:36:07.046413 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/325f0a83-d56d-4b62-977b-088a7d5f0e00-config\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj"
Mar 20 08:36:07.046585 master-0 kubenswrapper[3976]: I0320 08:36:07.046429 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-config\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:36:07.046585 master-0 kubenswrapper[3976]: I0320 08:36:07.046453 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4291bfd-53d9-4c78-b7cb-d7eb46560528-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:36:07.046585 master-0 kubenswrapper[3976]: I0320 08:36:07.046478 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nvl4\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-kube-api-access-9nvl4\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:36:07.046585 master-0 kubenswrapper[3976]: I0320 08:36:07.046497 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmrxh\" (UniqueName: \"kubernetes.io/projected/ab175f7e-a5e8-4fda-98c9-6d052a221a83-kube-api-access-zmrxh\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:36:07.046585 master-0 kubenswrapper[3976]: I0320 08:36:07.046515 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-config\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:07.046585 master-0 kubenswrapper[3976]: I0320 08:36:07.046530 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-config\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:36:07.047019 master-0 kubenswrapper[3976]: I0320 08:36:07.046929 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 08:36:07.047175 master-0 kubenswrapper[3976]: I0320 08:36:07.047141 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"]
Mar 20 08:36:07.047827 master-0 kubenswrapper[3976]: I0320 08:36:07.047782 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"
Mar 20 08:36:07.048221 master-0 kubenswrapper[3976]: I0320 08:36:07.048172 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"]
Mar 20 08:36:07.048614 master-0 kubenswrapper[3976]: I0320 08:36:07.048581 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:36:07.049344 master-0 kubenswrapper[3976]: I0320 08:36:07.049304 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 20 08:36:07.049642 master-0 kubenswrapper[3976]: I0320 08:36:07.049615 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 20 08:36:07.049794 master-0 kubenswrapper[3976]: I0320 08:36:07.049770 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 08:36:07.052943 master-0 kubenswrapper[3976]: I0320 08:36:07.052906 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 20 08:36:07.058240 master-0 kubenswrapper[3976]: I0320 08:36:07.057197 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 08:36:07.058574 master-0 kubenswrapper[3976]: I0320 08:36:07.058536 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28"]
Mar 20 08:36:07.060265 master-0 kubenswrapper[3976]: I0320 08:36:07.059331 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"]
Mar 20 08:36:07.060265 master-0 kubenswrapper[3976]: I0320 08:36:07.059445 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28"
Mar 20 08:36:07.060265 master-0 kubenswrapper[3976]: I0320 08:36:07.059457 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 08:36:07.060265 master-0 kubenswrapper[3976]: I0320 08:36:07.060139 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 08:36:07.060423 master-0 kubenswrapper[3976]: I0320 08:36:07.060394 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh"]
Mar 20 08:36:07.060460 master-0 kubenswrapper[3976]: I0320 08:36:07.060409 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 20 08:36:07.060639 master-0 kubenswrapper[3976]: I0320 08:36:07.060622 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.061784 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.061918 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.062169 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.062238 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.062375 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.062213 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.062526 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.062688 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.062750 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.063227 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.063310 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.063506 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.063777 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.063863 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 20 08:36:07.063892 master-0 kubenswrapper[3976]: I0320 08:36:07.063913 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 20 08:36:07.064340 master-0 kubenswrapper[3976]: I0320 08:36:07.063992 3976 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 08:36:07.064340 master-0 kubenswrapper[3976]: I0320 08:36:07.064050 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 20 08:36:07.064340 master-0 kubenswrapper[3976]: I0320 08:36:07.064166 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 08:36:07.064340 master-0 kubenswrapper[3976]: I0320 08:36:07.064312 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 20 08:36:07.064491 master-0 kubenswrapper[3976]: I0320 08:36:07.064428 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 20 08:36:07.064713 master-0 kubenswrapper[3976]: I0320 08:36:07.064536 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 20 08:36:07.064713 master-0 kubenswrapper[3976]: I0320 08:36:07.064660 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 20 08:36:07.064713 master-0 kubenswrapper[3976]: I0320 08:36:07.064675 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 20 08:36:07.065793 master-0 kubenswrapper[3976]: I0320 08:36:07.065758 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742"]
Mar 20 08:36:07.069795 master-0 kubenswrapper[3976]: I0320 08:36:07.067983 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l"]
Mar 20 08:36:07.069795 master-0 kubenswrapper[3976]: I0320 08:36:07.068612 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"]
Mar 20 08:36:07.072031 master-0 kubenswrapper[3976]: I0320 08:36:07.070210 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h"]
Mar 20 08:36:07.072031 master-0 kubenswrapper[3976]: I0320 08:36:07.071088 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 08:36:07.072031 master-0 kubenswrapper[3976]: I0320 08:36:07.071561 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 20 08:36:07.072627 master-0 kubenswrapper[3976]: I0320 08:36:07.072585 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 20 08:36:07.074722 master-0 kubenswrapper[3976]: I0320 08:36:07.073501 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"]
Mar 20 08:36:07.074722 master-0 kubenswrapper[3976]: I0320 08:36:07.074085 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 20 08:36:07.084531 master-0 kubenswrapper[3976]: I0320 08:36:07.084466 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 08:36:07.087293 master-0 kubenswrapper[3976]: I0320 08:36:07.086288 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"]
Mar 20 08:36:07.087293 master-0 kubenswrapper[3976]: I0320 08:36:07.087259 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 08:36:07.089518 master-0 kubenswrapper[3976]: I0320 08:36:07.089297 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77"]
Mar 20 08:36:07.089518 master-0 kubenswrapper[3976]: I0320 08:36:07.089351 3976 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-dd9wv"]
Mar 20 08:36:07.090399 master-0 kubenswrapper[3976]: I0320 08:36:07.090367 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dd9wv"
Mar 20 08:36:07.100496 master-0 kubenswrapper[3976]: I0320 08:36:07.100463 3976 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 20 08:36:07.100846 master-0 kubenswrapper[3976]: I0320 08:36:07.100813 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj"]
Mar 20 08:36:07.101487 master-0 kubenswrapper[3976]: I0320 08:36:07.101453 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5"]
Mar 20 08:36:07.102961 master-0 kubenswrapper[3976]: I0320 08:36:07.102923 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"]
Mar 20 08:36:07.104901 master-0 kubenswrapper[3976]: I0320 08:36:07.103248 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-r6dm8"]
Mar 20 08:36:07.104901 master-0 kubenswrapper[3976]: I0320 08:36:07.104864 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"]
Mar 20 08:36:07.106277 master-0 kubenswrapper[3976]: I0320 08:36:07.106258 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"]
Mar 20 08:36:07.106479 master-0 kubenswrapper[3976]: I0320 08:36:07.106442 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6"]
Mar 20 08:36:07.109634 master-0 kubenswrapper[3976]: I0320 08:36:07.109146 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4"]
Mar 20 08:36:07.109634 master-0 kubenswrapper[3976]: I0320 08:36:07.109492 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"]
Mar 20 08:36:07.109634 master-0 kubenswrapper[3976]: I0320 08:36:07.109504 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h"]
Mar 20 08:36:07.111096 master-0 kubenswrapper[3976]: I0320 08:36:07.110375 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28"]
Mar 20 08:36:07.111629 master-0 kubenswrapper[3976]: I0320 08:36:07.111595 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"]
Mar 20 08:36:07.113108 master-0 kubenswrapper[3976]: I0320 08:36:07.112444 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7"]
Mar 20 08:36:07.113256 master-0 kubenswrapper[3976]: I0320 08:36:07.113211 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"]
Mar 20 08:36:07.114772 master-0 kubenswrapper[3976]: I0320 08:36:07.114011 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"]
Mar 20 08:36:07.115010 master-0 kubenswrapper[3976]: I0320 08:36:07.114871 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"]
Mar 20 08:36:07.115789 master-0 kubenswrapper[3976]: I0320 08:36:07.115757 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"]
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.146937 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-images\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.146977 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d57k\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-kube-api-access-8d57k\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.146998 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.147017 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-bound-sa-token\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.147039 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvjkr\" (UniqueName: \"kubernetes.io/projected/7c4e7e57-43be-4d31-b523-f7e4d316dce3-kube-api-access-bvjkr\") pod \"csi-snapshot-controller-operator-5f5d689c6b-4dqrh\" (UID: \"7c4e7e57-43be-4d31-b523-f7e4d316dce3\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.147058 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-224dg\" (UniqueName: \"kubernetes.io/projected/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-kube-api-access-224dg\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.147075 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvx6w\" (UniqueName: \"kubernetes.io/projected/acb704a9-6c8d-4378-ae93-e7095b1fce85-kube-api-access-xvx6w\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.147101 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg6th\" (UniqueName: \"kubernetes.io/projected/7b489385-2c96-4a97-8b31-362162de020e-kube-api-access-pg6th\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.147116 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.147130 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.147159 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.147179 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-serving-cert\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.147219 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.147238 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:36:07.147827 master-0 kubenswrapper[3976]: I0320 08:36:07.147260 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmndg\" (UniqueName: \"kubernetes.io/projected/aa16c3bf-2350-46d1-afa0-9477b3ec8877-kube-api-access-qmndg\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147305 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147327 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrws5\" (UniqueName: \"kubernetes.io/projected/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-kube-api-access-wrws5\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147347 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7k8m\" (UniqueName: \"kubernetes.io/projected/86cb5d23-df7f-4f67-8086-1789d8e68544-kube-api-access-j7k8m\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147366 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/325f0a83-d56d-4b62-977b-088a7d5f0e00-config\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147411 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-config\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147432 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l4b7\" (UniqueName: \"kubernetes.io/projected/ee3cc021-67d8-4b7f-b443-16f18228712e-kube-api-access-7l4b7\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147450 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/45e8b72b-564c-4bb1-b911-baff2d6c87ad-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147471 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee3cc021-67d8-4b7f-b443-16f18228712e-host-slash\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147490 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4291bfd-53d9-4c78-b7cb-d7eb46560528-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147508 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nvl4\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-kube-api-access-9nvl4\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147527 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmrxh\" (UniqueName: \"kubernetes.io/projected/ab175f7e-a5e8-4fda-98c9-6d052a221a83-kube-api-access-zmrxh\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147545 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-config\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147563 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-config\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147579 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-ca\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:07.148365 master-0 kubenswrapper[3976]: I0320 08:36:07.147598 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5"
Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147614 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57854ac-809a-4745-aaa1-774f0a08a560-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7"
Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147635 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-config\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147659 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"
Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147680 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa16c3bf-2350-46d1-afa0-9477b3ec8877-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h"
Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147697 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147733 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\"
(UniqueName: \"kubernetes.io/secret/fa759777-de22-4440-a3d3-ad429a3b8e7b-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147750 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa759777-de22-4440-a3d3-ad429a3b8e7b-config\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147766 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147785 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147805 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod 
\"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147823 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ee3cc021-67d8-4b7f-b443-16f18228712e-iptables-alerter-script\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147842 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpdk5\" (UniqueName: \"kubernetes.io/projected/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-kube-api-access-lpdk5\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147860 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-trusted-ca\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:07.149029 master-0 kubenswrapper[3976]: I0320 08:36:07.147876 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57854ac-809a-4745-aaa1-774f0a08a560-config\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" 
Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.147894 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhkh7\" (UniqueName: \"kubernetes.io/projected/f046860d-2d54-4746-8ba2-f8e90fa55e38-kube-api-access-xhkh7\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.147911 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.147933 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68252533-bd64-4fc5-838a-cc350cbe77f0-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.147954 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab175f7e-a5e8-4fda-98c9-6d052a221a83-serving-cert\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.147971 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.147992 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqdlf\" (UniqueName: \"kubernetes.io/projected/325f0a83-d56d-4b62-977b-088a7d5f0e00-kube-api-access-lqdlf\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.148012 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s69rd\" (UniqueName: \"kubernetes.io/projected/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-kube-api-access-s69rd\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.148028 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-serving-cert\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.148049 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ab175f7e-a5e8-4fda-98c9-6d052a221a83-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: 
\"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.148082 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/86cb5d23-df7f-4f67-8086-1789d8e68544-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.148100 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.148120 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.148139 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68252533-bd64-4fc5-838a-cc350cbe77f0-config\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:36:07.149590 master-0 
kubenswrapper[3976]: I0320 08:36:07.148157 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/325f0a83-d56d-4b62-977b-088a7d5f0e00-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" Mar 20 08:36:07.149590 master-0 kubenswrapper[3976]: I0320 08:36:07.148176 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148246 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtcnq\" (UniqueName: \"kubernetes.io/projected/df428d5a-c722-4536-8e7f-cdd85c560481-kube-api-access-dtcnq\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148263 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-images\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148283 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27qvw\" (UniqueName: 
\"kubernetes.io/projected/bbc0b783-28d5-4554-b49d-c66082546f44-kube-api-access-27qvw\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148301 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75e3e2cc-aa56-41f3-8859-1c086f419d05-serving-cert\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148320 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zpkn\" (UniqueName: \"kubernetes.io/projected/45e8b72b-564c-4bb1-b911-baff2d6c87ad-kube-api-access-9zpkn\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148337 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a57854ac-809a-4745-aaa1-774f0a08a560-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148356 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btdjt\" (UniqueName: \"kubernetes.io/projected/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-kube-api-access-btdjt\") pod 
\"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148377 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148403 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148422 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/86cb5d23-df7f-4f67-8086-1789d8e68544-operand-assets\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148438 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2d4r\" (UniqueName: \"kubernetes.io/projected/f53bc282-5937-49ac-ac98-2ee37ccb268d-kube-api-access-b2d4r\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " 
pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148454 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqd2v\" (UniqueName: \"kubernetes.io/projected/6a80bd6f-2263-4251-8197-5173193f8afc-kube-api-access-tqd2v\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148474 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75r7k\" (UniqueName: \"kubernetes.io/projected/68252533-bd64-4fc5-838a-cc350cbe77f0-kube-api-access-75r7k\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:36:07.150094 master-0 kubenswrapper[3976]: I0320 08:36:07.148491 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:07.150713 master-0 kubenswrapper[3976]: I0320 08:36:07.148508 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:36:07.150713 master-0 kubenswrapper[3976]: I0320 08:36:07.148524 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e3e2cc-aa56-41f3-8859-1c086f419d05-config\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:07.150713 master-0 kubenswrapper[3976]: I0320 08:36:07.148543 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clz6w\" (UniqueName: \"kubernetes.io/projected/75e3e2cc-aa56-41f3-8859-1c086f419d05-kube-api-access-clz6w\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:07.150713 master-0 kubenswrapper[3976]: I0320 08:36:07.148561 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:07.150713 master-0 kubenswrapper[3976]: I0320 08:36:07.148580 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:07.150713 master-0 kubenswrapper[3976]: I0320 08:36:07.148596 3976 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa759777-de22-4440-a3d3-ad429a3b8e7b-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:36:07.150713 master-0 kubenswrapper[3976]: I0320 08:36:07.148613 3976 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-client\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:07.150713 master-0 kubenswrapper[3976]: I0320 08:36:07.148633 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:07.150713 master-0 kubenswrapper[3976]: E0320 08:36:07.148756 3976 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 20 08:36:07.150713 master-0 kubenswrapper[3976]: E0320 08:36:07.148804 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert podName:bbc0b783-28d5-4554-b49d-c66082546f44 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:07.648786445 +0000 UTC m=+138.917609732 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-2pg77" (UID: "bbc0b783-28d5-4554-b49d-c66082546f44") : secret "package-server-manager-serving-cert" not found Mar 20 08:36:07.150713 master-0 kubenswrapper[3976]: I0320 08:36:07.149834 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-images\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:07.151949 master-0 kubenswrapper[3976]: I0320 08:36:07.151901 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4291bfd-53d9-4c78-b7cb-d7eb46560528-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:07.152104 master-0 kubenswrapper[3976]: E0320 08:36:07.152074 3976 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:07.152162 master-0 kubenswrapper[3976]: E0320 08:36:07.152145 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:07.652121212 +0000 UTC m=+138.920944499 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:07.154001 master-0 kubenswrapper[3976]: I0320 08:36:07.153729 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:07.154610 master-0 kubenswrapper[3976]: I0320 08:36:07.154409 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ab175f7e-a5e8-4fda-98c9-6d052a221a83-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:36:07.154610 master-0 kubenswrapper[3976]: I0320 08:36:07.154449 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68252533-bd64-4fc5-838a-cc350cbe77f0-config\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:36:07.155693 master-0 kubenswrapper[3976]: I0320 08:36:07.155643 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/325f0a83-d56d-4b62-977b-088a7d5f0e00-config\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj"
Mar 20 08:36:07.155954 master-0 kubenswrapper[3976]: E0320 08:36:07.155924 3976 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 20 08:36:07.156060 master-0 kubenswrapper[3976]: E0320 08:36:07.156009 3976 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 20 08:36:07.156106 master-0 kubenswrapper[3976]: E0320 08:36:07.156055 3976 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 20 08:36:07.156138 master-0 kubenswrapper[3976]: E0320 08:36:07.156061 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:07.656044677 +0000 UTC m=+138.924867964 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "node-tuning-operator-tls" not found
Mar 20 08:36:07.156179 master-0 kubenswrapper[3976]: E0320 08:36:07.156156 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:07.656128389 +0000 UTC m=+138.924951856 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-operator-tls" not found
Mar 20 08:36:07.156496 master-0 kubenswrapper[3976]: I0320 08:36:07.156466 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/86cb5d23-df7f-4f67-8086-1789d8e68544-operand-assets\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742"
Mar 20 08:36:07.156565 master-0 kubenswrapper[3976]: E0320 08:36:07.156539 3976 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 20 08:36:07.156602 master-0 kubenswrapper[3976]: E0320 08:36:07.156581 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls podName:b4291bfd-53d9-4c78-b7cb-d7eb46560528 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:07.656570602 +0000 UTC m=+138.925394119 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-fmhbq" (UID: "b4291bfd-53d9-4c78-b7cb-d7eb46560528") : secret "image-registry-operator-tls" not found
Mar 20 08:36:07.156736 master-0 kubenswrapper[3976]: I0320 08:36:07.156698 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:36:07.157417 master-0 kubenswrapper[3976]: I0320 08:36:07.157379 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5"
Mar 20 08:36:07.157417 master-0 kubenswrapper[3976]: I0320 08:36:07.157394 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-config\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:36:07.158068 master-0 kubenswrapper[3976]: I0320 08:36:07.158016 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-config\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:07.160522 master-0 kubenswrapper[3976]: I0320 08:36:07.160483 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"
Mar 20 08:36:07.160722 master-0 kubenswrapper[3976]: I0320 08:36:07.160688 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68252533-bd64-4fc5-838a-cc350cbe77f0-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6"
Mar 20 08:36:07.161123 master-0 kubenswrapper[3976]: I0320 08:36:07.161092 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e3e2cc-aa56-41f3-8859-1c086f419d05-config\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h"
Mar 20 08:36:07.161198 master-0 kubenswrapper[3976]: E0320 08:36:07.161163 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert podName:7b489385-2c96-4a97-8b31-362162de020e nodeName:}" failed. No retries permitted until 2026-03-20 08:36:07.661149726 +0000 UTC m=+138.929973123 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert") pod "olm-operator-5c9796789-tjm9l" (UID: "7b489385-2c96-4a97-8b31-362162de020e") : secret "olm-operator-serving-cert" not found
Mar 20 08:36:07.161364 master-0 kubenswrapper[3976]: E0320 08:36:07.161340 3976 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 20 08:36:07.161430 master-0 kubenswrapper[3976]: E0320 08:36:07.161376 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:07.661368942 +0000 UTC m=+138.930192229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "performance-addon-operator-webhook-cert" not found
Mar 20 08:36:07.166048 master-0 kubenswrapper[3976]: I0320 08:36:07.165956 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-serving-cert\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:36:07.166583 master-0 kubenswrapper[3976]: I0320 08:36:07.166557 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-config\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:36:07.167307 master-0 kubenswrapper[3976]: I0320 08:36:07.167255 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75e3e2cc-aa56-41f3-8859-1c086f419d05-serving-cert\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h"
Mar 20 08:36:07.167425 master-0 kubenswrapper[3976]: I0320 08:36:07.167321 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5"
Mar 20 08:36:07.167425 master-0 kubenswrapper[3976]: I0320 08:36:07.167373 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/325f0a83-d56d-4b62-977b-088a7d5f0e00-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj"
Mar 20 08:36:07.167527 master-0 kubenswrapper[3976]: I0320 08:36:07.167460 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:36:07.167527 master-0 kubenswrapper[3976]: I0320 08:36:07.167492 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa16c3bf-2350-46d1-afa0-9477b3ec8877-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h"
Mar 20 08:36:07.169848 master-0 kubenswrapper[3976]: I0320 08:36:07.169768 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/86cb5d23-df7f-4f67-8086-1789d8e68544-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742"
Mar 20 08:36:07.170042 master-0 kubenswrapper[3976]: I0320 08:36:07.169901 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab175f7e-a5e8-4fda-98c9-6d052a221a83-serving-cert\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:36:07.171712 master-0 kubenswrapper[3976]: I0320 08:36:07.171661 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:36:07.171869 master-0 kubenswrapper[3976]: I0320 08:36:07.171819 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpdk5\" (UniqueName: \"kubernetes.io/projected/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-kube-api-access-lpdk5\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"
Mar 20 08:36:07.172560 master-0 kubenswrapper[3976]: I0320 08:36:07.172520 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg6th\" (UniqueName: \"kubernetes.io/projected/7b489385-2c96-4a97-8b31-362162de020e-kube-api-access-pg6th\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l"
Mar 20 08:36:07.173898 master-0 kubenswrapper[3976]: I0320 08:36:07.173855 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvjkr\" (UniqueName: \"kubernetes.io/projected/7c4e7e57-43be-4d31-b523-f7e4d316dce3-kube-api-access-bvjkr\") pod \"csi-snapshot-controller-operator-5f5d689c6b-4dqrh\" (UID: \"7c4e7e57-43be-4d31-b523-f7e4d316dce3\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh"
Mar 20 08:36:07.193393 master-0 kubenswrapper[3976]: I0320 08:36:07.193320 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nvl4\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-kube-api-access-9nvl4\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:36:07.194303 master-0 kubenswrapper[3976]: I0320 08:36:07.194258 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqdlf\" (UniqueName: \"kubernetes.io/projected/325f0a83-d56d-4b62-977b-088a7d5f0e00-kube-api-access-lqdlf\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj"
Mar 20 08:36:07.194360 master-0 kubenswrapper[3976]: I0320 08:36:07.194328 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7k8m\" (UniqueName: \"kubernetes.io/projected/86cb5d23-df7f-4f67-8086-1789d8e68544-kube-api-access-j7k8m\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742"
Mar 20 08:36:07.195375 master-0 kubenswrapper[3976]: I0320 08:36:07.195342 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmrxh\" (UniqueName: \"kubernetes.io/projected/ab175f7e-a5e8-4fda-98c9-6d052a221a83-kube-api-access-zmrxh\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:36:07.195548 master-0 kubenswrapper[3976]: I0320 08:36:07.195439 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmndg\" (UniqueName: \"kubernetes.io/projected/aa16c3bf-2350-46d1-afa0-9477b3ec8877-kube-api-access-qmndg\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h"
Mar 20 08:36:07.195725 master-0 kubenswrapper[3976]: I0320 08:36:07.195649 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:36:07.196229 master-0 kubenswrapper[3976]: I0320 08:36:07.196170 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s69rd\" (UniqueName: \"kubernetes.io/projected/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-kube-api-access-s69rd\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:36:07.197238 master-0 kubenswrapper[3976]: I0320 08:36:07.197177 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btdjt\" (UniqueName: \"kubernetes.io/projected/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-kube-api-access-btdjt\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5"
Mar 20 08:36:07.213342 master-0 kubenswrapper[3976]: I0320 08:36:07.213304 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2d4r\" (UniqueName: \"kubernetes.io/projected/f53bc282-5937-49ac-ac98-2ee37ccb268d-kube-api-access-b2d4r\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:07.224643 master-0 kubenswrapper[3976]: I0320 08:36:07.224608 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:36:07.232419 master-0 kubenswrapper[3976]: I0320 08:36:07.232348 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh"
Mar 20 08:36:07.233284 master-0 kubenswrapper[3976]: I0320 08:36:07.233067 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75r7k\" (UniqueName: \"kubernetes.io/projected/68252533-bd64-4fc5-838a-cc350cbe77f0-kube-api-access-75r7k\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6"
Mar 20 08:36:07.239570 master-0 kubenswrapper[3976]: I0320 08:36:07.238832 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5"
Mar 20 08:36:07.247405 master-0 kubenswrapper[3976]: I0320 08:36:07.247327 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:36:07.249095 master-0 kubenswrapper[3976]: I0320 08:36:07.249054 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:36:07.249143 master-0 kubenswrapper[3976]: I0320 08:36:07.249129 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d57k\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-kube-api-access-8d57k\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:36:07.249174 master-0 kubenswrapper[3976]: I0320 08:36:07.249158 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-bound-sa-token\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:36:07.249227 master-0 kubenswrapper[3976]: I0320 08:36:07.249202 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-224dg\" (UniqueName: \"kubernetes.io/projected/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-kube-api-access-224dg\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8"
Mar 20 08:36:07.249263 master-0 kubenswrapper[3976]: I0320 08:36:07.249225 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvx6w\" (UniqueName: \"kubernetes.io/projected/acb704a9-6c8d-4378-ae93-e7095b1fce85-kube-api-access-xvx6w\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:36:07.249263 master-0 kubenswrapper[3976]: I0320 08:36:07.249253 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28"
Mar 20 08:36:07.249318 master-0 kubenswrapper[3976]: I0320 08:36:07.249292 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:36:07.249379 master-0 kubenswrapper[3976]: I0320 08:36:07.249353 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:36:07.249426 master-0 kubenswrapper[3976]: I0320 08:36:07.249385 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:36:07.249426 master-0 kubenswrapper[3976]: I0320 08:36:07.249416 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrws5\" (UniqueName: \"kubernetes.io/projected/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-kube-api-access-wrws5\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:36:07.249485 master-0 kubenswrapper[3976]: I0320 08:36:07.249467 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l4b7\" (UniqueName: \"kubernetes.io/projected/ee3cc021-67d8-4b7f-b443-16f18228712e-kube-api-access-7l4b7\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv"
Mar 20 08:36:07.249518 master-0 kubenswrapper[3976]: I0320 08:36:07.249495 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/45e8b72b-564c-4bb1-b911-baff2d6c87ad-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"
Mar 20 08:36:07.249548 master-0 kubenswrapper[3976]: I0320 08:36:07.249524 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee3cc021-67d8-4b7f-b443-16f18228712e-host-slash\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv"
Mar 20 08:36:07.249581 master-0 kubenswrapper[3976]: I0320 08:36:07.249566 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-ca\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:07.249610 master-0 kubenswrapper[3976]: I0320 08:36:07.249595 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57854ac-809a-4745-aaa1-774f0a08a560-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7"
Mar 20 08:36:07.249640 master-0 kubenswrapper[3976]: I0320 08:36:07.249620 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-config\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:07.249678 master-0 kubenswrapper[3976]: I0320 08:36:07.249652 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"
Mar 20 08:36:07.249718 master-0 kubenswrapper[3976]: I0320 08:36:07.249689 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa759777-de22-4440-a3d3-ad429a3b8e7b-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4"
Mar 20 08:36:07.249718 master-0 kubenswrapper[3976]: I0320 08:36:07.249714 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa759777-de22-4440-a3d3-ad429a3b8e7b-config\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4"
Mar 20 08:36:07.249796 master-0 kubenswrapper[3976]: I0320 08:36:07.249739 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:07.249827 master-0 kubenswrapper[3976]: I0320 08:36:07.249804 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ee3cc021-67d8-4b7f-b443-16f18228712e-iptables-alerter-script\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv"
Mar 20 08:36:07.249866 master-0 kubenswrapper[3976]: I0320 08:36:07.249831 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57854ac-809a-4745-aaa1-774f0a08a560-config\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7"
Mar 20 08:36:07.249866 master-0 kubenswrapper[3976]: I0320 08:36:07.249858 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhkh7\" (UniqueName: \"kubernetes.io/projected/f046860d-2d54-4746-8ba2-f8e90fa55e38-kube-api-access-xhkh7\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:07.249947 master-0 kubenswrapper[3976]: I0320 08:36:07.249887 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-trusted-ca\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:36:07.249947 master-0 kubenswrapper[3976]: I0320 08:36:07.249914 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:36:07.250005 master-0 kubenswrapper[3976]: I0320 08:36:07.249985 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-serving-cert\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:07.250069 master-0 kubenswrapper[3976]: I0320 08:36:07.250045 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8"
Mar 20 08:36:07.250173 master-0 kubenswrapper[3976]: I0320 08:36:07.250080 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtcnq\" (UniqueName: \"kubernetes.io/projected/df428d5a-c722-4536-8e7f-cdd85c560481-kube-api-access-dtcnq\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28"
Mar 20 08:36:07.250173 master-0 kubenswrapper[3976]: I0320 08:36:07.250111 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-images\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:36:07.250173 master-0 kubenswrapper[3976]: I0320 08:36:07.250165 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zpkn\" (UniqueName: \"kubernetes.io/projected/45e8b72b-564c-4bb1-b911-baff2d6c87ad-kube-api-access-9zpkn\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"
Mar 20 08:36:07.250284 master-0 kubenswrapper[3976]: I0320 08:36:07.250224 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a57854ac-809a-4745-aaa1-774f0a08a560-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7"
Mar 20 08:36:07.250284 master-0 kubenswrapper[3976]: I0320 08:36:07.250279 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqd2v\" (UniqueName: \"kubernetes.io/projected/6a80bd6f-2263-4251-8197-5173193f8afc-kube-api-access-tqd2v\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"
Mar 20 08:36:07.250378 master-0 kubenswrapper[3976]: I0320 08:36:07.250354 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-client\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:07.250422 master-0 kubenswrapper[3976]: I0320 08:36:07.250400 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"
Mar 20 08:36:07.250453 master-0 kubenswrapper[3976]: I0320 08:36:07.250434 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa759777-de22-4440-a3d3-ad429a3b8e7b-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4"
Mar 20 08:36:07.250806 master-0 kubenswrapper[3976]: E0320 08:36:07.250730 3976 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found
Mar 20 08:36:07.250806 master-0 kubenswrapper[3976]: E0320 08:36:07.250790 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls podName:42df77ec-94aa-48ba-bb35-7b1f1e8b8e97 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:07.750770631 +0000 UTC m=+139.019593918 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls") pod "machine-config-operator-84d549f6d5-gm4qr" (UID: "42df77ec-94aa-48ba-bb35-7b1f1e8b8e97") : secret "mco-proxy-tls" not found
Mar 20 08:36:07.259231 master-0 kubenswrapper[3976]: E0320 08:36:07.258597 3976 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 20 08:36:07.259231 master-0 kubenswrapper[3976]: E0320 08:36:07.258695 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert podName:df428d5a-c722-4536-8e7f-cdd85c560481 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:07.758663321 +0000 UTC m=+139.027486608 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert") pod "catalog-operator-68f85b4d6c-fzm28" (UID: "df428d5a-c722-4536-8e7f-cdd85c560481") : secret "catalog-operator-serving-cert" not found
Mar 20 08:36:07.259231 master-0 kubenswrapper[3976]: I0320 08:36:07.259132 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57854ac-809a-4745-aaa1-774f0a08a560-config\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7"
Mar 20 08:36:07.260165 master-0 kubenswrapper[3976]: E0320 08:36:07.259663 3976 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 20 08:36:07.260165 master-0 kubenswrapper[3976]: E0320 08:36:07.259698 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls podName:ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:07.759688331 +0000 UTC m=+139.028511618 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls") pod "ingress-operator-66b84d69b-gzg9m" (UID: "ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02") : secret "metrics-tls" not found
Mar 20 08:36:07.260165 master-0 kubenswrapper[3976]: I0320 08:36:07.259854 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:36:07.261806 master-0 kubenswrapper[3976]: I0320 08:36:07.260356 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-images\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:36:07.261806 master-0 kubenswrapper[3976]: I0320 08:36:07.261319 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:07.262707 master-0 kubenswrapper[3976]: I0320 08:36:07.262659 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20
08:36:07.262768 master-0 kubenswrapper[3976]: I0320 08:36:07.262745 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/45e8b72b-564c-4bb1-b911-baff2d6c87ad-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:07.264763 master-0 kubenswrapper[3976]: I0320 08:36:07.264719 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-trusted-ca\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:07.265726 master-0 kubenswrapper[3976]: I0320 08:36:07.265654 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" Mar 20 08:36:07.266654 master-0 kubenswrapper[3976]: I0320 08:36:07.266502 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-serving-cert\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:07.271858 master-0 kubenswrapper[3976]: E0320 08:36:07.271792 3976 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 20 08:36:07.271987 master-0 kubenswrapper[3976]: E0320 08:36:07.271954 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs podName:6a80bd6f-2263-4251-8197-5173193f8afc nodeName:}" failed. 
No retries permitted until 2026-03-20 08:36:07.771896628 +0000 UTC m=+139.040719925 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-5rrrh" (UID: "6a80bd6f-2263-4251-8197-5173193f8afc") : secret "multus-admission-controller-secret" not found Mar 20 08:36:07.272143 master-0 kubenswrapper[3976]: E0320 08:36:07.272112 3976 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 20 08:36:07.272203 master-0 kubenswrapper[3976]: E0320 08:36:07.272193 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics podName:acb704a9-6c8d-4378-ae93-e7095b1fce85 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:07.772163985 +0000 UTC m=+139.040987272 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-mvn4t" (UID: "acb704a9-6c8d-4378-ae93-e7095b1fce85") : secret "marketplace-operator-metrics" not found Mar 20 08:36:07.272772 master-0 kubenswrapper[3976]: I0320 08:36:07.272726 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee3cc021-67d8-4b7f-b443-16f18228712e-host-slash\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:36:07.273373 master-0 kubenswrapper[3976]: E0320 08:36:07.273265 3976 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:07.273373 master-0 kubenswrapper[3976]: E0320 08:36:07.273338 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls podName:45e8b72b-564c-4bb1-b911-baff2d6c87ad nodeName:}" failed. No retries permitted until 2026-03-20 08:36:07.773310509 +0000 UTC m=+139.042133806 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-nljsr" (UID: "45e8b72b-564c-4bb1-b911-baff2d6c87ad") : secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:07.273760 master-0 kubenswrapper[3976]: I0320 08:36:07.273727 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clz6w\" (UniqueName: \"kubernetes.io/projected/75e3e2cc-aa56-41f3-8859-1c086f419d05-kube-api-access-clz6w\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:07.276821 master-0 kubenswrapper[3976]: I0320 08:36:07.276742 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ee3cc021-67d8-4b7f-b443-16f18228712e-iptables-alerter-script\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:36:07.276821 master-0 kubenswrapper[3976]: I0320 08:36:07.276803 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-ca\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:07.276920 master-0 kubenswrapper[3976]: I0320 08:36:07.276896 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa759777-de22-4440-a3d3-ad429a3b8e7b-config\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:36:07.277024 master-0 kubenswrapper[3976]: I0320 08:36:07.276968 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-config\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:07.277333 master-0 kubenswrapper[3976]: I0320 08:36:07.277294 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-client\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:07.277622 master-0 kubenswrapper[3976]: I0320 08:36:07.277567 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57854ac-809a-4745-aaa1-774f0a08a560-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:36:07.277683 master-0 kubenswrapper[3976]: E0320 08:36:07.277620 3976 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 20 08:36:07.277726 master-0 kubenswrapper[3976]: E0320 08:36:07.277684 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls podName:3f471ecc-922c-4cb1-9bdd-fdb5da08c592 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:07.777659116 +0000 UTC m=+139.046482633 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls") pod "dns-operator-9c5679d8f-r6dm8" (UID: "3f471ecc-922c-4cb1-9bdd-fdb5da08c592") : secret "metrics-tls" not found Mar 20 08:36:07.277940 master-0 kubenswrapper[3976]: I0320 08:36:07.277881 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27qvw\" (UniqueName: \"kubernetes.io/projected/bbc0b783-28d5-4554-b49d-c66082546f44-kube-api-access-27qvw\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:07.281628 master-0 kubenswrapper[3976]: I0320 08:36:07.281586 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa759777-de22-4440-a3d3-ad429a3b8e7b-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:36:07.319545 master-0 kubenswrapper[3976]: I0320 08:36:07.319302 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa759777-de22-4440-a3d3-ad429a3b8e7b-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:36:07.319545 master-0 kubenswrapper[3976]: I0320 08:36:07.319319 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:36:07.346501 master-0 kubenswrapper[3976]: I0320 08:36:07.346436 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d57k\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-kube-api-access-8d57k\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:07.372034 master-0 kubenswrapper[3976]: I0320 08:36:07.371990 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtcnq\" (UniqueName: \"kubernetes.io/projected/df428d5a-c722-4536-8e7f-cdd85c560481-kube-api-access-dtcnq\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:07.377658 master-0 kubenswrapper[3976]: I0320 08:36:07.377575 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqd2v\" (UniqueName: \"kubernetes.io/projected/6a80bd6f-2263-4251-8197-5173193f8afc-kube-api-access-tqd2v\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:07.396126 master-0 kubenswrapper[3976]: I0320 08:36:07.396037 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-bound-sa-token\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:07.406941 master-0 kubenswrapper[3976]: I0320 08:36:07.406768 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:07.437545 master-0 kubenswrapper[3976]: I0320 08:36:07.415845 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:07.437545 master-0 kubenswrapper[3976]: I0320 08:36:07.425759 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" Mar 20 08:36:07.471034 master-0 kubenswrapper[3976]: I0320 08:36:07.441039 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zpkn\" (UniqueName: \"kubernetes.io/projected/45e8b72b-564c-4bb1-b911-baff2d6c87ad-kube-api-access-9zpkn\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:07.471034 master-0 kubenswrapper[3976]: I0320 08:36:07.445621 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvx6w\" (UniqueName: \"kubernetes.io/projected/acb704a9-6c8d-4378-ae93-e7095b1fce85-kube-api-access-xvx6w\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:07.471034 master-0 kubenswrapper[3976]: I0320 08:36:07.448062 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:36:07.481378 master-0 kubenswrapper[3976]: I0320 08:36:07.481307 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l4b7\" (UniqueName: \"kubernetes.io/projected/ee3cc021-67d8-4b7f-b443-16f18228712e-kube-api-access-7l4b7\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:36:07.493039 master-0 kubenswrapper[3976]: I0320 08:36:07.486390 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a57854ac-809a-4745-aaa1-774f0a08a560-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:36:07.506389 master-0 kubenswrapper[3976]: I0320 08:36:07.506314 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhkh7\" (UniqueName: \"kubernetes.io/projected/f046860d-2d54-4746-8ba2-f8e90fa55e38-kube-api-access-xhkh7\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:07.512404 master-0 kubenswrapper[3976]: I0320 08:36:07.511263 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:36:07.540265 master-0 kubenswrapper[3976]: I0320 08:36:07.539493 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrws5\" (UniqueName: \"kubernetes.io/projected/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-kube-api-access-wrws5\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:07.543960 master-0 kubenswrapper[3976]: I0320 08:36:07.543924 3976 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-224dg\" (UniqueName: \"kubernetes.io/projected/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-kube-api-access-224dg\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:07.550434 master-0 kubenswrapper[3976]: I0320 08:36:07.550385 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh"] Mar 20 08:36:07.562986 master-0 kubenswrapper[3976]: I0320 08:36:07.562943 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"] Mar 20 08:36:07.581295 master-0 kubenswrapper[3976]: I0320 08:36:07.581246 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:36:07.604399 master-0 kubenswrapper[3976]: I0320 08:36:07.604105 3976 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:07.612329 master-0 kubenswrapper[3976]: I0320 08:36:07.611179 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5"] Mar 20 08:36:07.615223 master-0 kubenswrapper[3976]: I0320 08:36:07.614570 3976 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:36:07.621959 master-0 kubenswrapper[3976]: W0320 08:36:07.621525 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1854ea4_c8e2_4289_84b6_1f18b2ac684f.slice/crio-b77a1ff885b1de50033d17a32311fabaefba7e39efce419b16febb35e5db2498 WatchSource:0}: Error finding container b77a1ff885b1de50033d17a32311fabaefba7e39efce419b16febb35e5db2498: Status 404 returned error can't find the container with id b77a1ff885b1de50033d17a32311fabaefba7e39efce419b16febb35e5db2498 Mar 20 08:36:07.637451 master-0 kubenswrapper[3976]: I0320 08:36:07.637358 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h"] Mar 20 08:36:07.669357 master-0 kubenswrapper[3976]: I0320 08:36:07.668852 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"] Mar 20 08:36:07.680253 master-0 kubenswrapper[3976]: I0320 08:36:07.680212 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:07.680347 master-0 kubenswrapper[3976]: I0320 08:36:07.680273 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:07.680347 master-0 kubenswrapper[3976]: I0320 08:36:07.680326 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:07.680415 master-0 kubenswrapper[3976]: I0320 08:36:07.680393 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:07.680446 master-0 kubenswrapper[3976]: I0320 08:36:07.680424 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:07.680475 master-0 kubenswrapper[3976]: I0320 
08:36:07.680457 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:07.680592 master-0 kubenswrapper[3976]: I0320 08:36:07.680562 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:07.680767 master-0 kubenswrapper[3976]: E0320 08:36:07.680744 3976 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:07.680834 master-0 kubenswrapper[3976]: E0320 08:36:07.680818 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.680796151 +0000 UTC m=+139.949619438 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:07.681375 master-0 kubenswrapper[3976]: E0320 08:36:07.681348 3976 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 20 08:36:07.681447 master-0 kubenswrapper[3976]: E0320 08:36:07.681395 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.681381018 +0000 UTC m=+139.950204305 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "performance-addon-operator-webhook-cert" not found Mar 20 08:36:07.681543 master-0 kubenswrapper[3976]: E0320 08:36:07.681468 3976 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 20 08:36:07.681543 master-0 kubenswrapper[3976]: E0320 08:36:07.681497 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.681489381 +0000 UTC m=+139.950312668 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-operator-tls" not found Mar 20 08:36:07.681661 master-0 kubenswrapper[3976]: E0320 08:36:07.681552 3976 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 20 08:36:07.681661 master-0 kubenswrapper[3976]: E0320 08:36:07.681585 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert podName:7b489385-2c96-4a97-8b31-362162de020e nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.681573483 +0000 UTC m=+139.950396770 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert") pod "olm-operator-5c9796789-tjm9l" (UID: "7b489385-2c96-4a97-8b31-362162de020e") : secret "olm-operator-serving-cert" not found Mar 20 08:36:07.681753 master-0 kubenswrapper[3976]: E0320 08:36:07.681706 3976 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 20 08:36:07.681791 master-0 kubenswrapper[3976]: E0320 08:36:07.681771 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.681761469 +0000 UTC m=+139.950584756 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "node-tuning-operator-tls" not found Mar 20 08:36:07.681845 master-0 kubenswrapper[3976]: E0320 08:36:07.681829 3976 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 20 08:36:07.681892 master-0 kubenswrapper[3976]: E0320 08:36:07.681859 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls podName:b4291bfd-53d9-4c78-b7cb-d7eb46560528 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.681851731 +0000 UTC m=+139.950675018 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-fmhbq" (UID: "b4291bfd-53d9-4c78-b7cb-d7eb46560528") : secret "image-registry-operator-tls" not found Mar 20 08:36:07.681962 master-0 kubenswrapper[3976]: E0320 08:36:07.681917 3976 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 20 08:36:07.682044 master-0 kubenswrapper[3976]: E0320 08:36:07.681978 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert podName:bbc0b783-28d5-4554-b49d-c66082546f44 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.681969945 +0000 UTC m=+139.950793232 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-2pg77" (UID: "bbc0b783-28d5-4554-b49d-c66082546f44") : secret "package-server-manager-serving-cert" not found Mar 20 08:36:07.757686 master-0 kubenswrapper[3976]: I0320 08:36:07.756329 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742"] Mar 20 08:36:07.760901 master-0 kubenswrapper[3976]: W0320 08:36:07.760676 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86cb5d23_df7f_4f67_8086_1789d8e68544.slice/crio-dc885deb2f8a42b2a9647ca9a7a52b0dd31aa93757cc5e47ba821f5cab2f46b8 WatchSource:0}: Error finding container dc885deb2f8a42b2a9647ca9a7a52b0dd31aa93757cc5e47ba821f5cab2f46b8: Status 404 returned error can't find the container with id dc885deb2f8a42b2a9647ca9a7a52b0dd31aa93757cc5e47ba821f5cab2f46b8 Mar 20 08:36:07.762679 master-0 kubenswrapper[3976]: I0320 08:36:07.762311 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h"] Mar 20 08:36:07.781490 master-0 kubenswrapper[3976]: I0320 08:36:07.781438 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:07.781640 master-0 kubenswrapper[3976]: I0320 08:36:07.781532 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod 
\"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:07.781640 master-0 kubenswrapper[3976]: I0320 08:36:07.781564 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:07.781640 master-0 kubenswrapper[3976]: I0320 08:36:07.781586 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:07.781640 master-0 kubenswrapper[3976]: I0320 08:36:07.781608 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:07.781640 master-0 kubenswrapper[3976]: I0320 08:36:07.781643 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:07.781798 master-0 kubenswrapper[3976]: I0320 08:36:07.781680 3976 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:07.782014 master-0 kubenswrapper[3976]: E0320 08:36:07.781844 3976 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:07.782014 master-0 kubenswrapper[3976]: E0320 08:36:07.781901 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls podName:45e8b72b-564c-4bb1-b911-baff2d6c87ad nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.781884021 +0000 UTC m=+140.050707308 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-nljsr" (UID: "45e8b72b-564c-4bb1-b911-baff2d6c87ad") : secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:07.782014 master-0 kubenswrapper[3976]: E0320 08:36:07.781950 3976 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 20 08:36:07.782014 master-0 kubenswrapper[3976]: E0320 08:36:07.781970 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls podName:3f471ecc-922c-4cb1-9bdd-fdb5da08c592 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.781963443 +0000 UTC m=+140.050786730 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls") pod "dns-operator-9c5679d8f-r6dm8" (UID: "3f471ecc-922c-4cb1-9bdd-fdb5da08c592") : secret "metrics-tls" not found Mar 20 08:36:07.782014 master-0 kubenswrapper[3976]: E0320 08:36:07.782007 3976 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 20 08:36:07.782014 master-0 kubenswrapper[3976]: E0320 08:36:07.782025 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs podName:6a80bd6f-2263-4251-8197-5173193f8afc nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.782019575 +0000 UTC m=+140.050842852 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-5rrrh" (UID: "6a80bd6f-2263-4251-8197-5173193f8afc") : secret "multus-admission-controller-secret" not found Mar 20 08:36:07.782241 master-0 kubenswrapper[3976]: E0320 08:36:07.782065 3976 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 20 08:36:07.782241 master-0 kubenswrapper[3976]: E0320 08:36:07.782082 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls podName:42df77ec-94aa-48ba-bb35-7b1f1e8b8e97 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.782076386 +0000 UTC m=+140.050899673 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls") pod "machine-config-operator-84d549f6d5-gm4qr" (UID: "42df77ec-94aa-48ba-bb35-7b1f1e8b8e97") : secret "mco-proxy-tls" not found Mar 20 08:36:07.782241 master-0 kubenswrapper[3976]: E0320 08:36:07.782125 3976 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 20 08:36:07.782241 master-0 kubenswrapper[3976]: E0320 08:36:07.782150 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls podName:ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.782142948 +0000 UTC m=+140.050966235 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls") pod "ingress-operator-66b84d69b-gzg9m" (UID: "ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02") : secret "metrics-tls" not found Mar 20 08:36:07.782241 master-0 kubenswrapper[3976]: E0320 08:36:07.782206 3976 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 20 08:36:07.782241 master-0 kubenswrapper[3976]: E0320 08:36:07.782227 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert podName:df428d5a-c722-4536-8e7f-cdd85c560481 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.78222082 +0000 UTC m=+140.051044107 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert") pod "catalog-operator-68f85b4d6c-fzm28" (UID: "df428d5a-c722-4536-8e7f-cdd85c560481") : secret "catalog-operator-serving-cert" not found Mar 20 08:36:07.782551 master-0 kubenswrapper[3976]: E0320 08:36:07.782265 3976 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 20 08:36:07.782551 master-0 kubenswrapper[3976]: E0320 08:36:07.782283 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics podName:acb704a9-6c8d-4378-ae93-e7095b1fce85 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:08.782278052 +0000 UTC m=+140.051101339 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-mvn4t" (UID: "acb704a9-6c8d-4378-ae93-e7095b1fce85") : secret "marketplace-operator-metrics" not found Mar 20 08:36:07.894698 master-0 kubenswrapper[3976]: I0320 08:36:07.894610 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7"] Mar 20 08:36:07.895569 master-0 kubenswrapper[3976]: I0320 08:36:07.895535 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4"] Mar 20 08:36:07.897839 master-0 kubenswrapper[3976]: W0320 08:36:07.897815 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa759777_de22_4440_a3d3_ad429a3b8e7b.slice/crio-46a12190f11c7c4d27d18246d17e718f0fd8c23ea7d374844186aa5295484abf WatchSource:0}: Error 
finding container 46a12190f11c7c4d27d18246d17e718f0fd8c23ea7d374844186aa5295484abf: Status 404 returned error can't find the container with id 46a12190f11c7c4d27d18246d17e718f0fd8c23ea7d374844186aa5295484abf Mar 20 08:36:07.898622 master-0 kubenswrapper[3976]: W0320 08:36:07.898587 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda57854ac_809a_4745_aaa1_774f0a08a560.slice/crio-4d1e15f7704367048583514d8253b668db81d2da2b51801dfb366fd0e7170679 WatchSource:0}: Error finding container 4d1e15f7704367048583514d8253b668db81d2da2b51801dfb366fd0e7170679: Status 404 returned error can't find the container with id 4d1e15f7704367048583514d8253b668db81d2da2b51801dfb366fd0e7170679 Mar 20 08:36:07.929704 master-0 kubenswrapper[3976]: I0320 08:36:07.929641 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"] Mar 20 08:36:07.936690 master-0 kubenswrapper[3976]: W0320 08:36:07.936645 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf046860d_2d54_4746_8ba2_f8e90fa55e38.slice/crio-16961d83ade56434cc5d6847f42954f60a19529cc2fd922d6eec9e1e749ea461 WatchSource:0}: Error finding container 16961d83ade56434cc5d6847f42954f60a19529cc2fd922d6eec9e1e749ea461: Status 404 returned error can't find the container with id 16961d83ade56434cc5d6847f42954f60a19529cc2fd922d6eec9e1e749ea461 Mar 20 08:36:08.036592 master-0 kubenswrapper[3976]: I0320 08:36:08.036508 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6"] Mar 20 08:36:08.039001 master-0 kubenswrapper[3976]: I0320 08:36:08.038965 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"] Mar 20 08:36:08.047952 master-0 kubenswrapper[3976]: 
W0320 08:36:08.047510 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29b5b089_fb1d_46a1_bd67_2e0ba03c76a6.slice/crio-51fe4cded0c2312a6b3bfd1f48ff9089513eaa05b60208bd1ced0f39d8ffe36e WatchSource:0}: Error finding container 51fe4cded0c2312a6b3bfd1f48ff9089513eaa05b60208bd1ced0f39d8ffe36e: Status 404 returned error can't find the container with id 51fe4cded0c2312a6b3bfd1f48ff9089513eaa05b60208bd1ced0f39d8ffe36e Mar 20 08:36:08.050411 master-0 kubenswrapper[3976]: W0320 08:36:08.050372 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68252533_bd64_4fc5_838a_cc350cbe77f0.slice/crio-9af8ad1671806bd505a390180b2388970cc534101cf6a5cf64e76e13311c0cb9 WatchSource:0}: Error finding container 9af8ad1671806bd505a390180b2388970cc534101cf6a5cf64e76e13311c0cb9: Status 404 returned error can't find the container with id 9af8ad1671806bd505a390180b2388970cc534101cf6a5cf64e76e13311c0cb9 Mar 20 08:36:08.051940 master-0 kubenswrapper[3976]: I0320 08:36:08.051915 3976 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj"] Mar 20 08:36:08.057580 master-0 kubenswrapper[3976]: W0320 08:36:08.057544 3976 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod325f0a83_d56d_4b62_977b_088a7d5f0e00.slice/crio-f826050a5c784de5b331265098c5bf62987ce32e6de55cf05cab0f3f76894ea8 WatchSource:0}: Error finding container f826050a5c784de5b331265098c5bf62987ce32e6de55cf05cab0f3f76894ea8: Status 404 returned error can't find the container with id f826050a5c784de5b331265098c5bf62987ce32e6de55cf05cab0f3f76894ea8 Mar 20 08:36:08.062166 master-0 kubenswrapper[3976]: E0320 08:36:08.062110 3976 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openshift-apiserver-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e,Command:[cluster-openshift-apiserver-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:KUBE_APISERVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lqdlf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-apiserver-operator-d65958b8-th2vj_openshift-apiserver-operator(325f0a83-d56d-4b62-977b-088a7d5f0e00): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 08:36:08.063468 master-0 kubenswrapper[3976]: E0320 08:36:08.063424 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" podUID="325f0a83-d56d-4b62-977b-088a7d5f0e00" Mar 20 08:36:08.418017 master-0 kubenswrapper[3976]: I0320 08:36:08.417964 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh" 
event={"ID":"7c4e7e57-43be-4d31-b523-f7e4d316dce3","Type":"ContainerStarted","Data":"9a8426b4146cf2f000b30dafab20a28003c24ae65a16f62f890c575d2f9770d9"} Mar 20 08:36:08.419513 master-0 kubenswrapper[3976]: I0320 08:36:08.419487 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" event={"ID":"fa759777-de22-4440-a3d3-ad429a3b8e7b","Type":"ContainerStarted","Data":"46a12190f11c7c4d27d18246d17e718f0fd8c23ea7d374844186aa5295484abf"} Mar 20 08:36:08.422988 master-0 kubenswrapper[3976]: I0320 08:36:08.422959 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" event={"ID":"c2a23d24-9e09-431e-8c3b-8456ff51a8d0","Type":"ContainerStarted","Data":"919c09e620d76c77a9425f79c1d3a73bcd978ef6fd46e99de901376458ec1273"} Mar 20 08:36:08.428272 master-0 kubenswrapper[3976]: I0320 08:36:08.428219 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" event={"ID":"86cb5d23-df7f-4f67-8086-1789d8e68544","Type":"ContainerStarted","Data":"dc885deb2f8a42b2a9647ca9a7a52b0dd31aa93757cc5e47ba821f5cab2f46b8"} Mar 20 08:36:08.433059 master-0 kubenswrapper[3976]: I0320 08:36:08.433029 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" event={"ID":"68252533-bd64-4fc5-838a-cc350cbe77f0","Type":"ContainerStarted","Data":"9af8ad1671806bd505a390180b2388970cc534101cf6a5cf64e76e13311c0cb9"} Mar 20 08:36:08.434077 master-0 kubenswrapper[3976]: I0320 08:36:08.434051 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" 
event={"ID":"f046860d-2d54-4746-8ba2-f8e90fa55e38","Type":"ContainerStarted","Data":"16961d83ade56434cc5d6847f42954f60a19529cc2fd922d6eec9e1e749ea461"} Mar 20 08:36:08.435905 master-0 kubenswrapper[3976]: I0320 08:36:08.435865 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" event={"ID":"ab175f7e-a5e8-4fda-98c9-6d052a221a83","Type":"ContainerStarted","Data":"5d21afac0935094080df835d139dd20efdf51fad3502782c60bb38ee7294f13b"} Mar 20 08:36:08.437593 master-0 kubenswrapper[3976]: I0320 08:36:08.437539 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dd9wv" event={"ID":"ee3cc021-67d8-4b7f-b443-16f18228712e","Type":"ContainerStarted","Data":"49cf279e25f73f2651832183fcd1e966f6a074b295d2f5e34a59a5be738d2377"} Mar 20 08:36:08.439216 master-0 kubenswrapper[3976]: I0320 08:36:08.439166 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" event={"ID":"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6","Type":"ContainerStarted","Data":"51fe4cded0c2312a6b3bfd1f48ff9089513eaa05b60208bd1ced0f39d8ffe36e"} Mar 20 08:36:08.440986 master-0 kubenswrapper[3976]: I0320 08:36:08.440925 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" event={"ID":"aa16c3bf-2350-46d1-afa0-9477b3ec8877","Type":"ContainerStarted","Data":"cadfc06b46a2370e89939aa270eb81d7b99f5548dca07c84a3c027dea72e913b"} Mar 20 08:36:08.442087 master-0 kubenswrapper[3976]: I0320 08:36:08.442040 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" event={"ID":"325f0a83-d56d-4b62-977b-088a7d5f0e00","Type":"ContainerStarted","Data":"f826050a5c784de5b331265098c5bf62987ce32e6de55cf05cab0f3f76894ea8"} Mar 20 08:36:08.445156 master-0 kubenswrapper[3976]: E0320 
08:36:08.443675 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" podUID="325f0a83-d56d-4b62-977b-088a7d5f0e00" Mar 20 08:36:08.447510 master-0 kubenswrapper[3976]: I0320 08:36:08.447471 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" event={"ID":"75e3e2cc-aa56-41f3-8859-1c086f419d05","Type":"ContainerStarted","Data":"df43cdf08fb65d06b9db1ef59770a000ce2240e1f2e40f1d6a1619ba0e8b03df"} Mar 20 08:36:08.450332 master-0 kubenswrapper[3976]: I0320 08:36:08.450143 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" event={"ID":"c1854ea4-c8e2-4289-84b6-1f18b2ac684f","Type":"ContainerStarted","Data":"f44e15f6f5dbc67eb3bd400bcba69ac1adc2aa6d2546e0cbc2778889121d3bcf"} Mar 20 08:36:08.450332 master-0 kubenswrapper[3976]: I0320 08:36:08.450231 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" event={"ID":"c1854ea4-c8e2-4289-84b6-1f18b2ac684f","Type":"ContainerStarted","Data":"b77a1ff885b1de50033d17a32311fabaefba7e39efce419b16febb35e5db2498"} Mar 20 08:36:08.452608 master-0 kubenswrapper[3976]: I0320 08:36:08.452562 3976 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" event={"ID":"a57854ac-809a-4745-aaa1-774f0a08a560","Type":"ContainerStarted","Data":"4d1e15f7704367048583514d8253b668db81d2da2b51801dfb366fd0e7170679"} Mar 20 08:36:08.480134 master-0 kubenswrapper[3976]: I0320 08:36:08.480039 3976 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" podStartSLOduration=104.480008934 podStartE2EDuration="1m44.480008934s" podCreationTimestamp="2026-03-20 08:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:36:08.47917626 +0000 UTC m=+139.747999547" watchObservedRunningTime="2026-03-20 08:36:08.480008934 +0000 UTC m=+139.748832221" Mar 20 08:36:08.709568 master-0 kubenswrapper[3976]: I0320 08:36:08.709368 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:08.709568 master-0 kubenswrapper[3976]: I0320 08:36:08.709429 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:08.709568 master-0 kubenswrapper[3976]: I0320 08:36:08.709480 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:08.709568 master-0 kubenswrapper[3976]: I0320 08:36:08.709510 3976 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:08.709568 master-0 kubenswrapper[3976]: I0320 08:36:08.709532 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:08.709568 master-0 kubenswrapper[3976]: I0320 08:36:08.709558 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:08.709970 master-0 kubenswrapper[3976]: I0320 08:36:08.709614 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:08.709970 master-0 kubenswrapper[3976]: E0320 08:36:08.709779 3976 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 
20 08:36:08.709970 master-0 kubenswrapper[3976]: E0320 08:36:08.709845 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.709823551 +0000 UTC m=+141.978646838 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:08.710360 master-0 kubenswrapper[3976]: E0320 08:36:08.710280 3976 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 20 08:36:08.710360 master-0 kubenswrapper[3976]: E0320 08:36:08.710310 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.710301825 +0000 UTC m=+141.979125102 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "performance-addon-operator-webhook-cert" not found Mar 20 08:36:08.710360 master-0 kubenswrapper[3976]: E0320 08:36:08.710350 3976 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 20 08:36:08.710462 master-0 kubenswrapper[3976]: E0320 08:36:08.710370 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.710364116 +0000 UTC m=+141.979187393 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-operator-tls" not found Mar 20 08:36:08.710462 master-0 kubenswrapper[3976]: E0320 08:36:08.710408 3976 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 20 08:36:08.710462 master-0 kubenswrapper[3976]: E0320 08:36:08.710427 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert podName:7b489385-2c96-4a97-8b31-362162de020e nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.710421438 +0000 UTC m=+141.979244725 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert") pod "olm-operator-5c9796789-tjm9l" (UID: "7b489385-2c96-4a97-8b31-362162de020e") : secret "olm-operator-serving-cert" not found Mar 20 08:36:08.710701 master-0 kubenswrapper[3976]: E0320 08:36:08.710611 3976 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 20 08:36:08.710796 master-0 kubenswrapper[3976]: E0320 08:36:08.710735 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls podName:b4291bfd-53d9-4c78-b7cb-d7eb46560528 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.710700086 +0000 UTC m=+141.979523373 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-fmhbq" (UID: "b4291bfd-53d9-4c78-b7cb-d7eb46560528") : secret "image-registry-operator-tls" not found Mar 20 08:36:08.710796 master-0 kubenswrapper[3976]: E0320 08:36:08.710727 3976 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 20 08:36:08.710796 master-0 kubenswrapper[3976]: E0320 08:36:08.710771 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert podName:bbc0b783-28d5-4554-b49d-c66082546f44 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.710765278 +0000 UTC m=+141.979588565 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-2pg77" (UID: "bbc0b783-28d5-4554-b49d-c66082546f44") : secret "package-server-manager-serving-cert" not found Mar 20 08:36:08.710913 master-0 kubenswrapper[3976]: E0320 08:36:08.710793 3976 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 20 08:36:08.710913 master-0 kubenswrapper[3976]: E0320 08:36:08.710906 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.710877141 +0000 UTC m=+141.979700428 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "node-tuning-operator-tls" not found Mar 20 08:36:08.810771 master-0 kubenswrapper[3976]: I0320 08:36:08.810717 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:08.810771 master-0 kubenswrapper[3976]: I0320 08:36:08.810766 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: 
\"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: E0320 08:36:08.811228 3976 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: E0320 08:36:08.811270 3976 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: E0320 08:36:08.811293 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert podName:df428d5a-c722-4536-8e7f-cdd85c560481 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.811277931 +0000 UTC m=+142.080101208 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert") pod "catalog-operator-68f85b4d6c-fzm28" (UID: "df428d5a-c722-4536-8e7f-cdd85c560481") : secret "catalog-operator-serving-cert" not found Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: E0320 08:36:08.811435 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls podName:ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.811401285 +0000 UTC m=+142.080224572 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls") pod "ingress-operator-66b84d69b-gzg9m" (UID: "ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02") : secret "metrics-tls" not found Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: I0320 08:36:08.811520 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: I0320 08:36:08.811644 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: E0320 08:36:08.811698 3976 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: E0320 08:36:08.811727 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics podName:acb704a9-6c8d-4378-ae93-e7095b1fce85 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.811719124 +0000 UTC m=+142.080542411 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-mvn4t" (UID: "acb704a9-6c8d-4378-ae93-e7095b1fce85") : secret "marketplace-operator-metrics" not found Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: I0320 08:36:08.811753 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: I0320 08:36:08.811862 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: I0320 08:36:08.811912 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: E0320 08:36:08.812015 3976 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: E0320 08:36:08.812048 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls 
podName:42df77ec-94aa-48ba-bb35-7b1f1e8b8e97 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.812039984 +0000 UTC m=+142.080863271 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls") pod "machine-config-operator-84d549f6d5-gm4qr" (UID: "42df77ec-94aa-48ba-bb35-7b1f1e8b8e97") : secret "mco-proxy-tls" not found Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: E0320 08:36:08.812113 3976 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: E0320 08:36:08.812141 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls podName:45e8b72b-564c-4bb1-b911-baff2d6c87ad nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.812132836 +0000 UTC m=+142.080956323 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-nljsr" (UID: "45e8b72b-564c-4bb1-b911-baff2d6c87ad") : secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:08.812283 master-0 kubenswrapper[3976]: E0320 08:36:08.812215 3976 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 20 08:36:08.813514 master-0 kubenswrapper[3976]: E0320 08:36:08.812236 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls podName:3f471ecc-922c-4cb1-9bdd-fdb5da08c592 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.812230559 +0000 UTC m=+142.081053846 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls") pod "dns-operator-9c5679d8f-r6dm8" (UID: "3f471ecc-922c-4cb1-9bdd-fdb5da08c592") : secret "metrics-tls" not found Mar 20 08:36:08.813514 master-0 kubenswrapper[3976]: E0320 08:36:08.812273 3976 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 20 08:36:08.813514 master-0 kubenswrapper[3976]: E0320 08:36:08.812293 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs podName:6a80bd6f-2263-4251-8197-5173193f8afc nodeName:}" failed. No retries permitted until 2026-03-20 08:36:10.812287061 +0000 UTC m=+142.081110338 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-5rrrh" (UID: "6a80bd6f-2263-4251-8197-5173193f8afc") : secret "multus-admission-controller-secret" not found Mar 20 08:36:09.912273 master-0 kubenswrapper[3976]: E0320 08:36:09.911947 3976 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" podUID="325f0a83-d56d-4b62-977b-088a7d5f0e00" Mar 20 08:36:10.808363 master-0 kubenswrapper[3976]: I0320 08:36:10.808100 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: 
\"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:10.808363 master-0 kubenswrapper[3976]: I0320 08:36:10.808175 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:10.808608 master-0 kubenswrapper[3976]: I0320 08:36:10.808373 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:10.808608 master-0 kubenswrapper[3976]: E0320 08:36:10.808389 3976 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 20 08:36:10.808608 master-0 kubenswrapper[3976]: I0320 08:36:10.808418 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:10.808608 master-0 kubenswrapper[3976]: E0320 08:36:10.808514 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert 
podName:7b489385-2c96-4a97-8b31-362162de020e nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.808487405 +0000 UTC m=+146.077310692 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert") pod "olm-operator-5c9796789-tjm9l" (UID: "7b489385-2c96-4a97-8b31-362162de020e") : secret "olm-operator-serving-cert" not found Mar 20 08:36:10.808727 master-0 kubenswrapper[3976]: E0320 08:36:10.808652 3976 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 20 08:36:10.808951 master-0 kubenswrapper[3976]: E0320 08:36:10.808776 3976 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 20 08:36:10.808951 master-0 kubenswrapper[3976]: E0320 08:36:10.808807 3976 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 20 08:36:10.808951 master-0 kubenswrapper[3976]: E0320 08:36:10.808781 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls podName:b4291bfd-53d9-4c78-b7cb-d7eb46560528 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.808747163 +0000 UTC m=+146.077570640 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-fmhbq" (UID: "b4291bfd-53d9-4c78-b7cb-d7eb46560528") : secret "image-registry-operator-tls" not found Mar 20 08:36:10.809046 master-0 kubenswrapper[3976]: I0320 08:36:10.808984 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:10.809046 master-0 kubenswrapper[3976]: E0320 08:36:10.809027 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert podName:bbc0b783-28d5-4554-b49d-c66082546f44 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.809014711 +0000 UTC m=+146.077837998 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-2pg77" (UID: "bbc0b783-28d5-4554-b49d-c66082546f44") : secret "package-server-manager-serving-cert" not found Mar 20 08:36:10.809117 master-0 kubenswrapper[3976]: E0320 08:36:10.809056 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.809048312 +0000 UTC m=+146.077871599 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "node-tuning-operator-tls" not found Mar 20 08:36:10.809117 master-0 kubenswrapper[3976]: E0320 08:36:10.809089 3976 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:10.809340 master-0 kubenswrapper[3976]: I0320 08:36:10.809146 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:10.809340 master-0 kubenswrapper[3976]: E0320 08:36:10.809204 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.809174635 +0000 UTC m=+146.077997922 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:10.809340 master-0 kubenswrapper[3976]: I0320 08:36:10.809323 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:10.809432 master-0 kubenswrapper[3976]: E0320 08:36:10.809385 3976 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 20 08:36:10.809592 master-0 kubenswrapper[3976]: E0320 08:36:10.809457 3976 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 20 08:36:10.809592 master-0 kubenswrapper[3976]: E0320 08:36:10.809519 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.809487614 +0000 UTC m=+146.078310901 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "performance-addon-operator-webhook-cert" not found Mar 20 08:36:10.809592 master-0 kubenswrapper[3976]: E0320 08:36:10.809542 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.809532636 +0000 UTC m=+146.078355913 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-operator-tls" not found Mar 20 08:36:10.910259 master-0 kubenswrapper[3976]: I0320 08:36:10.910202 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:10.910486 master-0 kubenswrapper[3976]: E0320 08:36:10.910440 3976 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:10.910534 master-0 kubenswrapper[3976]: I0320 08:36:10.910516 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") 
pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:10.910577 master-0 kubenswrapper[3976]: E0320 08:36:10.910557 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls podName:45e8b72b-564c-4bb1-b911-baff2d6c87ad nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.910528895 +0000 UTC m=+146.179352182 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-nljsr" (UID: "45e8b72b-564c-4bb1-b911-baff2d6c87ad") : secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:10.910674 master-0 kubenswrapper[3976]: E0320 08:36:10.910652 3976 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 20 08:36:10.910740 master-0 kubenswrapper[3976]: I0320 08:36:10.910705 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:10.910780 master-0 kubenswrapper[3976]: E0320 08:36:10.910718 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls podName:3f471ecc-922c-4cb1-9bdd-fdb5da08c592 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.91069897 +0000 UTC m=+146.179522257 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls") pod "dns-operator-9c5679d8f-r6dm8" (UID: "3f471ecc-922c-4cb1-9bdd-fdb5da08c592") : secret "metrics-tls" not found Mar 20 08:36:10.910780 master-0 kubenswrapper[3976]: E0320 08:36:10.910772 3976 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 20 08:36:10.910845 master-0 kubenswrapper[3976]: I0320 08:36:10.910778 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:10.910845 master-0 kubenswrapper[3976]: E0320 08:36:10.910799 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs podName:6a80bd6f-2263-4251-8197-5173193f8afc nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.910792503 +0000 UTC m=+146.179615790 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-5rrrh" (UID: "6a80bd6f-2263-4251-8197-5173193f8afc") : secret "multus-admission-controller-secret" not found Mar 20 08:36:10.910845 master-0 kubenswrapper[3976]: I0320 08:36:10.910823 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:10.910940 master-0 kubenswrapper[3976]: I0320 08:36:10.910850 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:10.911095 master-0 kubenswrapper[3976]: E0320 08:36:10.911058 3976 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 20 08:36:10.911151 master-0 kubenswrapper[3976]: E0320 08:36:10.911133 3976 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 20 08:36:10.911210 master-0 kubenswrapper[3976]: I0320 08:36:10.911072 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " 
pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:10.911210 master-0 kubenswrapper[3976]: E0320 08:36:10.911202 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls podName:42df77ec-94aa-48ba-bb35-7b1f1e8b8e97 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.911152944 +0000 UTC m=+146.179976231 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls") pod "machine-config-operator-84d549f6d5-gm4qr" (UID: "42df77ec-94aa-48ba-bb35-7b1f1e8b8e97") : secret "mco-proxy-tls" not found Mar 20 08:36:10.911304 master-0 kubenswrapper[3976]: E0320 08:36:10.911225 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics podName:acb704a9-6c8d-4378-ae93-e7095b1fce85 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.911216525 +0000 UTC m=+146.180039812 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-mvn4t" (UID: "acb704a9-6c8d-4378-ae93-e7095b1fce85") : secret "marketplace-operator-metrics" not found Mar 20 08:36:10.911304 master-0 kubenswrapper[3976]: E0320 08:36:10.911257 3976 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 20 08:36:10.911403 master-0 kubenswrapper[3976]: E0320 08:36:10.911342 3976 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 20 08:36:10.911403 master-0 kubenswrapper[3976]: E0320 08:36:10.911351 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls podName:ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.911328379 +0000 UTC m=+146.180151666 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls") pod "ingress-operator-66b84d69b-gzg9m" (UID: "ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02") : secret "metrics-tls" not found Mar 20 08:36:10.911403 master-0 kubenswrapper[3976]: E0320 08:36:10.911388 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert podName:df428d5a-c722-4536-8e7f-cdd85c560481 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:14.91138133 +0000 UTC m=+146.180204617 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert") pod "catalog-operator-68f85b4d6c-fzm28" (UID: "df428d5a-c722-4536-8e7f-cdd85c560481") : secret "catalog-operator-serving-cert" not found Mar 20 08:36:11.331963 master-0 kubenswrapper[3976]: I0320 08:36:11.331884 3976 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:11.332648 master-0 kubenswrapper[3976]: E0320 08:36:11.332124 3976 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 20 08:36:11.332648 master-0 kubenswrapper[3976]: E0320 08:36:11.332219 3976 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" failed. No retries permitted until 2026-03-20 08:37:15.33217752 +0000 UTC m=+206.601000817 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : secret "metrics-daemon-secret" not found Mar 20 08:36:14.244922 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 20 08:36:14.263462 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 20 08:36:14.263878 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 20 08:36:14.267249 master-0 systemd[1]: kubelet.service: Consumed 11.090s CPU time. Mar 20 08:36:14.284954 master-0 systemd[1]: Starting Kubernetes Kubelet... 
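Editorial note: the `durationBeforeRetry` values recorded above double on each failed mount attempt (2s, then 4s, and 1m4s for the longest-failing `metrics-certs` volume), which is consistent with a per-volume exponential backoff in the kubelet's `nestedpendingoperations` retry logic. A minimal sketch of that doubling schedule, assuming a 500ms initial delay, a factor of 2, and a cap (the initial value and cap are inferred from the 2s/4s/1m4s sequence, not confirmed by this log alone):

```python
# Sketch of the doubling retry schedule implied by the durationBeforeRetry
# values above (2s, 4s, ... 1m4s). Initial delay and cap are assumptions.
def backoff_schedule(initial=0.5, factor=2.0, cap=128.0, attempts=9):
    """Return the delay in seconds applied before each retry attempt."""
    delays, d = [], initial
    for _ in range(attempts):
        delays.append(min(d, cap))
        d *= factor
    return delays

print(backoff_schedule())
# 2s is the 3rd delay, 4s the 4th, and 64s (1m4s) the 8th, matching the log
```

Under these assumed parameters the 1m4s delay seen for `network-metrics-daemon-srdjm` corresponds to roughly the eighth consecutive failure of that volume mount.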
Mar 20 08:36:14.405767 master-0 kubenswrapper[7465]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 08:36:14.405767 master-0 kubenswrapper[7465]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 08:36:14.405767 master-0 kubenswrapper[7465]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 08:36:14.405767 master-0 kubenswrapper[7465]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 08:36:14.405767 master-0 kubenswrapper[7465]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 08:36:14.405767 master-0 kubenswrapper[7465]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 08:36:14.407747 master-0 kubenswrapper[7465]: I0320 08:36:14.405883 7465 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408544 7465 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408574 7465 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408580 7465 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408584 7465 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408589 7465 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408594 7465 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408598 7465 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408601 7465 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408611 7465 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408616 7465 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408619 7465 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408623 7465 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408627 7465 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408632 7465 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408637 7465 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408642 7465 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408647 7465 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408653 7465 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408659 7465 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 08:36:14.408615 master-0 kubenswrapper[7465]: W0320 08:36:14.408664 7465 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408669 7465 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408674 7465 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408679 7465 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408685 7465 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408690 7465 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408694 7465 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408699 7465 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408702 7465 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408706 7465 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408710 7465 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408714 7465 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408719 7465 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408723 7465 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408726 7465 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408731 7465 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408735 7465 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408739 7465 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408742 7465 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408746 7465 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 08:36:14.409910 master-0 kubenswrapper[7465]: W0320 08:36:14.408750 7465 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408755 7465 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408759 7465 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408763 7465 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408768 7465 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408772 7465 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408777 7465 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408782 7465 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408786 7465 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408790 7465 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408794 7465 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408798 7465 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408806 7465 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408810 7465 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408814 7465 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408817 7465 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408822 7465 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
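The feature_gate.go warnings interleaved above fall into two classes: gate names the kubelet's own registry does not know (OpenShift-specific gates, logged as "unrecognized feature gate" at feature_gate.go:330) and gates it does know but reports as GA or deprecated (feature_gate.go:353/351). A hypothetical featureGates stanza reproducing one example of each; the gate names and values match warnings in this log, but the stanza itself is illustrative, not the node's real config:

```yaml
# Hypothetical sketch -- illustrative featureGates stanza, not the actual kubelet.conf.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
featureGates:
  OpenShiftPodSecurityAdmission: true  # unknown to kubelet -> "unrecognized feature gate"
  ValidatingAdmissionPolicy: true      # GA gate -> "will be removed in a future release"
  KMSv1: true                          # deprecated gate -> same removal warning
```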
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408827 7465 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408831 7465 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408835 7465 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 08:36:14.411424 master-0 kubenswrapper[7465]: W0320 08:36:14.408839 7465 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: W0320 08:36:14.408844 7465 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: W0320 08:36:14.408851 7465 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: W0320 08:36:14.408855 7465 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: W0320 08:36:14.408859 7465 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: W0320 08:36:14.408863 7465 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: W0320 08:36:14.408866 7465 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: W0320 08:36:14.408870 7465 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: W0320 08:36:14.408874 7465 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: W0320 08:36:14.408877 7465 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: W0320 08:36:14.408881 7465 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: W0320 08:36:14.408885 7465 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: W0320 08:36:14.408888 7465 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: I0320 08:36:14.409002 7465 flags.go:64] FLAG: --address="0.0.0.0"
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: I0320 08:36:14.409014 7465 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: I0320 08:36:14.409023 7465 flags.go:64] FLAG: --anonymous-auth="true"
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: I0320 08:36:14.409029 7465 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: I0320 08:36:14.409035 7465 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: I0320 08:36:14.409040 7465 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: I0320 08:36:14.409045 7465 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: I0320 08:36:14.409052 7465 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 20 08:36:14.414161 master-0 kubenswrapper[7465]: I0320 08:36:14.409057 7465 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409068 7465 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409073 7465 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409077 7465 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409082 7465 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409086 7465 flags.go:64] FLAG: --cgroup-root=""
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409091 7465 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409095 7465 flags.go:64] FLAG: --client-ca-file=""
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409099 7465 flags.go:64] FLAG: --cloud-config=""
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409102 7465 flags.go:64] FLAG: --cloud-provider=""
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409106 7465 flags.go:64] FLAG: --cluster-dns="[]"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409111 7465 flags.go:64] FLAG: --cluster-domain=""
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409116 7465 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409120 7465 flags.go:64] FLAG: --config-dir=""
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409124 7465 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409128 7465 flags.go:64] FLAG: --container-log-max-files="5"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409134 7465 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409138 7465 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409142 7465 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409147 7465 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409151 7465 flags.go:64] FLAG: --contention-profiling="false"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409155 7465 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409159 7465 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409163 7465 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409167 7465 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 20 08:36:14.416993 master-0 kubenswrapper[7465]: I0320 08:36:14.409173 7465 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409177 7465 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409181 7465 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409200 7465 flags.go:64] FLAG: --enable-load-reader="false"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409204 7465 flags.go:64] FLAG: --enable-server="true"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409209 7465 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409218 7465 flags.go:64] FLAG: --event-burst="100"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409223 7465 flags.go:64] FLAG: --event-qps="50"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409231 7465 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409235 7465 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409240 7465 flags.go:64] FLAG: --eviction-hard=""
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409245 7465 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409249 7465 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409254 7465 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409258 7465 flags.go:64] FLAG: --eviction-soft=""
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409262 7465 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409266 7465 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409270 7465 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409275 7465 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409279 7465 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409283 7465 flags.go:64] FLAG: --fail-swap-on="true"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409287 7465 flags.go:64] FLAG: --feature-gates=""
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409295 7465 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409300 7465 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409304 7465 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 20 08:36:14.420075 master-0 kubenswrapper[7465]: I0320 08:36:14.409308 7465 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409312 7465 flags.go:64] FLAG: --healthz-port="10248"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409317 7465 flags.go:64] FLAG: --help="false"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409321 7465 flags.go:64] FLAG: --hostname-override=""
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409325 7465 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409329 7465 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409333 7465 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409337 7465 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409342 7465 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409349 7465 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409353 7465 flags.go:64] FLAG: --image-service-endpoint=""
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409356 7465 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409361 7465 flags.go:64] FLAG: --kube-api-burst="100"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409365 7465 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409369 7465 flags.go:64] FLAG: --kube-api-qps="50"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409380 7465 flags.go:64] FLAG: --kube-reserved=""
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409386 7465 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409390 7465 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409395 7465 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409399 7465 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409403 7465 flags.go:64] FLAG: --lock-file=""
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409407 7465 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409412 7465 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409416 7465 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409422 7465 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409426 7465 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 20 08:36:14.422790 master-0 kubenswrapper[7465]: I0320 08:36:14.409431 7465 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409435 7465 flags.go:64] FLAG: --logging-format="text"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409439 7465 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409443 7465 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409448 7465 flags.go:64] FLAG: --manifest-url=""
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409452 7465 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409458 7465 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409462 7465 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409468 7465 flags.go:64] FLAG: --max-pods="110"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409472 7465 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409476 7465 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409480 7465 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409484 7465 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409489 7465 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409493 7465 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409497 7465 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409508 7465 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409512 7465 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409516 7465 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409520 7465 flags.go:64] FLAG: --pod-cidr=""
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409525 7465 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409533 7465 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409540 7465 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 08:36:14.425584 master-0 kubenswrapper[7465]: I0320 08:36:14.409544 7465 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409548 7465 flags.go:64] FLAG: --port="10250"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409552 7465 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409556 7465 flags.go:64] FLAG: --provider-id=""
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409560 7465 flags.go:64] FLAG: --qos-reserved=""
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409564 7465 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409569 7465 flags.go:64] FLAG: --register-node="true"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409573 7465 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409578 7465 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409586 7465 flags.go:64] FLAG: --registry-burst="10"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409591 7465 flags.go:64] FLAG: --registry-qps="5"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409596 7465 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409600 7465 flags.go:64] FLAG: --reserved-memory=""
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409605 7465 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409610 7465 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409614 7465 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409618 7465 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409623 7465 flags.go:64] FLAG: --runonce="false"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409627 7465 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409631 7465 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409636 7465 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409640 7465 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409644 7465 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409648 7465 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409652 7465 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409656 7465 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 08:36:14.428056 master-0 kubenswrapper[7465]: I0320 08:36:14.409660 7465 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409665 7465 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409668 7465 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409673 7465 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409677 7465 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409685 7465 flags.go:64] FLAG: --system-cgroups=""
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409689 7465 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409695 7465 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409699 7465 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409703 7465 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409708 7465 flags.go:64] FLAG: --tls-min-version=""
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409712 7465 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409716 7465 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409721 7465 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409725 7465 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409729 7465 flags.go:64] FLAG: --v="2"
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409735 7465 flags.go:64] FLAG: --version="false"
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409741 7465 flags.go:64] FLAG: --vmodule=""
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409748 7465 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: I0320 08:36:14.409752 7465 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: W0320 08:36:14.410773 7465 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: W0320 08:36:14.410789 7465 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: W0320 08:36:14.410794 7465 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: W0320 08:36:14.410798 7465 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 08:36:14.431772 master-0 kubenswrapper[7465]: W0320 08:36:14.410802 7465 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410806 7465 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410810 7465 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410814 7465 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410818 7465 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410821 7465 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410825 7465 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410835 7465 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410839 7465 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410843 7465 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410847 7465 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410851 7465 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410855 7465 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410862 7465 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410868 7465 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410872 7465 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410876 7465 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410879 7465 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410883 7465 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410887 7465 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 08:36:14.434518 master-0 kubenswrapper[7465]: W0320 08:36:14.410894 7465 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410898 7465 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410902 7465 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410907 7465 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410912 7465 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410916 7465 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410919 7465 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410923 7465 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410927 7465 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410932 7465 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410938 7465 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410942 7465 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410950 7465 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410954 7465 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410958 7465 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410962 7465 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410966 7465 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410970 7465 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410973 7465 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410977 7465 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 08:36:14.437834 master-0 kubenswrapper[7465]: W0320 08:36:14.410981 7465 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.410985 7465 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.410988 7465 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.410992 7465 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.410996 7465 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320
08:36:14.411004 7465 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.411010 7465 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.411014 7465 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.411019 7465 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.411023 7465 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.411028 7465 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.411032 7465 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.411035 7465 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.411040 7465 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.411044 7465 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.411048 7465 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.411052 7465 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.411061 7465 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Mar 20 08:36:14.439425 master-0 kubenswrapper[7465]: W0320 08:36:14.411065 7465 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: W0320 08:36:14.411071 7465 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: W0320 08:36:14.411075 7465 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: W0320 08:36:14.411079 7465 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: W0320 08:36:14.411083 7465 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: W0320 08:36:14.411087 7465 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: W0320 08:36:14.411091 7465 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: W0320 08:36:14.411095 7465 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: W0320 08:36:14.411101 7465 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: W0320 08:36:14.411105 7465 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: I0320 08:36:14.411111 7465 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true 
StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: I0320 08:36:14.422205 7465 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: I0320 08:36:14.422253 7465 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: W0320 08:36:14.422380 7465 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: W0320 08:36:14.422407 7465 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 08:36:14.440313 master-0 kubenswrapper[7465]: W0320 08:36:14.422413 7465 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422418 7465 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422424 7465 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422429 7465 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422437 7465 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422448 7465 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422454 7465 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422459 7465 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422463 7465 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422467 7465 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422471 7465 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422475 7465 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422480 7465 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422484 7465 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422489 7465 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422493 7465 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422497 7465 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422502 7465 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 
20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422507 7465 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 20 08:36:14.441025 master-0 kubenswrapper[7465]: W0320 08:36:14.422512 7465 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422535 7465 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422541 7465 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422545 7465 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422549 7465 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422556 7465 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422560 7465 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422564 7465 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422568 7465 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422572 7465 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422578 7465 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422582 7465 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 
08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422587 7465 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422591 7465 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422612 7465 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422617 7465 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422621 7465 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422626 7465 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422631 7465 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422637 7465 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:36:14.442311 master-0 kubenswrapper[7465]: W0320 08:36:14.422642 7465 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422647 7465 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422652 7465 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422657 7465 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422661 7465 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422665 7465 feature_gate.go:330] 
unrecognized feature gate: GCPLabelsTags Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422671 7465 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422678 7465 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422708 7465 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422713 7465 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422718 7465 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422723 7465 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422729 7465 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422733 7465 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422738 7465 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422743 7465 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422747 7465 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422751 7465 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422755 7465 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422758 7465 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 08:36:14.443583 master-0 kubenswrapper[7465]: W0320 08:36:14.422781 7465 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: W0320 08:36:14.422785 7465 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: W0320 08:36:14.422791 7465 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: W0320 08:36:14.422795 7465 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: W0320 08:36:14.422800 7465 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: W0320 08:36:14.422804 7465 
feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: W0320 08:36:14.422809 7465 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: W0320 08:36:14.422814 7465 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: W0320 08:36:14.422820 7465 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: W0320 08:36:14.422826 7465 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: W0320 08:36:14.422830 7465 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: I0320 08:36:14.422839 7465 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: W0320 08:36:14.423047 7465 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: W0320 08:36:14.423060 7465 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: W0320 08:36:14.423065 7465 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 08:36:14.444574 master-0 kubenswrapper[7465]: 
W0320 08:36:14.423069 7465 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423074 7465 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423081 7465 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423085 7465 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423089 7465 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423093 7465 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423097 7465 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423104 7465 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423125 7465 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423130 7465 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423135 7465 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423139 7465 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423143 7465 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 
08:36:14.423147 7465 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423152 7465 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423156 7465 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423160 7465 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423164 7465 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423168 7465 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 08:36:14.445107 master-0 kubenswrapper[7465]: W0320 08:36:14.423172 7465 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423206 7465 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423211 7465 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423216 7465 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423221 7465 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423226 7465 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423230 7465 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423235 7465 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423240 7465 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423244 7465 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423248 7465 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423253 7465 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423257 7465 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423262 7465 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423284 7465 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423290 7465 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423295 7465 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423299 7465 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423305 7465 
feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423309 7465 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 08:36:14.446154 master-0 kubenswrapper[7465]: W0320 08:36:14.423313 7465 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423318 7465 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423322 7465 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423328 7465 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423358 7465 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423364 7465 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423368 7465 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423373 7465 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423378 7465 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423383 7465 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423389 7465 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423395 7465 feature_gate.go:330] unrecognized feature gate: 
InsightsOnDemandDataGather
Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423399 7465 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423404 7465 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423409 7465 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423414 7465 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423437 7465 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423443 7465 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423448 7465 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423453 7465 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 08:36:14.447594 master-0 kubenswrapper[7465]: W0320 08:36:14.423458 7465 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: W0320 08:36:14.423462 7465 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: W0320 08:36:14.423467 7465 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: W0320 08:36:14.423472 7465 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: W0320 08:36:14.423476 7465 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: W0320 08:36:14.423481 7465 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: W0320 08:36:14.423486 7465 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: W0320 08:36:14.423491 7465 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: W0320 08:36:14.423496 7465 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: W0320 08:36:14.423500 7465 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: I0320 08:36:14.423507 7465 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: I0320 08:36:14.423823 7465 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: I0320 08:36:14.426346 7465 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: I0320 08:36:14.426501 7465 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 08:36:14.448496 master-0 kubenswrapper[7465]: I0320 08:36:14.426836 7465 server.go:997] "Starting client certificate rotation"
Mar 20 08:36:14.449114 master-0 kubenswrapper[7465]: I0320 08:36:14.426849 7465 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 20 08:36:14.449114 master-0 kubenswrapper[7465]: I0320 08:36:14.427150 7465 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-21 08:25:35 +0000 UTC, rotation deadline is 2026-03-21 03:55:32.125112016 +0000 UTC
Mar 20 08:36:14.449114 master-0 kubenswrapper[7465]: I0320 08:36:14.427280 7465 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 19h19m17.697834495s for next certificate rotation
Mar 20 08:36:14.449114 master-0 kubenswrapper[7465]: I0320 08:36:14.427847 7465 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 08:36:14.449114 master-0 kubenswrapper[7465]: I0320 08:36:14.429735 7465 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 08:36:14.449114 master-0 kubenswrapper[7465]: I0320 08:36:14.432896 7465 log.go:25] "Validated CRI v1 runtime API"
Mar 20 08:36:14.449114 master-0 kubenswrapper[7465]: I0320 08:36:14.435484 7465 log.go:25] "Validated CRI v1 image API"
Mar 20 08:36:14.449114 master-0 kubenswrapper[7465]: I0320 08:36:14.436676 7465 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 08:36:14.449114 master-0 kubenswrapper[7465]: I0320 08:36:14.441052 7465 fs.go:135] Filesystem UUIDs: map[4a66d702-cf3e-4c68-968a-18f659b89ac6:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 20 08:36:14.449772 master-0 kubenswrapper[7465]: I0320 08:36:14.441122 7465 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0
minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/16961d83ade56434cc5d6847f42954f60a19529cc2fd922d6eec9e1e749ea461/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/16961d83ade56434cc5d6847f42954f60a19529cc2fd922d6eec9e1e749ea461/userdata/shm major:0 minor:282 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1cdce851761fa32d1bf31f749f94743d1cc0bca1a250ad08c114ccc4c2f77b9b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1cdce851761fa32d1bf31f749f94743d1cc0bca1a250ad08c114ccc4c2f77b9b/userdata/shm major:0 minor:105 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/24e53548727bf62b2d0124119a6e8fe69dde1c3031b4a91a063f8dd58773fece/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/24e53548727bf62b2d0124119a6e8fe69dde1c3031b4a91a063f8dd58773fece/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/37ba8dc9671f8fe4b5333f2dc9ab294ad8d004712a5f4bddaf2f6742452a4b3c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/37ba8dc9671f8fe4b5333f2dc9ab294ad8d004712a5f4bddaf2f6742452a4b3c/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/46a12190f11c7c4d27d18246d17e718f0fd8c23ea7d374844186aa5295484abf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/46a12190f11c7c4d27d18246d17e718f0fd8c23ea7d374844186aa5295484abf/userdata/shm major:0 minor:284 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/49cf279e25f73f2651832183fcd1e966f6a074b295d2f5e34a59a5be738d2377/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/49cf279e25f73f2651832183fcd1e966f6a074b295d2f5e34a59a5be738d2377/userdata/shm major:0 
minor:272 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4c2c291535895db17dfb8bf23092bd97a1919325de4d3a5857890c5d777dcb49/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4c2c291535895db17dfb8bf23092bd97a1919325de4d3a5857890c5d777dcb49/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4d1e15f7704367048583514d8253b668db81d2da2b51801dfb366fd0e7170679/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4d1e15f7704367048583514d8253b668db81d2da2b51801dfb366fd0e7170679/userdata/shm major:0 minor:281 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/51fe4cded0c2312a6b3bfd1f48ff9089513eaa05b60208bd1ced0f39d8ffe36e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/51fe4cded0c2312a6b3bfd1f48ff9089513eaa05b60208bd1ced0f39d8ffe36e/userdata/shm major:0 minor:265 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5d21afac0935094080df835d139dd20efdf51fad3502782c60bb38ee7294f13b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5d21afac0935094080df835d139dd20efdf51fad3502782c60bb38ee7294f13b/userdata/shm major:0 minor:241 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/91607e86857f069937668c70d7500f8b27325b07177f03b7a7ab831b3600ac2a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/91607e86857f069937668c70d7500f8b27325b07177f03b7a7ab831b3600ac2a/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/919c09e620d76c77a9425f79c1d3a73bcd978ef6fd46e99de901376458ec1273/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/919c09e620d76c77a9425f79c1d3a73bcd978ef6fd46e99de901376458ec1273/userdata/shm major:0 minor:239 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/972f78b16f9b47d7c2f794578bdc02049da4958a2a754da51e80ca31806a5573/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/972f78b16f9b47d7c2f794578bdc02049da4958a2a754da51e80ca31806a5573/userdata/shm major:0 minor:109 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9a8426b4146cf2f000b30dafab20a28003c24ae65a16f62f890c575d2f9770d9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9a8426b4146cf2f000b30dafab20a28003c24ae65a16f62f890c575d2f9770d9/userdata/shm major:0 minor:237 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9af8ad1671806bd505a390180b2388970cc534101cf6a5cf64e76e13311c0cb9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9af8ad1671806bd505a390180b2388970cc534101cf6a5cf64e76e13311c0cb9/userdata/shm major:0 minor:268 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a49d9b3d48753f2f078ac0130c6af5ca5c251ec6fd9f68a980d833a929f8987e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a49d9b3d48753f2f078ac0130c6af5ca5c251ec6fd9f68a980d833a929f8987e/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b77a1ff885b1de50033d17a32311fabaefba7e39efce419b16febb35e5db2498/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b77a1ff885b1de50033d17a32311fabaefba7e39efce419b16febb35e5db2498/userdata/shm major:0 minor:235 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bfb1d352b3974d283e9d723de75463f8ee764b61d5e2eec1ee72f20516314a15/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bfb1d352b3974d283e9d723de75463f8ee764b61d5e2eec1ee72f20516314a15/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/cadfc06b46a2370e89939aa270eb81d7b99f5548dca07c84a3c027dea72e913b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cadfc06b46a2370e89939aa270eb81d7b99f5548dca07c84a3c027dea72e913b/userdata/shm major:0 minor:248 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/dc885deb2f8a42b2a9647ca9a7a52b0dd31aa93757cc5e47ba821f5cab2f46b8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dc885deb2f8a42b2a9647ca9a7a52b0dd31aa93757cc5e47ba821f5cab2f46b8/userdata/shm major:0 minor:252 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/df43cdf08fb65d06b9db1ef59770a000ce2240e1f2e40f1d6a1619ba0e8b03df/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/df43cdf08fb65d06b9db1ef59770a000ce2240e1f2e40f1d6a1619ba0e8b03df/userdata/shm major:0 minor:258 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ef2664ab7688b2c180ba8801467801f25c2ef381f1cb268907fa44549876a809/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ef2664ab7688b2c180ba8801467801f25c2ef381f1cb268907fa44549876a809/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f63327bd315d50a90d1b9877abdb96105e01d6fdb0e17116424c13ecbe62dbc3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f63327bd315d50a90d1b9877abdb96105e01d6fdb0e17116424c13ecbe62dbc3/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f826050a5c784de5b331265098c5bf62987ce32e6de55cf05cab0f3f76894ea8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f826050a5c784de5b331265098c5bf62987ce32e6de55cf05cab0f3f76894ea8/userdata/shm major:0 minor:262 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/fa276789b940e2d115bb484a83b6dacabc102c332a9c60e259aaf653e5bc53d5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fa276789b940e2d115bb484a83b6dacabc102c332a9c60e259aaf653e5bc53d5/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volumes/kubernetes.io~projected/kube-api-access-mtc2p:{mountpoint:/var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volumes/kubernetes.io~projected/kube-api-access-mtc2p major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/volumes/kubernetes.io~projected/kube-api-access-s69rd:{mountpoint:/var/lib/kubelet/pods/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/volumes/kubernetes.io~projected/kube-api-access-s69rd major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/volumes/kubernetes.io~secret/serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/325f0a83-d56d-4b62-977b-088a7d5f0e00/volumes/kubernetes.io~projected/kube-api-access-lqdlf:{mountpoint:/var/lib/kubelet/pods/325f0a83-d56d-4b62-977b-088a7d5f0e00/volumes/kubernetes.io~projected/kube-api-access-lqdlf major:0 minor:226 fsType:tmpfs 
blockSize:0} /var/lib/kubelet/pods/325f0a83-d56d-4b62-977b-088a7d5f0e00/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/325f0a83-d56d-4b62-977b-088a7d5f0e00/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f471ecc-922c-4cb1-9bdd-fdb5da08c592/volumes/kubernetes.io~projected/kube-api-access-224dg:{mountpoint:/var/lib/kubelet/pods/3f471ecc-922c-4cb1-9bdd-fdb5da08c592/volumes/kubernetes.io~projected/kube-api-access-224dg major:0 minor:276 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97/volumes/kubernetes.io~projected/kube-api-access-wrws5:{mountpoint:/var/lib/kubelet/pods/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97/volumes/kubernetes.io~projected/kube-api-access-wrws5 major:0 minor:271 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/45e8b72b-564c-4bb1-b911-baff2d6c87ad/volumes/kubernetes.io~projected/kube-api-access-9zpkn:{mountpoint:/var/lib/kubelet/pods/45e8b72b-564c-4bb1-b911-baff2d6c87ad/volumes/kubernetes.io~projected/kube-api-access-9zpkn major:0 minor:261 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5e3ddf9e-eeb5-4266-b675-092fd4e27623/volumes/kubernetes.io~projected/kube-api-access-xd6vv:{mountpoint:/var/lib/kubelet/pods/5e3ddf9e-eeb5-4266-b675-092fd4e27623/volumes/kubernetes.io~projected/kube-api-access-xd6vv major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5e3ddf9e-eeb5-4266-b675-092fd4e27623/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/5e3ddf9e-eeb5-4266-b675-092fd4e27623/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7/volumes/kubernetes.io~projected/kube-api-access-fxg4z:{mountpoint:/var/lib/kubelet/pods/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7/volumes/kubernetes.io~projected/kube-api-access-fxg4z major:0 minor:103 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/68252533-bd64-4fc5-838a-cc350cbe77f0/volumes/kubernetes.io~projected/kube-api-access-75r7k:{mountpoint:/var/lib/kubelet/pods/68252533-bd64-4fc5-838a-cc350cbe77f0/volumes/kubernetes.io~projected/kube-api-access-75r7k major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/68252533-bd64-4fc5-838a-cc350cbe77f0/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/68252533-bd64-4fc5-838a-cc350cbe77f0/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6a80bd6f-2263-4251-8197-5173193f8afc/volumes/kubernetes.io~projected/kube-api-access-tqd2v:{mountpoint:/var/lib/kubelet/pods/6a80bd6f-2263-4251-8197-5173193f8afc/volumes/kubernetes.io~projected/kube-api-access-tqd2v major:0 minor:256 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/75e3e2cc-aa56-41f3-8859-1c086f419d05/volumes/kubernetes.io~projected/kube-api-access-clz6w:{mountpoint:/var/lib/kubelet/pods/75e3e2cc-aa56-41f3-8859-1c086f419d05/volumes/kubernetes.io~projected/kube-api-access-clz6w major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/75e3e2cc-aa56-41f3-8859-1c086f419d05/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/75e3e2cc-aa56-41f3-8859-1c086f419d05/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7b489385-2c96-4a97-8b31-362162de020e/volumes/kubernetes.io~projected/kube-api-access-pg6th:{mountpoint:/var/lib/kubelet/pods/7b489385-2c96-4a97-8b31-362162de020e/volumes/kubernetes.io~projected/kube-api-access-pg6th major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7c4e7e57-43be-4d31-b523-f7e4d316dce3/volumes/kubernetes.io~projected/kube-api-access-bvjkr:{mountpoint:/var/lib/kubelet/pods/7c4e7e57-43be-4d31-b523-f7e4d316dce3/volumes/kubernetes.io~projected/kube-api-access-bvjkr major:0 minor:223 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/813f91c2-2b37-4681-968d-4217e286e22f/volumes/kubernetes.io~projected/kube-api-access-njjkt:{mountpoint:/var/lib/kubelet/pods/813f91c2-2b37-4681-968d-4217e286e22f/volumes/kubernetes.io~projected/kube-api-access-njjkt major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/86cb5d23-df7f-4f67-8086-1789d8e68544/volumes/kubernetes.io~projected/kube-api-access-j7k8m:{mountpoint:/var/lib/kubelet/pods/86cb5d23-df7f-4f67-8086-1789d8e68544/volumes/kubernetes.io~projected/kube-api-access-j7k8m major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/86cb5d23-df7f-4f67-8086-1789d8e68544/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/86cb5d23-df7f-4f67-8086-1789d8e68544/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a57854ac-809a-4745-aaa1-774f0a08a560/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/a57854ac-809a-4745-aaa1-774f0a08a560/volumes/kubernetes.io~projected/kube-api-access major:0 minor:267 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a57854ac-809a-4745-aaa1-774f0a08a560/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a57854ac-809a-4745-aaa1-774f0a08a560/volumes/kubernetes.io~secret/serving-cert major:0 minor:245 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aa16c3bf-2350-46d1-afa0-9477b3ec8877/volumes/kubernetes.io~projected/kube-api-access-qmndg:{mountpoint:/var/lib/kubelet/pods/aa16c3bf-2350-46d1-afa0-9477b3ec8877/volumes/kubernetes.io~projected/kube-api-access-qmndg major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aa16c3bf-2350-46d1-afa0-9477b3ec8877/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/aa16c3bf-2350-46d1-afa0-9477b3ec8877/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ab175f7e-a5e8-4fda-98c9-6d052a221a83/volumes/kubernetes.io~projected/kube-api-access-zmrxh:{mountpoint:/var/lib/kubelet/pods/ab175f7e-a5e8-4fda-98c9-6d052a221a83/volumes/kubernetes.io~projected/kube-api-access-zmrxh major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ab175f7e-a5e8-4fda-98c9-6d052a221a83/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ab175f7e-a5e8-4fda-98c9-6d052a221a83/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/acb704a9-6c8d-4378-ae93-e7095b1fce85/volumes/kubernetes.io~projected/kube-api-access-xvx6w:{mountpoint:/var/lib/kubelet/pods/acb704a9-6c8d-4378-ae93-e7095b1fce85/volumes/kubernetes.io~projected/kube-api-access-xvx6w major:0 minor:260 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ad692349-5089-4afc-85b2-9b6e7997567c/volumes/kubernetes.io~projected/kube-api-access-h6mb5:{mountpoint:/var/lib/kubelet/pods/ad692349-5089-4afc-85b2-9b6e7997567c/volumes/kubernetes.io~projected/kube-api-access-h6mb5 major:0 minor:102 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ad692349-5089-4afc-85b2-9b6e7997567c/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/ad692349-5089-4afc-85b2-9b6e7997567c/volumes/kubernetes.io~secret/metrics-tls major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~projected/kube-api-access-9nvl4:{mountpoint:/var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~projected/kube-api-access-9nvl4 major:0 minor:225 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b98b4efc-6117-487f-9cfc-38ce66dd9570/volumes/kubernetes.io~projected/kube-api-access-6c5ch:{mountpoint:/var/lib/kubelet/pods/b98b4efc-6117-487f-9cfc-38ce66dd9570/volumes/kubernetes.io~projected/kube-api-access-6c5ch major:0 minor:117 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bbc0b783-28d5-4554-b49d-c66082546f44/volumes/kubernetes.io~projected/kube-api-access-27qvw:{mountpoint:/var/lib/kubelet/pods/bbc0b783-28d5-4554-b49d-c66082546f44/volumes/kubernetes.io~projected/kube-api-access-27qvw major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c1854ea4-c8e2-4289-84b6-1f18b2ac684f/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/c1854ea4-c8e2-4289-84b6-1f18b2ac684f/volumes/kubernetes.io~projected/kube-api-access major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c1854ea4-c8e2-4289-84b6-1f18b2ac684f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c1854ea4-c8e2-4289-84b6-1f18b2ac684f/volumes/kubernetes.io~secret/serving-cert major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c2a23d24-9e09-431e-8c3b-8456ff51a8d0/volumes/kubernetes.io~projected/kube-api-access-btdjt:{mountpoint:/var/lib/kubelet/pods/c2a23d24-9e09-431e-8c3b-8456ff51a8d0/volumes/kubernetes.io~projected/kube-api-access-btdjt major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c2a23d24-9e09-431e-8c3b-8456ff51a8d0/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c2a23d24-9e09-431e-8c3b-8456ff51a8d0/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dab97c35-fe60-4134-8715-a7c6dd085fb3/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/dab97c35-fe60-4134-8715-a7c6dd085fb3/volumes/kubernetes.io~projected/kube-api-access major:0 minor:104 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/df428d5a-c722-4536-8e7f-cdd85c560481/volumes/kubernetes.io~projected/kube-api-access-dtcnq:{mountpoint:/var/lib/kubelet/pods/df428d5a-c722-4536-8e7f-cdd85c560481/volumes/kubernetes.io~projected/kube-api-access-dtcnq major:0 minor:255 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:257 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~projected/kube-api-access-8d57k:{mountpoint:/var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~projected/kube-api-access-8d57k major:0 minor:254 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/volumes/kubernetes.io~projected/kube-api-access-lpdk5:{mountpoint:/var/lib/kubelet/pods/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/volumes/kubernetes.io~projected/kube-api-access-lpdk5 major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ee3cc021-67d8-4b7f-b443-16f18228712e/volumes/kubernetes.io~projected/kube-api-access-7l4b7:{mountpoint:/var/lib/kubelet/pods/ee3cc021-67d8-4b7f-b443-16f18228712e/volumes/kubernetes.io~projected/kube-api-access-7l4b7 major:0 minor:263 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~projected/kube-api-access-xhkh7:{mountpoint:/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~projected/kube-api-access-xhkh7 major:0 minor:270 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~secret/etcd-client major:0 minor:247 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~secret/serving-cert major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f53bc282-5937-49ac-ac98-2ee37ccb268d/volumes/kubernetes.io~projected/kube-api-access-b2d4r:{mountpoint:/var/lib/kubelet/pods/f53bc282-5937-49ac-ac98-2ee37ccb268d/volumes/kubernetes.io~projected/kube-api-access-b2d4r major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fa759777-de22-4440-a3d3-ad429a3b8e7b/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/fa759777-de22-4440-a3d3-ad429a3b8e7b/volumes/kubernetes.io~projected/kube-api-access major:0 minor:251 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fa759777-de22-4440-a3d3-ad429a3b8e7b/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/fa759777-de22-4440-a3d3-ad429a3b8e7b/volumes/kubernetes.io~secret/serving-cert major:0 minor:249 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fb0fc10f-5796-4cd5-b8f5-72d678054c24/volumes/kubernetes.io~projected/kube-api-access-lh47j:{mountpoint:/var/lib/kubelet/pods/fb0fc10f-5796-4cd5-b8f5-72d678054c24/volumes/kubernetes.io~projected/kube-api-access-lh47j major:0 minor:139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fb0fc10f-5796-4cd5-b8f5-72d678054c24/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/fb0fc10f-5796-4cd5-b8f5-72d678054c24/volumes/kubernetes.io~secret/webhook-cert major:0 minor:138 fsType:tmpfs blockSize:0} overlay_0-107:{mountpoint:/var/lib/containers/storage/overlay/9db0af0ee74e589c618a5907551df643aec9ff9c31360ef4454e4f03b64c32fe/merged major:0 minor:107 fsType:overlay blockSize:0} overlay_0-111:{mountpoint:/var/lib/containers/storage/overlay/f353bad261577ae807d0fe63002a5331ee0614ad70c503a063b254537fa8f2c7/merged major:0 minor:111 fsType:overlay blockSize:0} 
overlay_0-113:{mountpoint:/var/lib/containers/storage/overlay/ae99a0506900e26a728e10e458875e8fcfa52db817e22ae451304fce4bb0b15c/merged major:0 minor:113 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/a7f959166927aaf5a68b8bee14c0c3ca51daf8094ac6078bfa36b9df0e3f259d/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-132:{mountpoint:/var/lib/containers/storage/overlay/0a193133cf6dc512dd6222a88a26bfa79f23cba05b757da85d1ae39761098c7d/merged major:0 minor:132 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/9d997b980b356ddfdf630c398182bb77689fd36cf113748842cb52c7facca8d3/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/5092d95303c57ab1f1b4fe40cae1afd6909a307da9ade2856b6ed3956b2db708/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/357b3035e33baca71cc95b2521d479c36fbdeacd30ada35e763197476d643c5a/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/4590e312f2d394851c17306adc7ea6c048de2ad3d3dbc1c10201bdd57bfb696d/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/601abd4478b5aaf2eff8ec78e3d163d8396c0c1d0313041b4ef8f1ca95497a08/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/0095722db58997c16462a52bfadd816bdb0d045cdee7262bf3843448cff9a1dc/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/908ea88f00491c8e6098f7bb7f42695557c2204f39302a24647716619dadb1b7/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-157:{mountpoint:/var/lib/containers/storage/overlay/14d729e1e4d85c607b5fb243df298694984538e9b7f895cb4dd8e9060fea1af9/merged major:0 minor:157 fsType:overlay blockSize:0} 
overlay_0-160:{mountpoint:/var/lib/containers/storage/overlay/75a62a8fc8fb3648b712820272eb17860a758eb622e562d69887bfdbbcf67f24/merged major:0 minor:160 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/e82eeb503773060b0acb45dac9496642b42ae08f15da57c6cedeacb7bfd41c92/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/4df5be66be6b873cfd1c551b5b0b9fc6f4ea25268a299b477b758a8e1736c2cc/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/8c11e423afdce4615b05e2c97ec56e82ea9e222a51dad5e9b9cadefc7a50fbd4/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/7fa6ef038f39cce3ddd045f6808a08b8e20de3f00d0f3ffb70268473ebccfc8a/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/4bbfd60178e02b53b381dbc1196c28665eefccf695d461a5ce89ffdccece3a34/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/4b857198661048ccff5f5a12f7cd9bd7f3f3cd1ac178a2c27e29bb34ec87d33d/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/c9dffbbd4eb7ea11960e150211e802ef310e3b0e1f8bbd40baf231c77d324648/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-274:{mountpoint:/var/lib/containers/storage/overlay/0680bd6b5ea798ae9cb5f825ab60e8ceb1eae6c9ade662692469f0aed930b3c9/merged major:0 minor:274 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/4380b0f5133a3f0da51585ccae3689a80729f0158e1e3bbf3dda13c89ebd327b/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/ba139f06c62cf9c8d42d01d80bf13caf3d1b2638660e16086eb9b0795fdc4d53/merged major:0 minor:279 fsType:overlay blockSize:0} 
overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/072a2a473587fe1e44374ccad1f1d05e0ae690f58cfe27e3661336b9140e6d1e/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/6d63eb109e8673fd69abf0f81fb4a96e2e680f52d43bd16ccfd5a19ccde395be/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/88611e7d39caa0378811472c501e49c94f5a1ecb55ce4769e0eb33fa70530882/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/be804a4f303d3a23e06cba712029aec9a7efe2dc6e0ae2ae887c8c467a6ae84c/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/435a1c816845cf50d5a1531e07ff1be53a2522a86b57431fe92b5479ad77d97a/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/cab4552b7af734c9353bdcf574b4b45a19d3ad74266a8a2becae706d8209ec66/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/888de48e1a561193d529eede40671dd5a1712bd9de9e6f3a9ac14f82c14c48d5/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-300:{mountpoint:/var/lib/containers/storage/overlay/588b06418ad2bc65f5ea0486df355a89168e6cbfaedf5d2474926902564dad13/merged major:0 minor:300 fsType:overlay blockSize:0} overlay_0-302:{mountpoint:/var/lib/containers/storage/overlay/fba8af5043b0bd6f4de2d28415ff223d19a7b80f5290f36fc84a3cafc64c4f2d/merged major:0 minor:302 fsType:overlay blockSize:0} overlay_0-308:{mountpoint:/var/lib/containers/storage/overlay/81e75bc80a1e863a817509ea99bde2a809cb2a95e9680b9a461ccd13b04c4299/merged major:0 minor:308 fsType:overlay blockSize:0} overlay_0-311:{mountpoint:/var/lib/containers/storage/overlay/6dcc5d1173eb99d5846cc948a47570220d59e143dc1eec45d1d07c870df16bcd/merged major:0 minor:311 fsType:overlay blockSize:0} 
overlay_0-313:{mountpoint:/var/lib/containers/storage/overlay/550ed381e076eb3d67669006b48b26c2afb7f60a8b28e0a67e25d4b3983c4a36/merged major:0 minor:313 fsType:overlay blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/b95e28baf4997215c4ff7d883d25640c2eaa5bfdb32df03d38098cdc61f03b9d/merged major:0 minor:43 fsType:overlay blockSize:0} overlay_0-46:{mountpoint:/var/lib/containers/storage/overlay/fa4d8c6237857c8d3b161e359355834215421171a51d1acaa313fa9d5c80a36f/merged major:0 minor:46 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/48ced284913666a159f4d8e321f50193409aa253d2a0550267a46150e92cfe74/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/6d628089e979ea4ae73e1bf176f3f6d60a61285c2c2c5e2240f5d92b172ed027/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/40f9dc9eb1fcd2edf4ef13e2f1dbc45d4fba894aeda7a43a76bec77f0c8c27f6/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/949b36f308165d6d5be22a182f7d8be132f80f0fa5e0b965ac7fcf72b09788db/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/f5b658b9ea43023f711575ea0423fa9f90320b2d6d8593c592494827223409c3/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/1a15c4d5aa175c4e2351d4dc701dc5ad10988be5c7ef10f61bb99d5bf59e95fb/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/a87bc5de702e6d70b4b2d964d1fd2edaf662ec1be7d2531c2c4b6620a78f7e07/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/4468e81bea965916320fe0bdea4c452e3d0314c14780f535559f1c3639b8de4f/merged major:0 minor:74 fsType:overlay blockSize:0} 
overlay_0-76:{mountpoint:/var/lib/containers/storage/overlay/c77dfba1f2c73cf9cbd794f77afb676b75af63f573acdb87a7d111ffee5b1acd/merged major:0 minor:76 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/a21e0e884af4678a1a8154cfb8f21881e10646d5318b8ead1904ec7c6aa1c689/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/8ccd24adf85822de17486d0fdb7b252dc9b9bd066b0793b514705de9a87105d7/merged major:0 minor:89 fsType:overlay blockSize:0}] Mar 20 08:36:14.470730 master-0 kubenswrapper[7465]: I0320 08:36:14.469742 7465 manager.go:217] Machine: {Timestamp:2026-03-20 08:36:14.468768622 +0000 UTC m=+0.112084132 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:7cbba5bd4cad48d397925286776799f2 SystemUUID:7cbba5bd-4cad-48d3-9792-5286776799f2 BootID:2d4df506-7881-4563-b01f-2840d2bdb60b Filesystems:[{Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5e3ddf9e-eeb5-4266-b675-092fd4e27623/volumes/kubernetes.io~projected/kube-api-access-xd6vv DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/aa16c3bf-2350-46d1-afa0-9477b3ec8877/volumes/kubernetes.io~projected/kube-api-access-qmndg DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ad692349-5089-4afc-85b2-9b6e7997567c/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:98 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/75e3e2cc-aa56-41f3-8859-1c086f419d05/volumes/kubernetes.io~secret/serving-cert 
DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/86cb5d23-df7f-4f67-8086-1789d8e68544/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9af8ad1671806bd505a390180b2388970cc534101cf6a5cf64e76e13311c0cb9/userdata/shm DeviceMajor:0 DeviceMinor:268 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7c4e7e57-43be-4d31-b523-f7e4d316dce3/volumes/kubernetes.io~projected/kube-api-access-bvjkr DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b98b4efc-6117-487f-9cfc-38ce66dd9570/volumes/kubernetes.io~projected/kube-api-access-6c5ch DeviceMajor:0 DeviceMinor:117 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/46a12190f11c7c4d27d18246d17e718f0fd8c23ea7d374844186aa5295484abf/userdata/shm DeviceMajor:0 DeviceMinor:284 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5d21afac0935094080df835d139dd20efdf51fad3502782c60bb38ee7294f13b/userdata/shm DeviceMajor:0 DeviceMinor:241 
Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ef2664ab7688b2c180ba8801467801f25c2ef381f1cb268907fa44549876a809/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:247 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dab97c35-fe60-4134-8715-a7c6dd085fb3/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:104 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5e3ddf9e-eeb5-4266-b675-092fd4e27623/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/fb0fc10f-5796-4cd5-b8f5-72d678054c24/volumes/kubernetes.io~projected/kube-api-access-lh47j DeviceMajor:0 DeviceMinor:139 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-132 DeviceMajor:0 DeviceMinor:132 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/7b489385-2c96-4a97-8b31-362162de020e/volumes/kubernetes.io~projected/kube-api-access-pg6th DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-311 DeviceMajor:0 DeviceMinor:311 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/325f0a83-d56d-4b62-977b-088a7d5f0e00/volumes/kubernetes.io~projected/kube-api-access-lqdlf DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a57854ac-809a-4745-aaa1-774f0a08a560/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:267 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-274 DeviceMajor:0 DeviceMinor:274 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a49d9b3d48753f2f078ac0130c6af5ca5c251ec6fd9f68a980d833a929f8987e/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/c2a23d24-9e09-431e-8c3b-8456ff51a8d0/volumes/kubernetes.io~projected/kube-api-access-btdjt DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~projected/kube-api-access-xhkh7 DeviceMajor:0 DeviceMinor:270 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/49cf279e25f73f2651832183fcd1e966f6a074b295d2f5e34a59a5be738d2377/userdata/shm DeviceMajor:0 DeviceMinor:272 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fa276789b940e2d115bb484a83b6dacabc102c332a9c60e259aaf653e5bc53d5/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fb0fc10f-5796-4cd5-b8f5-72d678054c24/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:138 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/df43cdf08fb65d06b9db1ef59770a000ce2240e1f2e40f1d6a1619ba0e8b03df/userdata/shm 
DeviceMajor:0 DeviceMinor:258 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volumes/kubernetes.io~projected/kube-api-access-mtc2p DeviceMajor:0 DeviceMinor:127 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c2a23d24-9e09-431e-8c3b-8456ff51a8d0/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~projected/kube-api-access-9nvl4 DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/16961d83ade56434cc5d6847f42954f60a19529cc2fd922d6eec9e1e749ea461/userdata/shm DeviceMajor:0 DeviceMinor:282 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/df428d5a-c722-4536-8e7f-cdd85c560481/volumes/kubernetes.io~projected/kube-api-access-dtcnq DeviceMajor:0 DeviceMinor:255 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/acb704a9-6c8d-4378-ae93-e7095b1fce85/volumes/kubernetes.io~projected/kube-api-access-xvx6w DeviceMajor:0 DeviceMinor:260 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f63327bd315d50a90d1b9877abdb96105e01d6fdb0e17116424c13ecbe62dbc3/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f53bc282-5937-49ac-ac98-2ee37ccb268d/volumes/kubernetes.io~projected/kube-api-access-b2d4r DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6a80bd6f-2263-4251-8197-5173193f8afc/volumes/kubernetes.io~projected/kube-api-access-tqd2v DeviceMajor:0 DeviceMinor:256 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/volumes/kubernetes.io~projected/kube-api-access-lpdk5 DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/75e3e2cc-aa56-41f3-8859-1c086f419d05/volumes/kubernetes.io~projected/kube-api-access-clz6w DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/51fe4cded0c2312a6b3bfd1f48ff9089513eaa05b60208bd1ced0f39d8ffe36e/userdata/shm DeviceMajor:0 DeviceMinor:265 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/24e53548727bf62b2d0124119a6e8fe69dde1c3031b4a91a063f8dd58773fece/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dc885deb2f8a42b2a9647ca9a7a52b0dd31aa93757cc5e47ba821f5cab2f46b8/userdata/shm DeviceMajor:0 DeviceMinor:252 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bbc0b783-28d5-4554-b49d-c66082546f44/volumes/kubernetes.io~projected/kube-api-access-27qvw 
DeviceMajor:0 DeviceMinor:246 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cadfc06b46a2370e89939aa270eb81d7b99f5548dca07c84a3c027dea72e913b/userdata/shm DeviceMajor:0 DeviceMinor:248 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-302 DeviceMajor:0 DeviceMinor:302 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-113 DeviceMajor:0 DeviceMinor:113 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97/volumes/kubernetes.io~projected/kube-api-access-wrws5 DeviceMajor:0 DeviceMinor:271 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-300 DeviceMajor:0 DeviceMinor:300 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9a8426b4146cf2f000b30dafab20a28003c24ae65a16f62f890c575d2f9770d9/userdata/shm DeviceMajor:0 DeviceMinor:237 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bfb1d352b3974d283e9d723de75463f8ee764b61d5e2eec1ee72f20516314a15/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/972f78b16f9b47d7c2f794578bdc02049da4958a2a754da51e80ca31806a5573/userdata/shm DeviceMajor:0 DeviceMinor:109 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7/volumes/kubernetes.io~projected/kube-api-access-fxg4z DeviceMajor:0 DeviceMinor:103 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/813f91c2-2b37-4681-968d-4217e286e22f/volumes/kubernetes.io~projected/kube-api-access-njjkt DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a57854ac-809a-4745-aaa1-774f0a08a560/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:245 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ee3cc021-67d8-4b7f-b443-16f18228712e/volumes/kubernetes.io~projected/kube-api-access-7l4b7 DeviceMajor:0 DeviceMinor:263 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c1854ea4-c8e2-4289-84b6-1f18b2ac684f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3f471ecc-922c-4cb1-9bdd-fdb5da08c592/volumes/kubernetes.io~projected/kube-api-access-224dg DeviceMajor:0 DeviceMinor:276 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4d1e15f7704367048583514d8253b668db81d2da2b51801dfb366fd0e7170679/userdata/shm DeviceMajor:0 DeviceMinor:281 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~projected/kube-api-access-8d57k DeviceMajor:0 DeviceMinor:254 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4c2c291535895db17dfb8bf23092bd97a1919325de4d3a5857890c5d777dcb49/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/aa16c3bf-2350-46d1-afa0-9477b3ec8877/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/86cb5d23-df7f-4f67-8086-1789d8e68544/volumes/kubernetes.io~projected/kube-api-access-j7k8m DeviceMajor:0 DeviceMinor:227 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b77a1ff885b1de50033d17a32311fabaefba7e39efce419b16febb35e5db2498/userdata/shm DeviceMajor:0 DeviceMinor:235 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/45e8b72b-564c-4bb1-b911-baff2d6c87ad/volumes/kubernetes.io~projected/kube-api-access-9zpkn DeviceMajor:0 DeviceMinor:261 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f826050a5c784de5b331265098c5bf62987ce32e6de55cf05cab0f3f76894ea8/userdata/shm DeviceMajor:0 DeviceMinor:262 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-313 DeviceMajor:0 DeviceMinor:313 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ab175f7e-a5e8-4fda-98c9-6d052a221a83/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 
HasInodes:true} {Device:/var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:257 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-111 DeviceMajor:0 DeviceMinor:111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c1854ea4-c8e2-4289-84b6-1f18b2ac684f/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:228 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/919c09e620d76c77a9425f79c1d3a73bcd978ef6fd46e99de901376458ec1273/userdata/shm DeviceMajor:0 DeviceMinor:239 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/volumes/kubernetes.io~projected/kube-api-access-s69rd DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-76 DeviceMajor:0 DeviceMinor:76 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ad692349-5089-4afc-85b2-9b6e7997567c/volumes/kubernetes.io~projected/kube-api-access-h6mb5 DeviceMajor:0 DeviceMinor:102 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1cdce851761fa32d1bf31f749f94743d1cc0bca1a250ad08c114ccc4c2f77b9b/userdata/shm DeviceMajor:0 DeviceMinor:105 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/ab175f7e-a5e8-4fda-98c9-6d052a221a83/volumes/kubernetes.io~projected/kube-api-access-zmrxh DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/91607e86857f069937668c70d7500f8b27325b07177f03b7a7ab831b3600ac2a/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-107 DeviceMajor:0 DeviceMinor:107 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/37ba8dc9671f8fe4b5333f2dc9ab294ad8d004712a5f4bddaf2f6742452a4b3c/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-157 DeviceMajor:0 DeviceMinor:157 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fa759777-de22-4440-a3d3-ad429a3b8e7b/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:249 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-308 DeviceMajor:0 DeviceMinor:308 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-46 DeviceMajor:0 DeviceMinor:46 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-160 DeviceMajor:0 DeviceMinor:160 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/325f0a83-d56d-4b62-977b-088a7d5f0e00/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/68252533-bd64-4fc5-838a-cc350cbe77f0/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/68252533-bd64-4fc5-838a-cc350cbe77f0/volumes/kubernetes.io~projected/kube-api-access-75r7k DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/fa759777-de22-4440-a3d3-ad429a3b8e7b/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:251 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:16961d83ade5643 MacAddress:16:eb:94:fc:a4:cd Speed:10000 Mtu:8900} {Name:46a12190f11c7c4 MacAddress:2e:6c:63:06:86:8b Speed:10000 Mtu:8900} {Name:4d1e15f77043670 MacAddress:86:5e:d6:fa:87:f3 Speed:10000 Mtu:8900} {Name:51fe4cded0c2312 MacAddress:86:b0:6e:70:0a:d9 Speed:10000 Mtu:8900} {Name:5d21afac0935094 MacAddress:6e:ab:09:f1:f7:a2 Speed:10000 Mtu:8900} {Name:919c09e620d76c7 MacAddress:86:f0:53:69:aa:b5 Speed:10000 Mtu:8900} {Name:9a8426b4146cf2f MacAddress:be:2f:3c:a0:ce:1e Speed:10000 Mtu:8900} {Name:9af8ad1671806bd MacAddress:06:02:82:e8:9c:12 Speed:10000 Mtu:8900} {Name:b77a1ff885b1de5 MacAddress:66:c5:28:32:84:9f Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:7a:3a:b9:64:e3:6b Speed:0 Mtu:8900} {Name:cadfc06b46a2370 MacAddress:8e:07:e6:48:cb:ed Speed:10000 Mtu:8900} {Name:dc885deb2f8a42b MacAddress:5a:90:d2:4f:6f:ba Speed:10000 Mtu:8900} {Name:df43cdf08fb65d0 MacAddress:ea:04:6a:10:8e:3f Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:c7:39:2c Speed:-1 Mtu:9000} {Name:f826050a5c784de MacAddress:c2:28:ce:1c:17:22 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 
MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:52:c7:57:df:ad:1d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 
Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 08:36:14.470730 master-0 kubenswrapper[7465]: I0320 08:36:14.470547 7465 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 20 08:36:14.471517 master-0 kubenswrapper[7465]: I0320 08:36:14.471320 7465 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.471677 7465 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.471838 7465 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.471865 7465 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.472091 7465 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.472102 7465 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.472118 7465 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.472141 7465 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.472378 7465 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.472485 7465 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.472554 7465 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.472567 7465 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.472583 7465 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.472595 7465 kubelet.go:324] "Adding apiserver pod source"
Mar 20 08:36:14.473303 master-0 kubenswrapper[7465]: I0320 08:36:14.472617 7465 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 08:36:14.475588 master-0 kubenswrapper[7465]: I0320 08:36:14.475530 7465 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1"
Mar 20 08:36:14.476357 master-0 kubenswrapper[7465]: I0320 08:36:14.476065 7465 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 08:36:14.476466 master-0 kubenswrapper[7465]: I0320 08:36:14.476444 7465 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 08:36:14.476729 master-0 kubenswrapper[7465]: I0320 08:36:14.476706 7465 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 20 08:36:14.477803 master-0 kubenswrapper[7465]: I0320 08:36:14.476731 7465 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 20 08:36:14.477803 master-0 kubenswrapper[7465]: I0320 08:36:14.476741 7465 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 20 08:36:14.477803 master-0 kubenswrapper[7465]: I0320 08:36:14.476749 7465 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 20 08:36:14.477803 master-0 kubenswrapper[7465]: I0320 08:36:14.476756 7465 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 20 08:36:14.477803 master-0 kubenswrapper[7465]: I0320 08:36:14.476764 7465 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 20 08:36:14.477803 master-0 kubenswrapper[7465]: I0320 08:36:14.476772 7465 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 20 08:36:14.477803 master-0 kubenswrapper[7465]: I0320 08:36:14.476779 7465 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 20 08:36:14.477803 master-0 kubenswrapper[7465]: I0320 08:36:14.476804 7465 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 20 08:36:14.477803 master-0 kubenswrapper[7465]: I0320 08:36:14.476813 7465 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 20 08:36:14.477803 master-0 kubenswrapper[7465]: I0320 08:36:14.476836 7465 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 20 08:36:14.477803 master-0 kubenswrapper[7465]: I0320 08:36:14.476927 7465 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 20 08:36:14.477803 master-0 kubenswrapper[7465]: I0320 08:36:14.476988 7465 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 20 08:36:14.477803 master-0 kubenswrapper[7465]: I0320 08:36:14.477431 7465 server.go:1280] "Started kubelet"
Mar 20 08:36:14.478152 master-0 kubenswrapper[7465]: I0320 08:36:14.477824 7465 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 08:36:14.478152 master-0 kubenswrapper[7465]: I0320 08:36:14.477887 7465 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 08:36:14.478152 master-0 kubenswrapper[7465]: I0320 08:36:14.478005 7465 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 20 08:36:14.478646 master-0 kubenswrapper[7465]: I0320 08:36:14.478605 7465 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 08:36:14.479108 master-0 systemd[1]: Started Kubernetes Kubelet.
Mar 20 08:36:14.492980 master-0 kubenswrapper[7465]: I0320 08:36:14.492931 7465 server.go:449] "Adding debug handlers to kubelet server"
Mar 20 08:36:14.498414 master-0 kubenswrapper[7465]: I0320 08:36:14.498299 7465 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 20 08:36:14.498414 master-0 kubenswrapper[7465]: I0320 08:36:14.498362 7465 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 08:36:14.498496 master-0 kubenswrapper[7465]: I0320 08:36:14.498426 7465 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-21 08:25:35 +0000 UTC, rotation deadline is 2026-03-21 01:26:26.585919606 +0000 UTC
Mar 20 08:36:14.498533 master-0 kubenswrapper[7465]: I0320 08:36:14.498502 7465 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 16h50m12.087421427s for next certificate rotation
Mar 20 08:36:14.498657 master-0 kubenswrapper[7465]: I0320 08:36:14.498621 7465 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 20 08:36:14.498732 master-0 kubenswrapper[7465]: I0320 08:36:14.498721 7465 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 08:36:14.499924 master-0 kubenswrapper[7465]: E0320 08:36:14.498649 7465 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 20 08:36:14.499999 master-0 kubenswrapper[7465]: I0320 08:36:14.498700 7465 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 20 08:36:14.499999 master-0 kubenswrapper[7465]: I0320 08:36:14.499540 7465 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 20 08:36:14.500201 master-0 kubenswrapper[7465]: I0320 08:36:14.499562 7465 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 20 08:36:14.501970 master-0 kubenswrapper[7465]: I0320 08:36:14.501579 7465 factory.go:55] Registering systemd factory
Mar 20 08:36:14.502098 master-0 kubenswrapper[7465]: I0320 08:36:14.502075 7465 factory.go:221] Registration of the systemd container factory successfully
Mar 20 08:36:14.502966 master-0 kubenswrapper[7465]: I0320 08:36:14.502951 7465 factory.go:153] Registering CRI-O factory
Mar 20 08:36:14.503059 master-0 kubenswrapper[7465]: I0320 08:36:14.503047 7465 factory.go:221] Registration of the crio container factory successfully
Mar 20 08:36:14.503311 master-0 kubenswrapper[7465]: I0320 08:36:14.503276 7465 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 20 08:36:14.503459 master-0 kubenswrapper[7465]: I0320 08:36:14.503441 7465 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 20 08:36:14.503537 master-0 kubenswrapper[7465]: I0320 08:36:14.503527 7465 factory.go:103] Registering Raw factory
Mar 20 08:36:14.503604 master-0 kubenswrapper[7465]: I0320 08:36:14.503596 7465 manager.go:1196] Started watching for new ooms in manager
Mar 20 08:36:14.504103 master-0 kubenswrapper[7465]: I0320 08:36:14.504092 7465 manager.go:319] Starting recovery of all containers
Mar 20 08:36:14.505014 master-0 kubenswrapper[7465]: I0320 08:36:14.504930 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="248a3d2f-3be4-46bf-959c-79d28736c0d6" volumeName="kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-config" seLinuxMountContext=""
Mar 20 08:36:14.505082 master-0 kubenswrapper[7465]: I0320 08:36:14.505012 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="248a3d2f-3be4-46bf-959c-79d28736c0d6" volumeName="kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-script-lib" seLinuxMountContext=""
Mar 20 08:36:14.505082 master-0 kubenswrapper[7465]: I0320 08:36:14.505032 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" volumeName="kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-config" seLinuxMountContext=""
Mar 20 08:36:14.505082 master-0 kubenswrapper[7465]: I0320 08:36:14.505049 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="45e8b72b-564c-4bb1-b911-baff2d6c87ad" volumeName="kubernetes.io/projected/45e8b72b-564c-4bb1-b911-baff2d6c87ad-kube-api-access-9zpkn" seLinuxMountContext=""
Mar 20 08:36:14.505082 master-0 kubenswrapper[7465]: I0320 08:36:14.505064 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ad692349-5089-4afc-85b2-9b6e7997567c" volumeName="kubernetes.io/projected/ad692349-5089-4afc-85b2-9b6e7997567c-kube-api-access-h6mb5" seLinuxMountContext=""
Mar 20 08:36:14.505082 master-0 kubenswrapper[7465]: I0320 08:36:14.505076 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ad692349-5089-4afc-85b2-9b6e7997567c" volumeName="kubernetes.io/secret/ad692349-5089-4afc-85b2-9b6e7997567c-metrics-tls" seLinuxMountContext=""
Mar 20 08:36:14.505082 master-0 kubenswrapper[7465]: I0320 08:36:14.505089 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="248a3d2f-3be4-46bf-959c-79d28736c0d6" volumeName="kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-env-overrides" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505103 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68252533-bd64-4fc5-838a-cc350cbe77f0" volumeName="kubernetes.io/projected/68252533-bd64-4fc5-838a-cc350cbe77f0-kube-api-access-75r7k" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505118 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="75e3e2cc-aa56-41f3-8859-1c086f419d05" volumeName="kubernetes.io/secret/75e3e2cc-aa56-41f3-8859-1c086f419d05-serving-cert" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505131 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" volumeName="kubernetes.io/projected/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-kube-api-access-s69rd" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505146 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5e3ddf9e-eeb5-4266-b675-092fd4e27623" volumeName="kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovnkube-config" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505162 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68252533-bd64-4fc5-838a-cc350cbe77f0" volumeName="kubernetes.io/secret/68252533-bd64-4fc5-838a-cc350cbe77f0-serving-cert" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505177 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="813f91c2-2b37-4681-968d-4217e286e22f" volumeName="kubernetes.io/projected/813f91c2-2b37-4681-968d-4217e286e22f-kube-api-access-njjkt" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505230 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a57854ac-809a-4745-aaa1-774f0a08a560" volumeName="kubernetes.io/secret/a57854ac-809a-4745-aaa1-774f0a08a560-serving-cert" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505247 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4291bfd-53d9-4c78-b7cb-d7eb46560528" volumeName="kubernetes.io/configmap/b4291bfd-53d9-4c78-b7cb-d7eb46560528-trusted-ca" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505262 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b98b4efc-6117-487f-9cfc-38ce66dd9570" volumeName="kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505281 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f046860d-2d54-4746-8ba2-f8e90fa55e38" volumeName="kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-service-ca" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505295 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb0fc10f-5796-4cd5-b8f5-72d678054c24" volumeName="kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-ovnkube-identity-cm" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505310 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="248a3d2f-3be4-46bf-959c-79d28736c0d6" volumeName="kubernetes.io/projected/248a3d2f-3be4-46bf-959c-79d28736c0d6-kube-api-access-mtc2p" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505324 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42df77ec-94aa-48ba-bb35-7b1f1e8b8e97" volumeName="kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-images" seLinuxMountContext=""
Mar 20 08:36:14.505334 master-0 kubenswrapper[7465]: I0320 08:36:14.505351 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7" volumeName="kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cni-binary-copy" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505368 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="86cb5d23-df7f-4f67-8086-1789d8e68544" volumeName="kubernetes.io/secret/86cb5d23-df7f-4f67-8086-1789d8e68544-cluster-olm-operator-serving-cert" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505395 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aa16c3bf-2350-46d1-afa0-9477b3ec8877" volumeName="kubernetes.io/projected/aa16c3bf-2350-46d1-afa0-9477b3ec8877-kube-api-access-qmndg" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505409 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="acb704a9-6c8d-4378-ae93-e7095b1fce85" volumeName="kubernetes.io/projected/acb704a9-6c8d-4378-ae93-e7095b1fce85-kube-api-access-xvx6w" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505422 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b98b4efc-6117-487f-9cfc-38ce66dd9570" volumeName="kubernetes.io/projected/b98b4efc-6117-487f-9cfc-38ce66dd9570-kube-api-access-6c5ch" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505435 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f046860d-2d54-4746-8ba2-f8e90fa55e38" volumeName="kubernetes.io/projected/f046860d-2d54-4746-8ba2-f8e90fa55e38-kube-api-access-xhkh7" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505450 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f53bc282-5937-49ac-ac98-2ee37ccb268d" volumeName="kubernetes.io/projected/f53bc282-5937-49ac-ac98-2ee37ccb268d-kube-api-access-b2d4r" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505463 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="325f0a83-d56d-4b62-977b-088a7d5f0e00" volumeName="kubernetes.io/configmap/325f0a83-d56d-4b62-977b-088a7d5f0e00-config" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505478 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5e3ddf9e-eeb5-4266-b675-092fd4e27623" volumeName="kubernetes.io/secret/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505515 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a57854ac-809a-4745-aaa1-774f0a08a560" volumeName="kubernetes.io/configmap/a57854ac-809a-4745-aaa1-774f0a08a560-config" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505528 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b98b4efc-6117-487f-9cfc-38ce66dd9570" volumeName="kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-whereabouts-flatfile-configmap" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505543 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f046860d-2d54-4746-8ba2-f8e90fa55e38" volumeName="kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-config" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505559 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fa759777-de22-4440-a3d3-ad429a3b8e7b" volumeName="kubernetes.io/secret/fa759777-de22-4440-a3d3-ad429a3b8e7b-serving-cert" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505623 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb0fc10f-5796-4cd5-b8f5-72d678054c24" volumeName="kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-env-overrides" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505637 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68252533-bd64-4fc5-838a-cc350cbe77f0" volumeName="kubernetes.io/configmap/68252533-bd64-4fc5-838a-cc350cbe77f0-config" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505651 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7c4e7e57-43be-4d31-b523-f7e4d316dce3" volumeName="kubernetes.io/projected/7c4e7e57-43be-4d31-b523-f7e4d316dce3-kube-api-access-bvjkr" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505665 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab175f7e-a5e8-4fda-98c9-6d052a221a83" volumeName="kubernetes.io/projected/ab175f7e-a5e8-4fda-98c9-6d052a221a83-kube-api-access-zmrxh" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505685 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02" volumeName="kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-kube-api-access-8d57k" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505698 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f046860d-2d54-4746-8ba2-f8e90fa55e38" volumeName="kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-client" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505712 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fa759777-de22-4440-a3d3-ad429a3b8e7b" volumeName="kubernetes.io/projected/fa759777-de22-4440-a3d3-ad429a3b8e7b-kube-api-access" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505725 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="248a3d2f-3be4-46bf-959c-79d28736c0d6" volumeName="kubernetes.io/secret/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505739 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" volumeName="kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505752 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="45e8b72b-564c-4bb1-b911-baff2d6c87ad" volumeName="kubernetes.io/configmap/45e8b72b-564c-4bb1-b911-baff2d6c87ad-telemetry-config" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505766 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5e3ddf9e-eeb5-4266-b675-092fd4e27623" volumeName="kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-env-overrides" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505780 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab175f7e-a5e8-4fda-98c9-6d052a221a83" volumeName="kubernetes.io/empty-dir/ab175f7e-a5e8-4fda-98c9-6d052a221a83-available-featuregates" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505794 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1854ea4-c8e2-4289-84b6-1f18b2ac684f" volumeName="kubernetes.io/configmap/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-config" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505808 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dab97c35-fe60-4134-8715-a7c6dd085fb3" volumeName="kubernetes.io/configmap/dab97c35-fe60-4134-8715-a7c6dd085fb3-service-ca" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505823 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02" volumeName="kubernetes.io/configmap/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-trusted-ca" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505842 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f046860d-2d54-4746-8ba2-f8e90fa55e38" volumeName="kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-serving-cert" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505857 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42df77ec-94aa-48ba-bb35-7b1f1e8b8e97" volumeName="kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-auth-proxy-config" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505873 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5e3ddf9e-eeb5-4266-b675-092fd4e27623" volumeName="kubernetes.io/projected/5e3ddf9e-eeb5-4266-b675-092fd4e27623-kube-api-access-xd6vv" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505888 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab175f7e-a5e8-4fda-98c9-6d052a221a83" volumeName="kubernetes.io/secret/ab175f7e-a5e8-4fda-98c9-6d052a221a83-serving-cert" seLinuxMountContext=""
Mar 20 08:36:14.505841 master-0 kubenswrapper[7465]: I0320 08:36:14.505908 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="df428d5a-c722-4536-8e7f-cdd85c560481" volumeName="kubernetes.io/projected/df428d5a-c722-4536-8e7f-cdd85c560481-kube-api-access-dtcnq" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.505923 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7" volumeName="kubernetes.io/configmap/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-trusted-ca" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.505945 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb0fc10f-5796-4cd5-b8f5-72d678054c24" volumeName="kubernetes.io/secret/fb0fc10f-5796-4cd5-b8f5-72d678054c24-webhook-cert" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.505963 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="86cb5d23-df7f-4f67-8086-1789d8e68544" volumeName="kubernetes.io/projected/86cb5d23-df7f-4f67-8086-1789d8e68544-kube-api-access-j7k8m" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.505979 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4291bfd-53d9-4c78-b7cb-d7eb46560528" volumeName="kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-bound-sa-token" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.505994 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" volumeName="kubernetes.io/secret/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-serving-cert" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506009 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="75e3e2cc-aa56-41f3-8859-1c086f419d05" volumeName="kubernetes.io/configmap/75e3e2cc-aa56-41f3-8859-1c086f419d05-config" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506023 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a57854ac-809a-4745-aaa1-774f0a08a560" volumeName="kubernetes.io/projected/a57854ac-809a-4745-aaa1-774f0a08a560-kube-api-access" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506037 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aa16c3bf-2350-46d1-afa0-9477b3ec8877" volumeName="kubernetes.io/secret/aa16c3bf-2350-46d1-afa0-9477b3ec8877-cluster-storage-operator-serving-cert" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506051 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b98b4efc-6117-487f-9cfc-38ce66dd9570" volumeName="kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-binary-copy" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506064 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2a23d24-9e09-431e-8c3b-8456ff51a8d0" volumeName="kubernetes.io/configmap/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-config" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506079 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ee3cc021-67d8-4b7f-b443-16f18228712e" volumeName="kubernetes.io/configmap/ee3cc021-67d8-4b7f-b443-16f18228712e-iptables-alerter-script" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506095 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7" volumeName="kubernetes.io/projected/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-kube-api-access-fxg4z" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506108 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="acb704a9-6c8d-4378-ae93-e7095b1fce85" volumeName="kubernetes.io/configmap/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-trusted-ca" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506120 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4291bfd-53d9-4c78-b7cb-d7eb46560528" volumeName="kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-kube-api-access-9nvl4" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506133 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ee3cc021-67d8-4b7f-b443-16f18228712e" volumeName="kubernetes.io/projected/ee3cc021-67d8-4b7f-b443-16f18228712e-kube-api-access-7l4b7" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506145 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" volumeName="kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-service-ca-bundle" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506251 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a80bd6f-2263-4251-8197-5173193f8afc" volumeName="kubernetes.io/projected/6a80bd6f-2263-4251-8197-5173193f8afc-kube-api-access-tqd2v" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506270 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dab97c35-fe60-4134-8715-a7c6dd085fb3" volumeName="kubernetes.io/projected/dab97c35-fe60-4134-8715-a7c6dd085fb3-kube-api-access" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506283 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="325f0a83-d56d-4b62-977b-088a7d5f0e00" volumeName="kubernetes.io/projected/325f0a83-d56d-4b62-977b-088a7d5f0e00-kube-api-access-lqdlf" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506297 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42df77ec-94aa-48ba-bb35-7b1f1e8b8e97" volumeName="kubernetes.io/projected/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-kube-api-access-wrws5" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506309 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1854ea4-c8e2-4289-84b6-1f18b2ac684f" volumeName="kubernetes.io/projected/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-kube-api-access" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506322 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1854ea4-c8e2-4289-84b6-1f18b2ac684f" volumeName="kubernetes.io/secret/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-serving-cert" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506335 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f471ecc-922c-4cb1-9bdd-fdb5da08c592" volumeName="kubernetes.io/projected/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-kube-api-access-224dg" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506348 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7" volumeName="kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-daemon-config" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506361 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="75e3e2cc-aa56-41f3-8859-1c086f419d05" volumeName="kubernetes.io/projected/75e3e2cc-aa56-41f3-8859-1c086f419d05-kube-api-access-clz6w" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506375 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f046860d-2d54-4746-8ba2-f8e90fa55e38" volumeName="kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-ca" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506388 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f53bc282-5937-49ac-ac98-2ee37ccb268d" volumeName="kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-images" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506401 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="325f0a83-d56d-4b62-977b-088a7d5f0e00" volumeName="kubernetes.io/secret/325f0a83-d56d-4b62-977b-088a7d5f0e00-serving-cert" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506414 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="86cb5d23-df7f-4f67-8086-1789d8e68544" volumeName="kubernetes.io/empty-dir/86cb5d23-df7f-4f67-8086-1789d8e68544-operand-assets" seLinuxMountContext=""
Mar 20 08:36:14.507170 master-0 kubenswrapper[7465]: I0320 08:36:14.506426 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2a23d24-9e09-431e-8c3b-8456ff51a8d0" volumeName="kubernetes.io/secret/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-serving-cert" seLinuxMountContext=""
Mar 20 08:36:14.508634 master-0 kubenswrapper[7465]: I0320 08:36:14.508592 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fa759777-de22-4440-a3d3-ad429a3b8e7b" volumeName="kubernetes.io/configmap/fa759777-de22-4440-a3d3-ad429a3b8e7b-config" seLinuxMountContext=""
Mar 20 08:36:14.508634 master-0 kubenswrapper[7465]: I0320 08:36:14.508638 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7b489385-2c96-4a97-8b31-362162de020e" volumeName="kubernetes.io/projected/7b489385-2c96-4a97-8b31-362162de020e-kube-api-access-pg6th" seLinuxMountContext=""
Mar 20 08:36:14.508749 master-0 kubenswrapper[7465]: I0320 08:36:14.508658 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbc0b783-28d5-4554-b49d-c66082546f44" volumeName="kubernetes.io/projected/bbc0b783-28d5-4554-b49d-c66082546f44-kube-api-access-27qvw" seLinuxMountContext=""
Mar 20 08:36:14.508749 master-0 kubenswrapper[7465]: I0320 08:36:14.508689 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2a23d24-9e09-431e-8c3b-8456ff51a8d0" volumeName="kubernetes.io/projected/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-kube-api-access-btdjt" seLinuxMountContext=""
Mar 20 08:36:14.508749 master-0 kubenswrapper[7465]: I0320 08:36:14.508707 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02" volumeName="kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-bound-sa-token" seLinuxMountContext=""
Mar 20 08:36:14.508749 master-0 kubenswrapper[7465]: I0320 08:36:14.508727 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7" volumeName="kubernetes.io/projected/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-kube-api-access-lpdk5" seLinuxMountContext=""
Mar 20 08:36:14.508749 master-0 kubenswrapper[7465]: I0320 08:36:14.508742 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f53bc282-5937-49ac-ac98-2ee37ccb268d" volumeName="kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-config" seLinuxMountContext=""
Mar 20 08:36:14.508915 master-0 kubenswrapper[7465]: I0320 08:36:14.508758 7465 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb0fc10f-5796-4cd5-b8f5-72d678054c24" volumeName="kubernetes.io/projected/fb0fc10f-5796-4cd5-b8f5-72d678054c24-kube-api-access-lh47j" seLinuxMountContext=""
Mar 20 08:36:14.508915 master-0 kubenswrapper[7465]: I0320 08:36:14.508772 7465 reconstruct.go:97] "Volume reconstruction finished"
Mar 20 08:36:14.508915 master-0 kubenswrapper[7465]: I0320 08:36:14.508783 7465 reconciler.go:26] "Reconciler: start to sync state"
Mar 20 08:36:14.512852 master-0 kubenswrapper[7465]: I0320 08:36:14.512786 7465 reconstruct.go:205] "DevicePaths of
reconstructed volumes updated" Mar 20 08:36:14.534722 master-0 kubenswrapper[7465]: I0320 08:36:14.534528 7465 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 08:36:14.536353 master-0 kubenswrapper[7465]: I0320 08:36:14.536255 7465 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 08:36:14.536456 master-0 kubenswrapper[7465]: I0320 08:36:14.536402 7465 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 08:36:14.536522 master-0 kubenswrapper[7465]: I0320 08:36:14.536493 7465 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 08:36:14.536685 master-0 kubenswrapper[7465]: E0320 08:36:14.536594 7465 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 08:36:14.539460 master-0 kubenswrapper[7465]: I0320 08:36:14.539400 7465 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 08:36:14.548029 master-0 kubenswrapper[7465]: I0320 08:36:14.547959 7465 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="81fdbea135dce13afe4433f7d61b259980b46bfdce14d456ee42556d90e1cda4" exitCode=0 Mar 20 08:36:14.555639 master-0 kubenswrapper[7465]: I0320 08:36:14.555571 7465 generic.go:334] "Generic (PLEG): container finished" podID="fdfdabb8-83d6-4b38-a709-9e354062ba1a" containerID="ef7d3c19081b3942ae839231125bb3d9ed41e1148d63c694dd308a85f91f661c" exitCode=0 Mar 20 08:36:14.578149 master-0 kubenswrapper[7465]: I0320 08:36:14.578066 7465 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="09df5a13ce7374304f28bc120919f2392b8b1eedb768ae74aa71f1f46b1260f3" exitCode=0 Mar 20 08:36:14.578149 master-0 kubenswrapper[7465]: I0320 08:36:14.578144 7465 generic.go:334] "Generic (PLEG): container finished" 
podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="0562124cc868051528c8c76baabb685e9f641cfd32418a6cbc0b305b7b8b1525" exitCode=0 Mar 20 08:36:14.578337 master-0 kubenswrapper[7465]: I0320 08:36:14.578169 7465 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="ca1d7ca00a56b55ea93c4440e9e959ff93d3c3b08431ba60809fba320b9496a7" exitCode=0 Mar 20 08:36:14.578337 master-0 kubenswrapper[7465]: I0320 08:36:14.578228 7465 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="d7a9f93548b4324f9218b5fb15026983da36f57336679426ecdeef802c274095" exitCode=0 Mar 20 08:36:14.578337 master-0 kubenswrapper[7465]: I0320 08:36:14.578244 7465 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="2f1eef10f4235bf6943bb1062fc964d69fc5c901795041a7ddca120ef33de66d" exitCode=0 Mar 20 08:36:14.578337 master-0 kubenswrapper[7465]: I0320 08:36:14.578254 7465 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="5772594b3f3e6aae19a5e357ad1c9bc0dade5e494667c07e21d51c8697d24253" exitCode=0 Mar 20 08:36:14.591232 master-0 kubenswrapper[7465]: I0320 08:36:14.591197 7465 generic.go:334] "Generic (PLEG): container finished" podID="248a3d2f-3be4-46bf-959c-79d28736c0d6" containerID="f4a74ff585c6a7d1deca8c58f38e8ca10a816620bf09146c1f9ff9a31d89c1a7" exitCode=0 Mar 20 08:36:14.596665 master-0 kubenswrapper[7465]: I0320 08:36:14.596613 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 20 08:36:14.597296 master-0 kubenswrapper[7465]: I0320 08:36:14.597244 7465 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="96ff83659a940b2334d29c8de766dce2f56c0eddae2926cc1d3dd2e347430a94" 
exitCode=1 Mar 20 08:36:14.597296 master-0 kubenswrapper[7465]: I0320 08:36:14.597285 7465 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="9f89ba02fad31edac59462f307fd61765f540c624fde9c9ab1bb13b80a642b0c" exitCode=0 Mar 20 08:36:14.601032 master-0 kubenswrapper[7465]: I0320 08:36:14.600913 7465 generic.go:334] "Generic (PLEG): container finished" podID="412becc8-c1a7-422c-94d1-dd1849070ef1" containerID="21b9803fda84668208544ea6b68c3d3a859b684d4b97f36df7e3a02f81f34399" exitCode=0 Mar 20 08:36:14.636974 master-0 kubenswrapper[7465]: E0320 08:36:14.636903 7465 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 20 08:36:14.658062 master-0 kubenswrapper[7465]: I0320 08:36:14.658019 7465 manager.go:324] Recovery completed Mar 20 08:36:14.716312 master-0 kubenswrapper[7465]: I0320 08:36:14.716209 7465 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 08:36:14.716312 master-0 kubenswrapper[7465]: I0320 08:36:14.716283 7465 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 08:36:14.716312 master-0 kubenswrapper[7465]: I0320 08:36:14.716324 7465 state_mem.go:36] "Initialized new in-memory state store" Mar 20 08:36:14.716763 master-0 kubenswrapper[7465]: I0320 08:36:14.716726 7465 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 20 08:36:14.716763 master-0 kubenswrapper[7465]: I0320 08:36:14.716748 7465 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 20 08:36:14.716870 master-0 kubenswrapper[7465]: I0320 08:36:14.716774 7465 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Mar 20 08:36:14.716870 master-0 kubenswrapper[7465]: I0320 08:36:14.716782 7465 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Mar 20 08:36:14.716870 master-0 kubenswrapper[7465]: I0320 08:36:14.716790 7465 policy_none.go:49] "None policy: Start" Mar 20 08:36:14.719969 master-0 
kubenswrapper[7465]: I0320 08:36:14.719940 7465 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 08:36:14.719969 master-0 kubenswrapper[7465]: I0320 08:36:14.719968 7465 state_mem.go:35] "Initializing new in-memory state store" Mar 20 08:36:14.720228 master-0 kubenswrapper[7465]: I0320 08:36:14.720166 7465 state_mem.go:75] "Updated machine memory state" Mar 20 08:36:14.720228 master-0 kubenswrapper[7465]: I0320 08:36:14.720226 7465 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 20 08:36:14.734985 master-0 kubenswrapper[7465]: I0320 08:36:14.734941 7465 manager.go:334] "Starting Device Plugin manager" Mar 20 08:36:14.735135 master-0 kubenswrapper[7465]: I0320 08:36:14.735085 7465 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 08:36:14.735135 master-0 kubenswrapper[7465]: I0320 08:36:14.735110 7465 server.go:79] "Starting device plugin registration server" Mar 20 08:36:14.735902 master-0 kubenswrapper[7465]: I0320 08:36:14.735852 7465 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 08:36:14.736008 master-0 kubenswrapper[7465]: I0320 08:36:14.735874 7465 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 08:36:14.736210 master-0 kubenswrapper[7465]: I0320 08:36:14.736166 7465 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 08:36:14.736301 master-0 kubenswrapper[7465]: I0320 08:36:14.736272 7465 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 08:36:14.736301 master-0 kubenswrapper[7465]: I0320 08:36:14.736287 7465 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 08:36:14.837128 master-0 kubenswrapper[7465]: I0320 08:36:14.837035 7465 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 
08:36:14.837128 master-0 kubenswrapper[7465]: I0320 08:36:14.837067 7465 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"] Mar 20 08:36:14.837763 master-0 kubenswrapper[7465]: I0320 08:36:14.837613 7465 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="156eb04894ecec25bfbcd06dc0b87578738f576b411e8c7c85f2a4cc0e48d6f0" Mar 20 08:36:14.837763 master-0 kubenswrapper[7465]: I0320 08:36:14.837668 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"9adcc83ca09a3e8a61346c1bb76c593566cc39bfca1852854fa89f14749366d6"} Mar 20 08:36:14.837763 master-0 kubenswrapper[7465]: I0320 08:36:14.837740 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"8f140ecbb839c536b9c01be26f60184770878f8b44764513599810ad9344fdb7"} Mar 20 08:36:14.837763 master-0 kubenswrapper[7465]: I0320 08:36:14.837751 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"24e53548727bf62b2d0124119a6e8fe69dde1c3031b4a91a063f8dd58773fece"} Mar 20 08:36:14.837763 master-0 kubenswrapper[7465]: I0320 08:36:14.837763 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"21ccef18afe96346c593d227394cf1225a9a87bf9c404fb2038be61860ddf492"} Mar 20 
08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837780 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"c7ff704cef5e82a8995a139ddd4e2496d1fd9c707ed823bbd9e67f8d259c2ea7"} Mar 20 08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837794 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerDied","Data":"81fdbea135dce13afe4433f7d61b259980b46bfdce14d456ee42556d90e1cda4"} Mar 20 08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837807 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"fa276789b940e2d115bb484a83b6dacabc102c332a9c60e259aaf653e5bc53d5"} Mar 20 08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837824 7465 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ffb3901c3fd86821b7075a50ea55db7d24b5dcb1f13fa6241a8b0a2754ace0c" Mar 20 08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837867 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb"} Mar 20 08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837881 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"bfb1d352b3974d283e9d723de75463f8ee764b61d5e2eec1ee72f20516314a15"} Mar 20 08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837892 7465 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"ac50f53bc5c085a17ffb0495207fc3be5a9d394ca5e56961969c0592f7f25b0d"} Mar 20 08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837903 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"6a321a6b1056a105feaf6a80afaf577bd08c544abd875f2fd35678bf25e3bfcd"} Mar 20 08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837915 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"4c2c291535895db17dfb8bf23092bd97a1919325de4d3a5857890c5d777dcb49"} Mar 20 08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837946 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"879adddb20c3ea4126b46482343a718dc4153b404d31f5e2d5d624d657e93169"} Mar 20 08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837957 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"96ff83659a940b2334d29c8de766dce2f56c0eddae2926cc1d3dd2e347430a94"} Mar 20 08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837967 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"9f89ba02fad31edac59462f307fd61765f540c624fde9c9ab1bb13b80a642b0c"} Mar 20 08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837977 
7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"91607e86857f069937668c70d7500f8b27325b07177f03b7a7ab831b3600ac2a"} Mar 20 08:36:14.837976 master-0 kubenswrapper[7465]: I0320 08:36:14.837993 7465 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb6de0a9ae1218db5c085d7b11d80bbaa7dc058173d109d69d1996de9307a7d5" Mar 20 08:36:14.844256 master-0 kubenswrapper[7465]: I0320 08:36:14.840832 7465 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:36:14.844256 master-0 kubenswrapper[7465]: I0320 08:36:14.840918 7465 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:36:14.844256 master-0 kubenswrapper[7465]: I0320 08:36:14.840936 7465 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:36:14.844256 master-0 kubenswrapper[7465]: I0320 08:36:14.841061 7465 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 20 08:36:14.854973 master-0 kubenswrapper[7465]: E0320 08:36:14.854912 7465 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:14.855180 master-0 kubenswrapper[7465]: W0320 08:36:14.855137 7465 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set 
securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 20 08:36:14.855283 master-0 kubenswrapper[7465]: E0320 08:36:14.855260 7465 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 20 08:36:14.855373 master-0 kubenswrapper[7465]: E0320 08:36:14.855320 7465 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Mar 20 08:36:14.855431 master-0 kubenswrapper[7465]: E0320 08:36:14.855305 7465 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 20 08:36:14.856051 master-0 kubenswrapper[7465]: I0320 08:36:14.855997 7465 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Mar 20 08:36:14.856224 master-0 kubenswrapper[7465]: I0320 08:36:14.856165 7465 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 20 08:36:14.914243 master-0 kubenswrapper[7465]: I0320 08:36:14.914160 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 20 08:36:14.914243 master-0 kubenswrapper[7465]: I0320 08:36:14.914239 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:14.914588 master-0 kubenswrapper[7465]: I0320 08:36:14.914282 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:14.914588 master-0 kubenswrapper[7465]: I0320 08:36:14.914307 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:14.914588 master-0 kubenswrapper[7465]: I0320 08:36:14.914334 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 20 08:36:14.914588 master-0 kubenswrapper[7465]: I0320 08:36:14.914415 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:14.914588 master-0 kubenswrapper[7465]: I0320 
08:36:14.914560 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:14.914780 master-0 kubenswrapper[7465]: I0320 08:36:14.914622 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 20 08:36:14.914780 master-0 kubenswrapper[7465]: I0320 08:36:14.914672 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 20 08:36:14.914780 master-0 kubenswrapper[7465]: I0320 08:36:14.914707 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 20 08:36:14.914780 master-0 kubenswrapper[7465]: I0320 08:36:14.914736 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: 
\"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:14.914780 master-0 kubenswrapper[7465]: I0320 08:36:14.914768 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:14.914958 master-0 kubenswrapper[7465]: I0320 08:36:14.914886 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 20 08:36:14.915004 master-0 kubenswrapper[7465]: I0320 08:36:14.914973 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:14.915045 master-0 kubenswrapper[7465]: I0320 08:36:14.915001 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:14.915045 master-0 kubenswrapper[7465]: I0320 08:36:14.915026 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: 
\"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:14.915045 master-0 kubenswrapper[7465]: I0320 08:36:14.915049 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:14.952500 master-0 kubenswrapper[7465]: E0320 08:36:14.952431 7465 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.016035 master-0 kubenswrapper[7465]: I0320 08:36:15.015973 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 20 08:36:15.016035 master-0 kubenswrapper[7465]: I0320 08:36:15.016024 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 20 08:36:15.016035 master-0 kubenswrapper[7465]: I0320 08:36:15.016043 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: 
\"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.016439 master-0 kubenswrapper[7465]: I0320 08:36:15.016143 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 20 08:36:15.016439 master-0 kubenswrapper[7465]: I0320 08:36:15.016149 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:15.016439 master-0 kubenswrapper[7465]: I0320 08:36:15.016206 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 20 08:36:15.016439 master-0 kubenswrapper[7465]: I0320 08:36:15.016233 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 20 08:36:15.016439 master-0 kubenswrapper[7465]: I0320 08:36:15.016239 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.016582 master-0 kubenswrapper[7465]: I0320 08:36:15.016433 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:15.016582 master-0 kubenswrapper[7465]: I0320 08:36:15.016538 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:15.016645 master-0 kubenswrapper[7465]: I0320 08:36:15.016613 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:15.016645 master-0 kubenswrapper[7465]: I0320 08:36:15.016624 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:15.016645 master-0 kubenswrapper[7465]: I0320 08:36:15.016567 7465 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.016728 master-0 kubenswrapper[7465]: I0320 08:36:15.016649 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:15.016728 master-0 kubenswrapper[7465]: I0320 08:36:15.016653 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.016728 master-0 kubenswrapper[7465]: I0320 08:36:15.016681 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 20 08:36:15.016728 master-0 kubenswrapper[7465]: I0320 08:36:15.016697 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 20 08:36:15.016728 master-0 kubenswrapper[7465]: I0320 08:36:15.016681 7465 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:15.016890 master-0 kubenswrapper[7465]: I0320 08:36:15.016812 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:15.016929 master-0 kubenswrapper[7465]: I0320 08:36:15.016888 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.016929 master-0 kubenswrapper[7465]: I0320 08:36:15.016923 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.016985 master-0 kubenswrapper[7465]: I0320 08:36:15.016924 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 20 08:36:15.016985 master-0 kubenswrapper[7465]: I0320 08:36:15.016947 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:15.017042 master-0 kubenswrapper[7465]: I0320 08:36:15.016996 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:15.017042 master-0 kubenswrapper[7465]: I0320 08:36:15.017009 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.017092 master-0 kubenswrapper[7465]: I0320 08:36:15.017053 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 20 08:36:15.017092 master-0 kubenswrapper[7465]: I0320 08:36:15.017060 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.017092 master-0 kubenswrapper[7465]: I0320 08:36:15.017076 7465 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.017174 master-0 kubenswrapper[7465]: I0320 08:36:15.017092 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 20 08:36:15.017174 master-0 kubenswrapper[7465]: I0320 08:36:15.017095 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.017174 master-0 kubenswrapper[7465]: I0320 08:36:15.017156 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.017281 master-0 kubenswrapper[7465]: I0320 08:36:15.017224 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.017319 master-0 kubenswrapper[7465]: I0320 08:36:15.017275 7465 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 20 08:36:15.017362 master-0 kubenswrapper[7465]: I0320 08:36:15.017310 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 20 08:36:15.474815 master-0 kubenswrapper[7465]: I0320 08:36:15.474739 7465 apiserver.go:52] "Watching apiserver" Mar 20 08:36:15.484230 master-0 kubenswrapper[7465]: I0320 08:36:15.484167 7465 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 08:36:15.486880 master-0 kubenswrapper[7465]: I0320 08:36:15.486817 7465 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-network-diagnostics/network-check-target-xnrw6","openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7","openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28","openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h","openshift-config-operator/openshift-config-operator-95bf4f4d-25cml","openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz","openshift-marketplace/marketplace-operator-89ccd998f-mvn4t","openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l","openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj","kube-system/bootstrap-kube-scheduler-master-0","openshift-dns-operator/dns-operator-9c5679d8f-r6dm8","openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq","openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m","kube-system/bootstrap-kube-controller-manager-master-0","openshift-etcd/etcd-master-0-master-0","openshift-multus/multus-additional-cni-plugins-rpbcn","openshift-network-operator/iptables-alerter-dd9wv","openshift-network-operator/network-operator-7bd846bfc4-mt454","openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj","openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh","openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr","openshift-network-node-identity/network-node-identity-6t5vb","openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx","openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4","openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4",
"openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd","openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh","openshift-multus/network-metrics-daemon-srdjm","openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77","assisted-installer/assisted-installer-controller-w2zwp","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5","openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr","openshift-multus/multus-2fp4b","openshift-ovn-kubernetes/ovnkube-node-rxdwp","openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"] Mar 20 08:36:15.487212 master-0 kubenswrapper[7465]: I0320 08:36:15.487147 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-w2zwp" Mar 20 08:36:15.487374 master-0 kubenswrapper[7465]: I0320 08:36:15.487282 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:15.488245 master-0 kubenswrapper[7465]: I0320 08:36:15.488201 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:15.492887 master-0 kubenswrapper[7465]: I0320 08:36:15.492782 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:36:15.493000 master-0 kubenswrapper[7465]: I0320 08:36:15.492973 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:15.493080 master-0 kubenswrapper[7465]: I0320 08:36:15.493049 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:15.494297 master-0 kubenswrapper[7465]: I0320 08:36:15.494083 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 08:36:15.494297 master-0 kubenswrapper[7465]: I0320 08:36:15.494141 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 08:36:15.494297 master-0 kubenswrapper[7465]: I0320 08:36:15.494085 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 20 08:36:15.494430 master-0 kubenswrapper[7465]: I0320 08:36:15.494219 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 20 08:36:15.495295 master-0 kubenswrapper[7465]: I0320 08:36:15.494629 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:15.497866 master-0 kubenswrapper[7465]: I0320 08:36:15.497605 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 08:36:15.497866 master-0 kubenswrapper[7465]: I0320 08:36:15.497746 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 08:36:15.497866 master-0 kubenswrapper[7465]: I0320 08:36:15.497814 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 08:36:15.498270 master-0 kubenswrapper[7465]: I0320 08:36:15.498229 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 08:36:15.498334 master-0 kubenswrapper[7465]: I0320 08:36:15.498281 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:15.498427 master-0 kubenswrapper[7465]: I0320 08:36:15.498412 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 08:36:15.498702 master-0 kubenswrapper[7465]: I0320 08:36:15.498671 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:15.498763 master-0 kubenswrapper[7465]: I0320 08:36:15.498726 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 08:36:15.498817 master-0 kubenswrapper[7465]: I0320 08:36:15.498681 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:15.498987 master-0 kubenswrapper[7465]: I0320 08:36:15.498959 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:15.499054 master-0 kubenswrapper[7465]: I0320 08:36:15.498689 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:15.499095 master-0 kubenswrapper[7465]: I0320 08:36:15.499039 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:15.499176 master-0 kubenswrapper[7465]: I0320 08:36:15.498967 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 08:36:15.499240 master-0 kubenswrapper[7465]: I0320 08:36:15.499176 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 08:36:15.499240 master-0 kubenswrapper[7465]: I0320 08:36:15.499134 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 08:36:15.499390 master-0 kubenswrapper[7465]: I0320 08:36:15.499371 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 08:36:15.500752 master-0 kubenswrapper[7465]: I0320 08:36:15.500722 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 20 08:36:15.511895 master-0 kubenswrapper[7465]: I0320 08:36:15.511849 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:15.512131 master-0 kubenswrapper[7465]: I0320 08:36:15.512097 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:15.512607 master-0 kubenswrapper[7465]: I0320 08:36:15.512575 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:36:15.513918 master-0 kubenswrapper[7465]: I0320 08:36:15.513886 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 08:36:15.514637 master-0 kubenswrapper[7465]: I0320 08:36:15.514599 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 20 08:36:15.514692 master-0 kubenswrapper[7465]: I0320 08:36:15.514659 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 08:36:15.515176 master-0 kubenswrapper[7465]: I0320 08:36:15.515021 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 08:36:15.515416 master-0 kubenswrapper[7465]: I0320 08:36:15.515378 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 20 08:36:15.515513 master-0 kubenswrapper[7465]: I0320 08:36:15.515485 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 08:36:15.516270 master-0 kubenswrapper[7465]: I0320 08:36:15.516244 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 
08:36:15.516455 master-0 kubenswrapper[7465]: I0320 08:36:15.516428 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 08:36:15.516612 master-0 kubenswrapper[7465]: I0320 08:36:15.516587 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 08:36:15.516612 master-0 kubenswrapper[7465]: I0320 08:36:15.516600 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 08:36:15.516806 master-0 kubenswrapper[7465]: I0320 08:36:15.516781 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 20 08:36:15.516861 master-0 kubenswrapper[7465]: I0320 08:36:15.516838 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 20 08:36:15.517096 master-0 kubenswrapper[7465]: I0320 08:36:15.517055 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 08:36:15.517420 master-0 kubenswrapper[7465]: I0320 08:36:15.517387 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 08:36:15.517553 master-0 kubenswrapper[7465]: I0320 08:36:15.516788 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 08:36:15.517689 master-0 kubenswrapper[7465]: I0320 08:36:15.517663 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 08:36:15.517742 master-0 kubenswrapper[7465]: I0320 08:36:15.517715 7465 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 08:36:15.517888 master-0 kubenswrapper[7465]: I0320 08:36:15.517855 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 20 08:36:15.517925 master-0 kubenswrapper[7465]: I0320 08:36:15.517889 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 20 08:36:15.518029 master-0 kubenswrapper[7465]: I0320 08:36:15.517999 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 08:36:15.518060 master-0 kubenswrapper[7465]: I0320 08:36:15.518020 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 08:36:15.518109 master-0 kubenswrapper[7465]: I0320 08:36:15.518041 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 08:36:15.518303 master-0 kubenswrapper[7465]: I0320 08:36:15.518275 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 20 08:36:15.518596 master-0 kubenswrapper[7465]: I0320 08:36:15.518556 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 08:36:15.518791 master-0 kubenswrapper[7465]: I0320 08:36:15.518760 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 08:36:15.518948 master-0 kubenswrapper[7465]: I0320 08:36:15.517670 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 20 08:36:15.518948 master-0 kubenswrapper[7465]: I0320 08:36:15.518282 7465 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 08:36:15.520172 master-0 kubenswrapper[7465]: I0320 08:36:15.519913 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 08:36:15.520172 master-0 kubenswrapper[7465]: I0320 08:36:15.519925 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 08:36:15.520863 master-0 kubenswrapper[7465]: I0320 08:36:15.520822 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 08:36:15.521246 master-0 kubenswrapper[7465]: I0320 08:36:15.521208 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-multus-certs\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.521307 master-0 kubenswrapper[7465]: I0320 08:36:15.521268 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-images\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:15.521307 master-0 kubenswrapper[7465]: I0320 08:36:15.521295 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 08:36:15.521307 master-0 kubenswrapper[7465]: I0320 08:36:15.521303 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-system-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.521425 master-0 kubenswrapper[7465]: I0320 08:36:15.521373 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:36:15.521462 master-0 kubenswrapper[7465]: I0320 08:36:15.521433 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:15.521499 master-0 kubenswrapper[7465]: I0320 08:36:15.521468 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:15.521574 master-0 kubenswrapper[7465]: I0320 08:36:15.521542 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nvl4\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-kube-api-access-9nvl4\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " 
pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:15.521645 master-0 kubenswrapper[7465]: I0320 08:36:15.521613 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-client\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:15.521685 master-0 kubenswrapper[7465]: I0320 08:36:15.521647 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 20 08:36:15.521762 master-0 kubenswrapper[7465]: I0320 08:36:15.521653 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:15.521832 master-0 kubenswrapper[7465]: I0320 08:36:15.521798 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvjkr\" (UniqueName: \"kubernetes.io/projected/7c4e7e57-43be-4d31-b523-f7e4d316dce3-kube-api-access-bvjkr\") pod \"csi-snapshot-controller-operator-5f5d689c6b-4dqrh\" (UID: \"7c4e7e57-43be-4d31-b523-f7e4d316dce3\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh" Mar 20 08:36:15.521870 master-0 kubenswrapper[7465]: I0320 08:36:15.521851 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/45e8b72b-564c-4bb1-b911-baff2d6c87ad-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " 
pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:15.522074 master-0 kubenswrapper[7465]: I0320 08:36:15.521973 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-system-cni-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:36:15.522118 master-0 kubenswrapper[7465]: I0320 08:36:15.522087 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-conf-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.522199 master-0 kubenswrapper[7465]: I0320 08:36:15.522147 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68252533-bd64-4fc5-838a-cc350cbe77f0-config\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:36:15.522357 master-0 kubenswrapper[7465]: I0320 08:36:15.522317 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7k8m\" (UniqueName: \"kubernetes.io/projected/86cb5d23-df7f-4f67-8086-1789d8e68544-kube-api-access-j7k8m\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:36:15.522492 master-0 kubenswrapper[7465]: I0320 08:36:15.522460 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-75r7k\" (UniqueName: \"kubernetes.io/projected/68252533-bd64-4fc5-838a-cc350cbe77f0-kube-api-access-75r7k\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6"
Mar 20 08:36:15.522533 master-0 kubenswrapper[7465]: I0320 08:36:15.522508 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68252533-bd64-4fc5-838a-cc350cbe77f0-config\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6"
Mar 20 08:36:15.522590 master-0 kubenswrapper[7465]: I0320 08:36:15.522525 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c5ch\" (UniqueName: \"kubernetes.io/projected/b98b4efc-6117-487f-9cfc-38ce66dd9570-kube-api-access-6c5ch\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:36:15.522633 master-0 kubenswrapper[7465]: I0320 08:36:15.522611 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-os-release\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.522685 master-0 kubenswrapper[7465]: I0320 08:36:15.522660 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cni-binary-copy\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.522747 master-0 kubenswrapper[7465]: I0320 08:36:15.522696 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-bin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.522782 master-0 kubenswrapper[7465]: I0320 08:36:15.522570 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 08:36:15.522811 master-0 kubenswrapper[7465]: I0320 08:36:15.522775 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 08:36:15.522844 master-0 kubenswrapper[7465]: I0320 08:36:15.522806 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 20 08:36:15.523045 master-0 kubenswrapper[7465]: I0320 08:36:15.523018 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 20 08:36:15.523045 master-0 kubenswrapper[7465]: I0320 08:36:15.523034 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 08:36:15.523115 master-0 kubenswrapper[7465]: I0320 08:36:15.522638 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 20 08:36:15.524681 master-0 kubenswrapper[7465]: I0320 08:36:15.524616 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 20 08:36:15.524913 master-0 kubenswrapper[7465]: I0320 08:36:15.524829 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 20 08:36:15.525131 master-0 kubenswrapper[7465]: I0320 08:36:15.525078 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 20 08:36:15.525286 master-0 kubenswrapper[7465]: I0320 08:36:15.525249 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config"
Mar 20 08:36:15.525286 master-0 kubenswrapper[7465]: I0320 08:36:15.525271 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 08:36:15.525391 master-0 kubenswrapper[7465]: I0320 08:36:15.525340 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 08:36:15.525434 master-0 kubenswrapper[7465]: I0320 08:36:15.525403 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 08:36:15.525814 master-0 kubenswrapper[7465]: I0320 08:36:15.525778 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 08:36:15.525874 master-0 kubenswrapper[7465]: I0320 08:36:15.525813 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 20 08:36:15.525948 master-0 kubenswrapper[7465]: I0320 08:36:15.525920 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 20 08:36:15.526198 master-0 kubenswrapper[7465]: I0320 08:36:15.526155 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 20 08:36:15.526198 master-0 kubenswrapper[7465]: I0320 08:36:15.525127 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 08:36:15.526198 master-0 kubenswrapper[7465]: I0320 08:36:15.526178 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 08:36:15.526275 master-0 kubenswrapper[7465]: I0320 08:36:15.526257 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 20 08:36:15.526328 master-0 kubenswrapper[7465]: I0320 08:36:15.522685 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 20 08:36:15.526391 master-0 kubenswrapper[7465]: I0320 08:36:15.522750 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 08:36:15.526510 master-0 kubenswrapper[7465]: I0320 08:36:15.525922 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 20 08:36:15.526510 master-0 kubenswrapper[7465]: I0320 08:36:15.526467 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-client\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:15.526752 master-0 kubenswrapper[7465]: I0320 08:36:15.526479 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 08:36:15.526915 master-0 kubenswrapper[7465]: I0320 08:36:15.526820 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:36:15.526915 master-0 kubenswrapper[7465]: I0320 08:36:15.526685 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 08:36:15.526915 master-0 kubenswrapper[7465]: I0320 08:36:15.526912 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 20 08:36:15.527026 master-0 kubenswrapper[7465]: I0320 08:36:15.526714 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 20 08:36:15.527148 master-0 kubenswrapper[7465]: I0320 08:36:15.527061 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 20 08:36:15.527148 master-0 kubenswrapper[7465]: I0320 08:36:15.527099 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:36:15.527235 master-0 kubenswrapper[7465]: I0320 08:36:15.527157 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4291bfd-53d9-4c78-b7cb-d7eb46560528-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:36:15.527269 master-0 kubenswrapper[7465]: I0320 08:36:15.527241 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-os-release\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:36:15.527298 master-0 kubenswrapper[7465]: I0320 08:36:15.527284 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njjkt\" (UniqueName: \"kubernetes.io/projected/813f91c2-2b37-4681-968d-4217e286e22f-kube-api-access-njjkt\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm"
Mar 20 08:36:15.527382 master-0 kubenswrapper[7465]: I0320 08:36:15.527326 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:36:15.527431 master-0 kubenswrapper[7465]: I0320 08:36:15.527399 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27qvw\" (UniqueName: \"kubernetes.io/projected/bbc0b783-28d5-4554-b49d-c66082546f44-kube-api-access-27qvw\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77"
Mar 20 08:36:15.527431 master-0 kubenswrapper[7465]: I0320 08:36:15.527427 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab175f7e-a5e8-4fda-98c9-6d052a221a83-serving-cert\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:36:15.527499 master-0 kubenswrapper[7465]: I0320 08:36:15.527455 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ab175f7e-a5e8-4fda-98c9-6d052a221a83-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:36:15.527617 master-0 kubenswrapper[7465]: I0320 08:36:15.527544 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 08:36:15.527695 master-0 kubenswrapper[7465]: I0320 08:36:15.527648 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-224dg\" (UniqueName: \"kubernetes.io/projected/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-kube-api-access-224dg\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8"
Mar 20 08:36:15.527695 master-0 kubenswrapper[7465]: I0320 08:36:15.527684 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-daemon-config\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.527771 master-0 kubenswrapper[7465]: I0320 08:36:15.527711 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:36:15.527771 master-0 kubenswrapper[7465]: I0320 08:36:15.527749 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-bound-sa-token\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:36:15.527834 master-0 kubenswrapper[7465]: I0320 08:36:15.527778 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d57k\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-kube-api-access-8d57k\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:36:15.527834 master-0 kubenswrapper[7465]: I0320 08:36:15.527807 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-cnibin\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:36:15.527893 master-0 kubenswrapper[7465]: I0320 08:36:15.527841 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.527893 master-0 kubenswrapper[7465]: I0320 08:36:15.527875 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa16c3bf-2350-46d1-afa0-9477b3ec8877-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h"
Mar 20 08:36:15.528118 master-0 kubenswrapper[7465]: I0320 08:36:15.527903 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"
Mar 20 08:36:15.528170 master-0 kubenswrapper[7465]: I0320 08:36:15.528133 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zpkn\" (UniqueName: \"kubernetes.io/projected/45e8b72b-564c-4bb1-b911-baff2d6c87ad-kube-api-access-9zpkn\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"
Mar 20 08:36:15.528226 master-0 kubenswrapper[7465]: I0320 08:36:15.528168 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-serving-cert\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:15.528282 master-0 kubenswrapper[7465]: I0320 08:36:15.528256 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhkh7\" (UniqueName: \"kubernetes.io/projected/f046860d-2d54-4746-8ba2-f8e90fa55e38-kube-api-access-xhkh7\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:15.528361 master-0 kubenswrapper[7465]: I0320 08:36:15.528321 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:36:15.528361 master-0 kubenswrapper[7465]: I0320 08:36:15.528356 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a57854ac-809a-4745-aaa1-774f0a08a560-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7"
Mar 20 08:36:15.528620 master-0 kubenswrapper[7465]: I0320 08:36:15.528563 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtcnq\" (UniqueName: \"kubernetes.io/projected/df428d5a-c722-4536-8e7f-cdd85c560481-kube-api-access-dtcnq\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28"
Mar 20 08:36:15.528620 master-0 kubenswrapper[7465]: I0320 08:36:15.528606 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"
Mar 20 08:36:15.528712 master-0 kubenswrapper[7465]: I0320 08:36:15.528648 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s69rd\" (UniqueName: \"kubernetes.io/projected/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-kube-api-access-s69rd\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:36:15.529041 master-0 kubenswrapper[7465]: I0320 08:36:15.527708 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 20 08:36:15.530738 master-0 kubenswrapper[7465]: I0320 08:36:15.529426 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 08:36:15.530738 master-0 kubenswrapper[7465]: I0320 08:36:15.529588 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-serving-cert\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:15.530738 master-0 kubenswrapper[7465]: I0320 08:36:15.529783 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ab175f7e-a5e8-4fda-98c9-6d052a221a83-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:36:15.530738 master-0 kubenswrapper[7465]: I0320 08:36:15.529994 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-daemon-config\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.530738 master-0 kubenswrapper[7465]: I0320 08:36:15.530007 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 20 08:36:15.530738 master-0 kubenswrapper[7465]: I0320 08:36:15.530316 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab175f7e-a5e8-4fda-98c9-6d052a221a83-serving-cert\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:36:15.530738 master-0 kubenswrapper[7465]: I0320 08:36:15.527871 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 08:36:15.530738 master-0 kubenswrapper[7465]: I0320 08:36:15.528216 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 20 08:36:15.530738 master-0 kubenswrapper[7465]: I0320 08:36:15.528003 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 20 08:36:15.530738 master-0 kubenswrapper[7465]: I0320 08:36:15.530510 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:36:15.530738 master-0 kubenswrapper[7465]: I0320 08:36:15.530622 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 20 08:36:15.531235 master-0 kubenswrapper[7465]: I0320 08:36:15.530766 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 08:36:15.531235 master-0 kubenswrapper[7465]: I0320 08:36:15.528214 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 20 08:36:15.531235 master-0 kubenswrapper[7465]: I0320 08:36:15.531022 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 20 08:36:15.531235 master-0 kubenswrapper[7465]: I0320 08:36:15.528216 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 20 08:36:15.531235 master-0 kubenswrapper[7465]: I0320 08:36:15.531101 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 08:36:15.531235 master-0 kubenswrapper[7465]: I0320 08:36:15.528435 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 20 08:36:15.531506 master-0 kubenswrapper[7465]: I0320 08:36:15.528918 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 08:36:15.531564 master-0 kubenswrapper[7465]: I0320 08:36:15.531537 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 20 08:36:15.531692 master-0 kubenswrapper[7465]: I0320 08:36:15.531614 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 08:36:15.531933 master-0 kubenswrapper[7465]: I0320 08:36:15.531907 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-images\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:36:15.532100 master-0 kubenswrapper[7465]: I0320 08:36:15.532033 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 20 08:36:15.532263 master-0 kubenswrapper[7465]: I0320 08:36:15.532204 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 20 08:36:15.532898 master-0 kubenswrapper[7465]: I0320 08:36:15.532854 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 20 08:36:15.534125 master-0 kubenswrapper[7465]: I0320 08:36:15.534080 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dab97c35-fe60-4134-8715-a7c6dd085fb3-kube-api-access\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:36:15.534266 master-0 kubenswrapper[7465]: I0320 08:36:15.534237 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-socket-dir-parent\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.534370 master-0 kubenswrapper[7465]: I0320 08:36:15.534322 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-serving-cert\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:36:15.534566 master-0 kubenswrapper[7465]: I0320 08:36:15.534530 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-serving-cert\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:36:15.534639 master-0 kubenswrapper[7465]: I0320 08:36:15.534604 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa16c3bf-2350-46d1-afa0-9477b3ec8877-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h"
Mar 20 08:36:15.534713 master-0 kubenswrapper[7465]: I0320 08:36:15.534518 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-config\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:15.535095 master-0 kubenswrapper[7465]: I0320 08:36:15.534953 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/325f0a83-d56d-4b62-977b-088a7d5f0e00-config\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj"
Mar 20 08:36:15.535227 master-0 kubenswrapper[7465]: I0320 08:36:15.535200 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-config\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:15.535536 master-0 kubenswrapper[7465]: I0320 08:36:15.535310 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cnibin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.535658 master-0 kubenswrapper[7465]: I0320 08:36:15.535596 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:15.536024 master-0 kubenswrapper[7465]: I0320 08:36:15.535844 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:36:15.536024 master-0 kubenswrapper[7465]: I0320 08:36:15.535988 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa759777-de22-4440-a3d3-ad429a3b8e7b-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4"
Mar 20 08:36:15.536102 master-0 kubenswrapper[7465]: I0320 08:36:15.536044 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/325f0a83-d56d-4b62-977b-088a7d5f0e00-config\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj"
Mar 20 08:36:15.536150 master-0 kubenswrapper[7465]: I0320 08:36:15.536124 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ad692349-5089-4afc-85b2-9b6e7997567c-host-etc-kube\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:36:15.536764 master-0 kubenswrapper[7465]: I0320 08:36:15.536735 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2d4r\" (UniqueName: \"kubernetes.io/projected/f53bc282-5937-49ac-ac98-2ee37ccb268d-kube-api-access-b2d4r\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:15.537016 master-0 kubenswrapper[7465]: I0320 08:36:15.536974 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cni-binary-copy\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.538728 master-0 kubenswrapper[7465]: I0320 08:36:15.536927 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/86cb5d23-df7f-4f67-8086-1789d8e68544-operand-assets\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742"
Mar 20 08:36:15.538795 master-0 kubenswrapper[7465]: I0320 08:36:15.538766 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-images\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:15.538831 master-0 kubenswrapper[7465]: I0320 08:36:15.538805 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-netns\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.538861 master-0 kubenswrapper[7465]: I0320 08:36:15.538832 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:36:15.538914 master-0 kubenswrapper[7465]: I0320 08:36:15.538893 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57854ac-809a-4745-aaa1-774f0a08a560-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7"
Mar 20 08:36:15.538954 master-0 kubenswrapper[7465]: I0320 08:36:15.538924 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:36:15.538986 master-0 kubenswrapper[7465]: I0320 08:36:15.538951 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-multus\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.538986 master-0 kubenswrapper[7465]: I0320 08:36:15.538978 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5"
Mar 20 08:36:15.539043 master-0 kubenswrapper[7465]: I0320 08:36:15.539007 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqdlf\" (UniqueName: \"kubernetes.io/projected/325f0a83-d56d-4b62-977b-088a7d5f0e00-kube-api-access-lqdlf\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj"
Mar 20 08:36:15.539072 master-0 kubenswrapper[7465]: I0320 08:36:15.539032 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/325f0a83-d56d-4b62-977b-088a7d5f0e00-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj"
Mar 20 08:36:15.539100 master-0 kubenswrapper[7465]: I0320 08:36:15.539070 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-etc-kubernetes\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.539131 master-0 kubenswrapper[7465]: I0320 08:36:15.539118 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:36:15.539179 master-0 kubenswrapper[7465]: I0320 08:36:15.539142 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa759777-de22-4440-a3d3-ad429a3b8e7b-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4"
Mar 20 08:36:15.539230 master-0 kubenswrapper[7465]: I0320 08:36:15.539201 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:36:15.539230 master-0 kubenswrapper[7465]: I0320 08:36:15.539225 7465 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-wrws5\" (UniqueName: \"kubernetes.io/projected/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-kube-api-access-wrws5\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:15.539282 master-0 kubenswrapper[7465]: I0320 08:36:15.539246 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6mb5\" (UniqueName: \"kubernetes.io/projected/ad692349-5089-4afc-85b2-9b6e7997567c-kube-api-access-h6mb5\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" Mar 20 08:36:15.539282 master-0 kubenswrapper[7465]: I0320 08:36:15.539271 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68252533-bd64-4fc5-838a-cc350cbe77f0-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:36:15.539365 master-0 kubenswrapper[7465]: I0320 08:36:15.539294 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75e3e2cc-aa56-41f3-8859-1c086f419d05-serving-cert\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:15.539365 master-0 kubenswrapper[7465]: I0320 08:36:15.539322 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab97c35-fe60-4134-8715-a7c6dd085fb3-service-ca\") pod 
\"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:36:15.539424 master-0 kubenswrapper[7465]: I0320 08:36:15.539365 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:15.539424 master-0 kubenswrapper[7465]: I0320 08:36:15.539397 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clz6w\" (UniqueName: \"kubernetes.io/projected/75e3e2cc-aa56-41f3-8859-1c086f419d05-kube-api-access-clz6w\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:15.539424 master-0 kubenswrapper[7465]: I0320 08:36:15.539420 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:36:15.539507 master-0 kubenswrapper[7465]: I0320 08:36:15.539444 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvx6w\" (UniqueName: \"kubernetes.io/projected/acb704a9-6c8d-4378-ae93-e7095b1fce85-kube-api-access-xvx6w\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:15.539507 master-0 
kubenswrapper[7465]: I0320 08:36:15.539467 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-config\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" Mar 20 08:36:15.539507 master-0 kubenswrapper[7465]: I0320 08:36:15.539485 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmrxh\" (UniqueName: \"kubernetes.io/projected/ab175f7e-a5e8-4fda-98c9-6d052a221a83-kube-api-access-zmrxh\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:36:15.539507 master-0 kubenswrapper[7465]: I0320 08:36:15.539507 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57854ac-809a-4745-aaa1-774f0a08a560-config\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:36:15.539615 master-0 kubenswrapper[7465]: I0320 08:36:15.539527 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:15.539615 master-0 kubenswrapper[7465]: I0320 08:36:15.539548 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-ca\") 
pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:15.539615 master-0 kubenswrapper[7465]: I0320 08:36:15.539567 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/86cb5d23-df7f-4f67-8086-1789d8e68544-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:36:15.539615 master-0 kubenswrapper[7465]: I0320 08:36:15.539586 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:36:15.539615 master-0 kubenswrapper[7465]: I0320 08:36:15.539605 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-k8s-cni-cncf-io\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.539802 master-0 kubenswrapper[7465]: I0320 08:36:15.539622 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxg4z\" (UniqueName: \"kubernetes.io/projected/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-kube-api-access-fxg4z\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.539802 master-0 kubenswrapper[7465]: I0320 08:36:15.539645 7465 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pg6th\" (UniqueName: \"kubernetes.io/projected/7b489385-2c96-4a97-8b31-362162de020e-kube-api-access-pg6th\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:15.539802 master-0 kubenswrapper[7465]: I0320 08:36:15.539685 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-kubelet\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.539802 master-0 kubenswrapper[7465]: I0320 08:36:15.539704 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:15.539802 master-0 kubenswrapper[7465]: I0320 08:36:15.539727 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-trusted-ca\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:15.539802 master-0 kubenswrapper[7465]: I0320 08:36:15.539746 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: 
\"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:15.539802 master-0 kubenswrapper[7465]: I0320 08:36:15.539779 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-binary-copy\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:36:15.540003 master-0 kubenswrapper[7465]: I0320 08:36:15.539823 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:36:15.540003 master-0 kubenswrapper[7465]: I0320 08:36:15.539847 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-config\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:15.540003 master-0 kubenswrapper[7465]: I0320 08:36:15.539866 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:15.540003 master-0 kubenswrapper[7465]: I0320 08:36:15.539890 7465 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:36:15.540003 master-0 kubenswrapper[7465]: I0320 08:36:15.539940 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ad692349-5089-4afc-85b2-9b6e7997567c-metrics-tls\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" Mar 20 08:36:15.540003 master-0 kubenswrapper[7465]: I0320 08:36:15.539963 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:15.540003 master-0 kubenswrapper[7465]: I0320 08:36:15.540000 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-hostroot\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.540205 master-0 kubenswrapper[7465]: I0320 08:36:15.540031 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: 
\"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:15.540205 master-0 kubenswrapper[7465]: I0320 08:36:15.540081 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmndg\" (UniqueName: \"kubernetes.io/projected/aa16c3bf-2350-46d1-afa0-9477b3ec8877-kube-api-access-qmndg\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" Mar 20 08:36:15.540205 master-0 kubenswrapper[7465]: I0320 08:36:15.540112 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btdjt\" (UniqueName: \"kubernetes.io/projected/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-kube-api-access-btdjt\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:36:15.540205 master-0 kubenswrapper[7465]: I0320 08:36:15.540132 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:15.540205 master-0 kubenswrapper[7465]: I0320 08:36:15.540171 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e3e2cc-aa56-41f3-8859-1c086f419d05-config\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:15.540337 master-0 
kubenswrapper[7465]: I0320 08:36:15.540217 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:15.540337 master-0 kubenswrapper[7465]: I0320 08:36:15.540240 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpdk5\" (UniqueName: \"kubernetes.io/projected/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-kube-api-access-lpdk5\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:15.540337 master-0 kubenswrapper[7465]: I0320 08:36:15.540258 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-config\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:15.540337 master-0 kubenswrapper[7465]: I0320 08:36:15.540304 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa759777-de22-4440-a3d3-ad429a3b8e7b-config\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.540507 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a57854ac-809a-4745-aaa1-774f0a08a560-config\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.540517 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-ca\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.540801 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.540879 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-images\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.540935 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-config\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " 
pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.538713 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/86cb5d23-df7f-4f67-8086-1789d8e68544-operand-assets\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.541445 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.541470 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-binary-copy\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.541511 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/325f0a83-d56d-4b62-977b-088a7d5f0e00-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.541736 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/ad692349-5089-4afc-85b2-9b6e7997567c-metrics-tls\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.541911 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.541926 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-config\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.541946 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:15.542286 master-0 kubenswrapper[7465]: I0320 08:36:15.542149 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa759777-de22-4440-a3d3-ad429a3b8e7b-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:36:15.542286 
master-0 kubenswrapper[7465]: I0320 08:36:15.542301 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-config\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:15.542748 master-0 kubenswrapper[7465]: I0320 08:36:15.542411 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75e3e2cc-aa56-41f3-8859-1c086f419d05-serving-cert\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:15.542748 master-0 kubenswrapper[7465]: I0320 08:36:15.542572 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57854ac-809a-4745-aaa1-774f0a08a560-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:36:15.542748 master-0 kubenswrapper[7465]: I0320 08:36:15.542598 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:36:15.542748 master-0 kubenswrapper[7465]: I0320 08:36:15.542608 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68252533-bd64-4fc5-838a-cc350cbe77f0-serving-cert\") pod 
\"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6"
Mar 20 08:36:15.542855 master-0 kubenswrapper[7465]: I0320 08:36:15.542815 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab97c35-fe60-4134-8715-a7c6dd085fb3-service-ca\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:36:15.542855 master-0 kubenswrapper[7465]: I0320 08:36:15.542842 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:36:15.542855 master-0 kubenswrapper[7465]: I0320 08:36:15.542828 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/45e8b72b-564c-4bb1-b911-baff2d6c87ad-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"
Mar 20 08:36:15.542971 master-0 kubenswrapper[7465]: I0320 08:36:15.542936 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5"
Mar 20 08:36:15.543009 master-0 kubenswrapper[7465]: I0320 08:36:15.542971 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa759777-de22-4440-a3d3-ad429a3b8e7b-config\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4"
Mar 20 08:36:15.543093 master-0 kubenswrapper[7465]: I0320 08:36:15.543049 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/86cb5d23-df7f-4f67-8086-1789d8e68544-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742"
Mar 20 08:36:15.543389 master-0 kubenswrapper[7465]: I0320 08:36:15.543336 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e3e2cc-aa56-41f3-8859-1c086f419d05-config\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h"
Mar 20 08:36:15.543521 master-0 kubenswrapper[7465]: I0320 08:36:15.543491 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:36:15.547058 master-0 kubenswrapper[7465]: I0320 08:36:15.547023 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 08:36:15.547557 master-0 kubenswrapper[7465]: I0320 08:36:15.547529 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 08:36:15.548426 master-0 kubenswrapper[7465]: I0320 08:36:15.548397 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 20 08:36:15.549105 master-0 kubenswrapper[7465]: I0320 08:36:15.549077 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 20 08:36:15.549975 master-0 kubenswrapper[7465]: I0320 08:36:15.549939 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 20 08:36:15.550873 master-0 kubenswrapper[7465]: I0320 08:36:15.550851 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:36:15.551956 master-0 kubenswrapper[7465]: I0320 08:36:15.551904 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-trusted-ca\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:36:15.552677 master-0 kubenswrapper[7465]: I0320 08:36:15.552638 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4291bfd-53d9-4c78-b7cb-d7eb46560528-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:36:15.555423 master-0 kubenswrapper[7465]: I0320 08:36:15.555380 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"
Mar 20 08:36:15.557454 master-0 kubenswrapper[7465]: I0320 08:36:15.557302 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 08:36:15.576389 master-0 kubenswrapper[7465]: I0320 08:36:15.576339 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 20 08:36:15.596821 master-0 kubenswrapper[7465]: I0320 08:36:15.596779 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 20 08:36:15.601565 master-0 kubenswrapper[7465]: I0320 08:36:15.601542 7465 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 20 08:36:15.617581 master-0 kubenswrapper[7465]: I0320 08:36:15.617510 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 08:36:15.637656 master-0 kubenswrapper[7465]: I0320 08:36:15.637605 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 08:36:15.641096 master-0 kubenswrapper[7465]: I0320 08:36:15.641062 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-socket-dir-parent\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.641214 master-0 kubenswrapper[7465]: I0320 08:36:15.641113 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-log-socket\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:36:15.641214 master-0 kubenswrapper[7465]: I0320 08:36:15.641141 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:15.641214 master-0 kubenswrapper[7465]: I0320 08:36:15.641166 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:36:15.641214 master-0 kubenswrapper[7465]: I0320 08:36:15.641196 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cnibin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.641214 master-0 kubenswrapper[7465]: I0320 08:36:15.641215 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-env-overrides\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj"
Mar 20 08:36:15.641376 master-0 kubenswrapper[7465]: I0320 08:36:15.641249 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ad692349-5089-4afc-85b2-9b6e7997567c-host-etc-kube\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:36:15.641376 master-0 kubenswrapper[7465]: I0320 08:36:15.641278 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-slash\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:36:15.641376 master-0 kubenswrapper[7465]: I0320 08:36:15.641297 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:36:15.641376 master-0 kubenswrapper[7465]: I0320 08:36:15.641317 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-netns\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.641376 master-0 kubenswrapper[7465]: I0320 08:36:15.641334 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2zzd\" (UniqueName: \"kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd\") pod \"network-check-target-xnrw6\" (UID: \"c0142d4e-9fd4-4375-a773-bb89b38af654\") " pod="openshift-network-diagnostics/network-check-target-xnrw6"
Mar 20 08:36:15.641376 master-0 kubenswrapper[7465]: I0320 08:36:15.641361 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-multus\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.641376 master-0 kubenswrapper[7465]: I0320 08:36:15.641377 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:36:15.641554 master-0 kubenswrapper[7465]: I0320 08:36:15.641399 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:36:15.641554 master-0 kubenswrapper[7465]: I0320 08:36:15.641422 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-etc-kubernetes\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.641554 master-0 kubenswrapper[7465]: I0320 08:36:15.641461 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:36:15.641554 master-0 kubenswrapper[7465]: I0320 08:36:15.641486 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l"
Mar 20 08:36:15.641554 master-0 kubenswrapper[7465]: I0320 08:36:15.641517 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"
Mar 20 08:36:15.641554 master-0 kubenswrapper[7465]: I0320 08:36:15.641540 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj"
Mar 20 08:36:15.641706 master-0 kubenswrapper[7465]: I0320 08:36:15.641560 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:15.641706 master-0 kubenswrapper[7465]: I0320 08:36:15.641587 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-env-overrides\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:36:15.641706 master-0 kubenswrapper[7465]: I0320 08:36:15.641613 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-bin\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:36:15.641706 master-0 kubenswrapper[7465]: I0320 08:36:15.641636 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:36:15.641706 master-0 kubenswrapper[7465]: I0320 08:36:15.641654 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-k8s-cni-cncf-io\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.641706 master-0 kubenswrapper[7465]: I0320 08:36:15.641674 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:36:15.641706 master-0 kubenswrapper[7465]: I0320 08:36:15.641699 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-kubelet\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.641891 master-0 kubenswrapper[7465]: I0320 08:36:15.641718 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb0fc10f-5796-4cd5-b8f5-72d678054c24-webhook-cert\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb"
Mar 20 08:36:15.641891 master-0 kubenswrapper[7465]: I0320 08:36:15.641738 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqd2v\" (UniqueName: \"kubernetes.io/projected/6a80bd6f-2263-4251-8197-5173193f8afc-kube-api-access-tqd2v\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"
Mar 20 08:36:15.641891 master-0 kubenswrapper[7465]: I0320 08:36:15.641757 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"
Mar 20 08:36:15.641891 master-0 kubenswrapper[7465]: I0320 08:36:15.641775 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-etc-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:36:15.641891 master-0 kubenswrapper[7465]: I0320 08:36:15.641795 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-hostroot\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.641891 master-0 kubenswrapper[7465]: I0320 08:36:15.641814 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l4b7\" (UniqueName: \"kubernetes.io/projected/ee3cc021-67d8-4b7f-b443-16f18228712e-kube-api-access-7l4b7\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv"
Mar 20 08:36:15.641891 master-0 kubenswrapper[7465]: I0320 08:36:15.641840 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-systemd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:36:15.641891 master-0 kubenswrapper[7465]: I0320 08:36:15.641860 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77"
Mar 20 08:36:15.641891 master-0 kubenswrapper[7465]: I0320 08:36:15.641884 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.641908 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtc2p\" (UniqueName: \"kubernetes.io/projected/248a3d2f-3be4-46bf-959c-79d28736c0d6-kube-api-access-mtc2p\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.641929 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd6vv\" (UniqueName: \"kubernetes.io/projected/5e3ddf9e-eeb5-4266-b675-092fd4e27623-kube-api-access-xd6vv\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.641963 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-multus-certs\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.641982 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-ovnkube-identity-cm\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642003 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-config\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642024 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642132 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642344 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ad692349-5089-4afc-85b2-9b6e7997567c-host-etc-kube\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642346 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642452 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-hostroot\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642503 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642510 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-multus-certs\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642567 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cnibin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642586 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-netns\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642380 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-k8s-cni-cncf-io\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642612 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-etc-kubernetes\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.642510 7465 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.642542 7465 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.642512 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.642564 7465 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642629 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-system-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.642681 7465 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.642689 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642756 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-socket-dir-parent\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.642789 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.642848 7465 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.642852 7465 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.642882 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert podName:7b489385-2c96-4a97-8b31-362162de020e nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.142850165 +0000 UTC m=+1.786165655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert") pod "olm-operator-5c9796789-tjm9l" (UID: "7b489385-2c96-4a97-8b31-362162de020e") : secret "olm-operator-serving-cert" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642880 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-system-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.642915 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-kubelet\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.642918 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics podName:acb704a9-6c8d-4378-ae93-e7095b1fce85 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.142894576 +0000 UTC m=+1.786210066 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-mvn4t" (UID: "acb704a9-6c8d-4378-ae93-e7095b1fce85") : secret "marketplace-operator-metrics" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.643023 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.142952808 +0000 UTC m=+1.786268298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-operator-tls" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.643032 7465 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.643030 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-multus\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.643097 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.643049 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert podName:dab97c35-fe60-4134-8715-a7c6dd085fb3 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.14303919 +0000 UTC m=+1.786354680 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert") pod "cluster-version-operator-56d8475767-9gfkg" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3") : secret "cluster-version-operator-serving-cert" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.643211 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert podName:bbc0b783-28d5-4554-b49d-c66082546f44 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.143201265 +0000 UTC m=+1.786516755 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-2pg77" (UID: "bbc0b783-28d5-4554-b49d-c66082546f44") : secret "package-server-manager-serving-cert" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.643229 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.143221706 +0000 UTC m=+1.786537196 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.643245 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls podName:45e8b72b-564c-4bb1-b911-baff2d6c87ad nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.143238976 +0000 UTC m=+1.786554466 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-nljsr" (UID: "45e8b72b-564c-4bb1-b911-baff2d6c87ad") : secret "cluster-monitoring-operator-tls" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.643263 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert podName:df428d5a-c722-4536-8e7f-cdd85c560481 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.143255397 +0000 UTC m=+1.786570887 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert") pod "catalog-operator-68f85b4d6c-fzm28" (UID: "df428d5a-c722-4536-8e7f-cdd85c560481") : secret "catalog-operator-serving-cert" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: E0320 08:36:15.643276 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls podName:ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.143270807 +0000 UTC m=+1.786586297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls") pod "ingress-operator-66b84d69b-gzg9m" (UID: "ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02") : secret "metrics-tls" not found
Mar 20 08:36:15.643230 master-0 kubenswrapper[7465]: I0320 08:36:15.643299 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8"
Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643327 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm"
Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: E0320 08:36:15.643417 7465 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: E0320 08:36:15.643448 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.143432482 +0000 UTC m=+1.786747972 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "performance-addon-operator-webhook-cert" not found Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: E0320 08:36:15.643481 7465 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: E0320 08:36:15.643504 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.143497774 +0000 UTC m=+1.786813264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : secret "metrics-daemon-secret" not found Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: E0320 08:36:15.643523 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls podName:3f471ecc-922c-4cb1-9bdd-fdb5da08c592 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.143518034 +0000 UTC m=+1.786833514 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls") pod "dns-operator-9c5679d8f-r6dm8" (UID: "3f471ecc-922c-4cb1-9bdd-fdb5da08c592") : secret "metrics-tls" not found Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643555 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-var-lib-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643582 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-conf-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643606 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643643 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-system-cni-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: 
I0320 08:36:15.643674 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-conf-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643682 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh47j\" (UniqueName: \"kubernetes.io/projected/fb0fc10f-5796-4cd5-b8f5-72d678054c24-kube-api-access-lh47j\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643715 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-bin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643741 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643783 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-system-cni-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " 
pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643790 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-os-release\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643831 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ee3cc021-67d8-4b7f-b443-16f18228712e-iptables-alerter-script\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643857 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-netd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: E0320 08:36:15.643858 7465 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: E0320 08:36:15.643899 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls podName:b4291bfd-53d9-4c78-b7cb-d7eb46560528 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.143891215 +0000 UTC m=+1.787206705 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-fmhbq" (UID: "b4291bfd-53d9-4c78-b7cb-d7eb46560528") : secret "image-registry-operator-tls" not found Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643914 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-systemd-units\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643933 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-bin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643936 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-netns\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.643971 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-ovn\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 
08:36:15.644009 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-os-release\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.644054 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-node-log\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.644204 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ee3cc021-67d8-4b7f-b443-16f18228712e-iptables-alerter-script\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.644407 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-os-release\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.644454 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovn-node-metrics-cert\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 
08:36:15.644501 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-os-release\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.644578 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee3cc021-67d8-4b7f-b443-16f18228712e-host-slash\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.644722 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-env-overrides\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.644760 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-kubelet\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.644798 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.644844 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-cnibin\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: E0320 08:36:15.644892 7465 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: E0320 08:36:15.644920 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.144911885 +0000 UTC m=+1.788227385 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "node-tuning-operator-tls" not found Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.644946 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.644970 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-cnibin\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.645016 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.645038 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-script-lib\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.645062 7465 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: E0320 08:36:15.645173 7465 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: E0320 08:36:15.645208 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls podName:42df77ec-94aa-48ba-bb35-7b1f1e8b8e97 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.145201753 +0000 UTC m=+1.788517243 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls") pod "machine-config-operator-84d549f6d5-gm4qr" (UID: "42df77ec-94aa-48ba-bb35-7b1f1e8b8e97") : secret "mco-proxy-tls" not found Mar 20 08:36:15.645255 master-0 kubenswrapper[7465]: I0320 08:36:15.645251 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:15.657793 master-0 kubenswrapper[7465]: I0320 08:36:15.657721 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 08:36:15.666878 master-0 kubenswrapper[7465]: I0320 08:36:15.666817 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-script-lib\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.681701 master-0 kubenswrapper[7465]: I0320 08:36:15.681627 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 08:36:15.685512 master-0 kubenswrapper[7465]: I0320 08:36:15.685407 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:36:15.699742 master-0 kubenswrapper[7465]: I0320 08:36:15.699009 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 08:36:15.702857 master-0 kubenswrapper[7465]: I0320 08:36:15.702787 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:36:15.703739 master-0 kubenswrapper[7465]: I0320 08:36:15.703671 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-config\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.717663 master-0 kubenswrapper[7465]: I0320 08:36:15.717602 7465 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 08:36:15.723838 master-0 kubenswrapper[7465]: I0320 08:36:15.723766 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-env-overrides\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.723838 master-0 kubenswrapper[7465]: I0320 08:36:15.723824 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-env-overrides\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:36:15.738769 master-0 kubenswrapper[7465]: I0320 08:36:15.738629 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 08:36:15.746472 master-0 kubenswrapper[7465]: I0320 08:36:15.746408 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-slash\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.746658 master-0 kubenswrapper[7465]: I0320 08:36:15.746489 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2zzd\" (UniqueName: \"kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd\") pod \"network-check-target-xnrw6\" (UID: \"c0142d4e-9fd4-4375-a773-bb89b38af654\") " pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:36:15.746921 master-0 kubenswrapper[7465]: I0320 08:36:15.746868 7465 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.746981 master-0 kubenswrapper[7465]: I0320 08:36:15.746912 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-slash\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.746981 master-0 kubenswrapper[7465]: I0320 08:36:15.746920 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.746981 master-0 kubenswrapper[7465]: I0320 08:36:15.746964 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.747091 master-0 kubenswrapper[7465]: I0320 08:36:15.747070 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 
08:36:15.747153 master-0 kubenswrapper[7465]: I0320 08:36:15.747135 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:15.747255 master-0 kubenswrapper[7465]: I0320 08:36:15.747223 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-bin\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.747322 master-0 kubenswrapper[7465]: I0320 08:36:15.747301 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-bin\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.747450 master-0 kubenswrapper[7465]: E0320 08:36:15.747409 7465 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 20 08:36:15.747497 master-0 kubenswrapper[7465]: I0320 08:36:15.747451 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-etc-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.747538 master-0 kubenswrapper[7465]: I0320 08:36:15.747494 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-etc-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.747538 master-0 kubenswrapper[7465]: E0320 08:36:15.747502 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs podName:6a80bd6f-2263-4251-8197-5173193f8afc nodeName:}" failed. No retries permitted until 2026-03-20 08:36:16.247471968 +0000 UTC m=+1.890787648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-5rrrh" (UID: "6a80bd6f-2263-4251-8197-5173193f8afc") : secret "multus-admission-controller-secret" not found Mar 20 08:36:15.747597 master-0 kubenswrapper[7465]: I0320 08:36:15.747552 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-systemd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.747634 master-0 kubenswrapper[7465]: I0320 08:36:15.747617 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-systemd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.747728 master-0 kubenswrapper[7465]: I0320 08:36:15.747675 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-var-lib-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: 
\"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.747806 master-0 kubenswrapper[7465]: I0320 08:36:15.747788 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-netd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.747837 master-0 kubenswrapper[7465]: I0320 08:36:15.747816 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-systemd-units\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.747864 master-0 kubenswrapper[7465]: I0320 08:36:15.747838 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-netns\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.747900 master-0 kubenswrapper[7465]: I0320 08:36:15.747861 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-ovn\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.747927 master-0 kubenswrapper[7465]: I0320 08:36:15.747907 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-node-log\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.747955 master-0 kubenswrapper[7465]: I0320 08:36:15.747937 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee3cc021-67d8-4b7f-b443-16f18228712e-host-slash\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:36:15.748000 master-0 kubenswrapper[7465]: I0320 08:36:15.747982 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-kubelet\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.748062 master-0 kubenswrapper[7465]: I0320 08:36:15.748042 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.748585 master-0 kubenswrapper[7465]: I0320 08:36:15.748113 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-log-socket\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.748585 master-0 kubenswrapper[7465]: I0320 08:36:15.748216 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-log-socket\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.748585 master-0 kubenswrapper[7465]: I0320 08:36:15.748252 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-var-lib-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.748585 master-0 kubenswrapper[7465]: I0320 08:36:15.748306 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-ovn\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.748585 master-0 kubenswrapper[7465]: I0320 08:36:15.748345 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-netns\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.748585 master-0 kubenswrapper[7465]: I0320 08:36:15.748344 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-node-log\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.748585 master-0 kubenswrapper[7465]: I0320 08:36:15.748374 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.748585 
master-0 kubenswrapper[7465]: I0320 08:36:15.748364 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-netd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.748585 master-0 kubenswrapper[7465]: I0320 08:36:15.748391 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-kubelet\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.748585 master-0 kubenswrapper[7465]: I0320 08:36:15.748418 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee3cc021-67d8-4b7f-b443-16f18228712e-host-slash\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:36:15.748585 master-0 kubenswrapper[7465]: I0320 08:36:15.748428 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-systemd-units\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.757850 master-0 kubenswrapper[7465]: I0320 08:36:15.757805 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 08:36:15.776967 master-0 kubenswrapper[7465]: I0320 08:36:15.776919 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 08:36:15.783028 master-0 kubenswrapper[7465]: I0320 
08:36:15.782986 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb0fc10f-5796-4cd5-b8f5-72d678054c24-webhook-cert\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:36:15.797037 master-0 kubenswrapper[7465]: I0320 08:36:15.797003 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 08:36:15.805322 master-0 kubenswrapper[7465]: I0320 08:36:15.805280 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-env-overrides\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:36:15.817173 master-0 kubenswrapper[7465]: I0320 08:36:15.817147 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 08:36:15.823681 master-0 kubenswrapper[7465]: I0320 08:36:15.823640 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-ovnkube-identity-cm\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:36:15.837319 master-0 kubenswrapper[7465]: I0320 08:36:15.837278 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 08:36:15.856371 master-0 kubenswrapper[7465]: I0320 08:36:15.856332 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 08:36:15.865772 master-0 
kubenswrapper[7465]: I0320 08:36:15.865743 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovn-node-metrics-cert\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:15.888504 master-0 kubenswrapper[7465]: I0320 08:36:15.888434 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nvl4\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-kube-api-access-9nvl4\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:15.901452 master-0 kubenswrapper[7465]: I0320 08:36:15.901382 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:36:15.909485 master-0 kubenswrapper[7465]: I0320 08:36:15.909265 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvjkr\" (UniqueName: \"kubernetes.io/projected/7c4e7e57-43be-4d31-b523-f7e4d316dce3-kube-api-access-bvjkr\") pod \"csi-snapshot-controller-operator-5f5d689c6b-4dqrh\" (UID: \"7c4e7e57-43be-4d31-b523-f7e4d316dce3\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh" Mar 20 08:36:15.934836 master-0 kubenswrapper[7465]: I0320 08:36:15.934762 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7k8m\" (UniqueName: \"kubernetes.io/projected/86cb5d23-df7f-4f67-8086-1789d8e68544-kube-api-access-j7k8m\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:36:15.952732 master-0 
kubenswrapper[7465]: I0320 08:36:15.952660 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c5ch\" (UniqueName: \"kubernetes.io/projected/b98b4efc-6117-487f-9cfc-38ce66dd9570-kube-api-access-6c5ch\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:36:15.969594 master-0 kubenswrapper[7465]: I0320 08:36:15.969529 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75r7k\" (UniqueName: \"kubernetes.io/projected/68252533-bd64-4fc5-838a-cc350cbe77f0-kube-api-access-75r7k\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:36:16.009727 master-0 kubenswrapper[7465]: I0320 08:36:16.009557 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s69rd\" (UniqueName: \"kubernetes.io/projected/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-kube-api-access-s69rd\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:36:16.030450 master-0 kubenswrapper[7465]: I0320 08:36:16.030405 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27qvw\" (UniqueName: \"kubernetes.io/projected/bbc0b783-28d5-4554-b49d-c66082546f44-kube-api-access-27qvw\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:16.049468 master-0 kubenswrapper[7465]: I0320 08:36:16.049425 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" Mar 20 08:36:16.068843 master-0 kubenswrapper[7465]: I0320 08:36:16.068806 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-224dg\" (UniqueName: \"kubernetes.io/projected/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-kube-api-access-224dg\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:16.088569 master-0 kubenswrapper[7465]: I0320 08:36:16.088517 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zpkn\" (UniqueName: \"kubernetes.io/projected/45e8b72b-564c-4bb1-b911-baff2d6c87ad-kube-api-access-9zpkn\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:16.090389 master-0 kubenswrapper[7465]: I0320 08:36:16.090208 7465 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:36:16.106654 master-0 kubenswrapper[7465]: I0320 08:36:16.106616 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-bound-sa-token\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:16.144939 master-0 kubenswrapper[7465]: I0320 08:36:16.144887 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhkh7\" (UniqueName: \"kubernetes.io/projected/f046860d-2d54-4746-8ba2-f8e90fa55e38-kube-api-access-xhkh7\") 
pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:36:16.154081 master-0 kubenswrapper[7465]: I0320 08:36:16.154030 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a57854ac-809a-4745-aaa1-774f0a08a560-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:36:16.155709 master-0 kubenswrapper[7465]: I0320 08:36:16.155667 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:16.155946 master-0 kubenswrapper[7465]: I0320 08:36:16.155909 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:36:16.156057 master-0 kubenswrapper[7465]: I0320 08:36:16.156026 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:16.156227 master-0 kubenswrapper[7465]: E0320 08:36:16.156074 7465 secret.go:189] Couldn't get secret 
openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 20 08:36:16.156271 master-0 kubenswrapper[7465]: E0320 08:36:16.156260 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert podName:dab97c35-fe60-4134-8715-a7c6dd085fb3 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.156246407 +0000 UTC m=+2.799561897 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert") pod "cluster-version-operator-56d8475767-9gfkg" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3") : secret "cluster-version-operator-serving-cert" not found Mar 20 08:36:16.156332 master-0 kubenswrapper[7465]: E0320 08:36:16.156129 7465 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:16.156526 master-0 kubenswrapper[7465]: E0320 08:36:16.156440 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.156401212 +0000 UTC m=+2.799716702 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:16.156796 master-0 kubenswrapper[7465]: I0320 08:36:16.156770 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:16.156847 master-0 kubenswrapper[7465]: I0320 08:36:16.156831 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:16.156967 master-0 kubenswrapper[7465]: E0320 08:36:16.156939 7465 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 20 08:36:16.157011 master-0 kubenswrapper[7465]: E0320 08:36:16.156980 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls podName:ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.156971888 +0000 UTC m=+2.800287368 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls") pod "ingress-operator-66b84d69b-gzg9m" (UID: "ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02") : secret "metrics-tls" not found Mar 20 08:36:16.157073 master-0 kubenswrapper[7465]: I0320 08:36:16.157050 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:16.157125 master-0 kubenswrapper[7465]: I0320 08:36:16.157104 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:16.157125 master-0 kubenswrapper[7465]: E0320 08:36:16.157115 7465 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:16.157233 master-0 kubenswrapper[7465]: I0320 08:36:16.157209 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:16.157266 master-0 kubenswrapper[7465]: E0320 08:36:16.157245 7465 secret.go:189] Couldn't get secret 
openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 20 08:36:16.157332 master-0 kubenswrapper[7465]: E0320 08:36:16.157276 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert podName:bbc0b783-28d5-4554-b49d-c66082546f44 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.157264247 +0000 UTC m=+2.800579737 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-2pg77" (UID: "bbc0b783-28d5-4554-b49d-c66082546f44") : secret "package-server-manager-serving-cert" not found Mar 20 08:36:16.157332 master-0 kubenswrapper[7465]: I0320 08:36:16.157245 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:16.157332 master-0 kubenswrapper[7465]: E0320 08:36:16.157312 7465 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 20 08:36:16.157446 master-0 kubenswrapper[7465]: I0320 08:36:16.157352 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:16.157446 master-0 kubenswrapper[7465]: E0320 08:36:16.157386 7465 secret.go:189] Couldn't get secret 
openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 20 08:36:16.157446 master-0 kubenswrapper[7465]: E0320 08:36:16.157410 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert podName:df428d5a-c722-4536-8e7f-cdd85c560481 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.157402971 +0000 UTC m=+2.800718461 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert") pod "catalog-operator-68f85b4d6c-fzm28" (UID: "df428d5a-c722-4536-8e7f-cdd85c560481") : secret "catalog-operator-serving-cert" not found Mar 20 08:36:16.157446 master-0 kubenswrapper[7465]: E0320 08:36:16.157440 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls podName:3f471ecc-922c-4cb1-9bdd-fdb5da08c592 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.157418812 +0000 UTC m=+2.800734302 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls") pod "dns-operator-9c5679d8f-r6dm8" (UID: "3f471ecc-922c-4cb1-9bdd-fdb5da08c592") : secret "metrics-tls" not found Mar 20 08:36:16.157446 master-0 kubenswrapper[7465]: E0320 08:36:16.157443 7465 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 20 08:36:16.157446 master-0 kubenswrapper[7465]: I0320 08:36:16.157460 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:16.157446 master-0 kubenswrapper[7465]: E0320 08:36:16.157470 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.157462543 +0000 UTC m=+2.800778033 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : secret "metrics-daemon-secret" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157485 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls podName:45e8b72b-564c-4bb1-b911-baff2d6c87ad nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.157479983 +0000 UTC m=+2.800795473 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-nljsr" (UID: "45e8b72b-564c-4bb1-b911-baff2d6c87ad") : secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: I0320 08:36:16.157529 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157574 7465 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157619 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls podName:b4291bfd-53d9-4c78-b7cb-d7eb46560528 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.157612047 +0000 UTC m=+2.800927537 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-fmhbq" (UID: "b4291bfd-53d9-4c78-b7cb-d7eb46560528") : secret "image-registry-operator-tls" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: I0320 08:36:16.157585 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: I0320 08:36:16.157712 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157625 7465 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157754 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls podName:42df77ec-94aa-48ba-bb35-7b1f1e8b8e97 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.157746681 +0000 UTC m=+2.801062171 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls") pod "machine-config-operator-84d549f6d5-gm4qr" (UID: "42df77ec-94aa-48ba-bb35-7b1f1e8b8e97") : secret "mco-proxy-tls" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157661 7465 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157780 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.157774472 +0000 UTC m=+2.801089962 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "performance-addon-operator-webhook-cert" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: I0320 08:36:16.157800 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157813 7465 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157859 7465 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.157834724 +0000 UTC m=+2.801150204 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-operator-tls" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157876 7465 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157902 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics podName:acb704a9-6c8d-4378-ae93-e7095b1fce85 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.157895835 +0000 UTC m=+2.801211325 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-mvn4t" (UID: "acb704a9-6c8d-4378-ae93-e7095b1fce85") : secret "marketplace-operator-metrics" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157710 7465 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157923 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.157916976 +0000 UTC m=+2.801232466 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "node-tuning-operator-tls" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157939 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 20 08:36:16.158445 master-0 kubenswrapper[7465]: E0320 08:36:16.157976 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert podName:7b489385-2c96-4a97-8b31-362162de020e nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.157968018 +0000 UTC m=+2.801283508 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert") pod "olm-operator-5c9796789-tjm9l" (UID: "7b489385-2c96-4a97-8b31-362162de020e") : secret "olm-operator-serving-cert" not found Mar 20 08:36:16.169045 master-0 kubenswrapper[7465]: I0320 08:36:16.169010 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d57k\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-kube-api-access-8d57k\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:16.189098 master-0 kubenswrapper[7465]: I0320 08:36:16.189032 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtcnq\" (UniqueName: \"kubernetes.io/projected/df428d5a-c722-4536-8e7f-cdd85c560481-kube-api-access-dtcnq\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:16.211222 master-0 kubenswrapper[7465]: I0320 08:36:16.210969 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dab97c35-fe60-4134-8715-a7c6dd085fb3-kube-api-access\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:36:16.231725 master-0 kubenswrapper[7465]: I0320 08:36:16.231677 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njjkt\" (UniqueName: \"kubernetes.io/projected/813f91c2-2b37-4681-968d-4217e286e22f-kube-api-access-njjkt\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 
08:36:16.249748 master-0 kubenswrapper[7465]: I0320 08:36:16.249707 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa759777-de22-4440-a3d3-ad429a3b8e7b-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:36:16.260801 master-0 kubenswrapper[7465]: I0320 08:36:16.260656 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:16.261110 master-0 kubenswrapper[7465]: E0320 08:36:16.260989 7465 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 20 08:36:16.261334 master-0 kubenswrapper[7465]: E0320 08:36:16.261303 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs podName:6a80bd6f-2263-4251-8197-5173193f8afc nodeName:}" failed. No retries permitted until 2026-03-20 08:36:17.26122698 +0000 UTC m=+2.904542470 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-5rrrh" (UID: "6a80bd6f-2263-4251-8197-5173193f8afc") : secret "multus-admission-controller-secret" not found Mar 20 08:36:16.268061 master-0 kubenswrapper[7465]: I0320 08:36:16.268017 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2d4r\" (UniqueName: \"kubernetes.io/projected/f53bc282-5937-49ac-ac98-2ee37ccb268d-kube-api-access-b2d4r\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:16.288040 master-0 kubenswrapper[7465]: I0320 08:36:16.287986 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxg4z\" (UniqueName: \"kubernetes.io/projected/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-kube-api-access-fxg4z\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:36:16.308697 master-0 kubenswrapper[7465]: I0320 08:36:16.308658 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg6th\" (UniqueName: \"kubernetes.io/projected/7b489385-2c96-4a97-8b31-362162de020e-kube-api-access-pg6th\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:16.327559 master-0 kubenswrapper[7465]: I0320 08:36:16.327512 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btdjt\" (UniqueName: \"kubernetes.io/projected/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-kube-api-access-btdjt\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:36:16.353991 master-0 kubenswrapper[7465]: I0320 08:36:16.353951 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrws5\" (UniqueName: \"kubernetes.io/projected/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-kube-api-access-wrws5\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:16.368745 master-0 kubenswrapper[7465]: I0320 08:36:16.368711 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:16.388749 master-0 kubenswrapper[7465]: I0320 08:36:16.388685 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6mb5\" (UniqueName: \"kubernetes.io/projected/ad692349-5089-4afc-85b2-9b6e7997567c-kube-api-access-h6mb5\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" Mar 20 08:36:16.413551 master-0 kubenswrapper[7465]: I0320 08:36:16.413498 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clz6w\" (UniqueName: \"kubernetes.io/projected/75e3e2cc-aa56-41f3-8859-1c086f419d05-kube-api-access-clz6w\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:36:16.429504 master-0 kubenswrapper[7465]: I0320 08:36:16.429467 7465 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zmrxh\" (UniqueName: \"kubernetes.io/projected/ab175f7e-a5e8-4fda-98c9-6d052a221a83-kube-api-access-zmrxh\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:36:16.452271 master-0 kubenswrapper[7465]: I0320 08:36:16.452213 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqdlf\" (UniqueName: \"kubernetes.io/projected/325f0a83-d56d-4b62-977b-088a7d5f0e00-kube-api-access-lqdlf\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" Mar 20 08:36:16.471857 master-0 kubenswrapper[7465]: I0320 08:36:16.471814 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpdk5\" (UniqueName: \"kubernetes.io/projected/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-kube-api-access-lpdk5\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:16.492521 master-0 kubenswrapper[7465]: I0320 08:36:16.492463 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmndg\" (UniqueName: \"kubernetes.io/projected/aa16c3bf-2350-46d1-afa0-9477b3ec8877-kube-api-access-qmndg\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" Mar 20 08:36:16.510934 master-0 kubenswrapper[7465]: I0320 08:36:16.510820 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvx6w\" (UniqueName: \"kubernetes.io/projected/acb704a9-6c8d-4378-ae93-e7095b1fce85-kube-api-access-xvx6w\") pod 
\"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:16.524734 master-0 kubenswrapper[7465]: W0320 08:36:16.524686 7465 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 20 08:36:16.524840 master-0 kubenswrapper[7465]: E0320 08:36:16.524781 7465 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Mar 20 08:36:16.543591 master-0 kubenswrapper[7465]: E0320 08:36:16.543553 7465 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 20 08:36:16.573597 master-0 kubenswrapper[7465]: I0320 08:36:16.573519 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqd2v\" (UniqueName: \"kubernetes.io/projected/6a80bd6f-2263-4251-8197-5173193f8afc-kube-api-access-tqd2v\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:16.589912 
master-0 kubenswrapper[7465]: I0320 08:36:16.589865 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd6vv\" (UniqueName: \"kubernetes.io/projected/5e3ddf9e-eeb5-4266-b675-092fd4e27623-kube-api-access-xd6vv\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:36:16.608297 master-0 kubenswrapper[7465]: I0320 08:36:16.608235 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtc2p\" (UniqueName: \"kubernetes.io/projected/248a3d2f-3be4-46bf-959c-79d28736c0d6-kube-api-access-mtc2p\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:16.645391 master-0 kubenswrapper[7465]: I0320 08:36:16.645300 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l4b7\" (UniqueName: \"kubernetes.io/projected/ee3cc021-67d8-4b7f-b443-16f18228712e-kube-api-access-7l4b7\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:36:16.649060 master-0 kubenswrapper[7465]: I0320 08:36:16.649035 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh47j\" (UniqueName: \"kubernetes.io/projected/fb0fc10f-5796-4cd5-b8f5-72d678054c24-kube-api-access-lh47j\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:36:16.670080 master-0 kubenswrapper[7465]: I0320 08:36:16.670045 7465 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 08:36:16.679833 master-0 kubenswrapper[7465]: I0320 08:36:16.679782 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2zzd\" (UniqueName: \"kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd\") pod \"network-check-target-xnrw6\" (UID: \"c0142d4e-9fd4-4375-a773-bb89b38af654\") " pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:36:16.713170 master-0 kubenswrapper[7465]: I0320 08:36:16.713125 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: I0320 08:36:17.176277 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: I0320 08:36:17.176337 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: E0320 08:36:17.176540 7465 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: E0320 08:36:17.176646 7465 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert podName:dab97c35-fe60-4134-8715-a7c6dd085fb3 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.176617864 +0000 UTC m=+4.819933544 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert") pod "cluster-version-operator-56d8475767-9gfkg" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3") : secret "cluster-version-operator-serving-cert" not found Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: I0320 08:36:17.176643 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: E0320 08:36:17.176778 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: I0320 08:36:17.176789 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: E0320 08:36:17.176850 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert podName:7b489385-2c96-4a97-8b31-362162de020e nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.17682738 +0000 UTC m=+4.820142870 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert") pod "olm-operator-5c9796789-tjm9l" (UID: "7b489385-2c96-4a97-8b31-362162de020e") : secret "olm-operator-serving-cert" not found Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: I0320 08:36:17.176874 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: I0320 08:36:17.176903 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: I0320 08:36:17.176929 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: I0320 08:36:17.176954 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: 
\"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: I0320 08:36:17.176976 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: I0320 08:36:17.176992 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: E0320 08:36:17.176992 7465 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: I0320 08:36:17.177018 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: E0320 08:36:17.177043 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: I0320 08:36:17.177045 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:17.177072 master-0 kubenswrapper[7465]: E0320 08:36:17.177072 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls podName:ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.177046476 +0000 UTC m=+4.820362156 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls") pod "ingress-operator-66b84d69b-gzg9m" (UID: "ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02") : secret "metrics-tls" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177130 7465 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177143 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert podName:df428d5a-c722-4536-8e7f-cdd85c560481 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.177132859 +0000 UTC m=+4.820448549 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert") pod "catalog-operator-68f85b4d6c-fzm28" (UID: "df428d5a-c722-4536-8e7f-cdd85c560481") : secret "catalog-operator-serving-cert" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177196 7465 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: I0320 08:36:17.177179 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177223 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls podName:3f471ecc-922c-4cb1-9bdd-fdb5da08c592 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.177217411 +0000 UTC m=+4.820532901 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls") pod "dns-operator-9c5679d8f-r6dm8" (UID: "3f471ecc-922c-4cb1-9bdd-fdb5da08c592") : secret "metrics-tls" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177240 7465 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: I0320 08:36:17.177249 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177260 7465 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177148 7465 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177395 7465 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177085 7465 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177122 7465 secret.go:189] Couldn't get secret 
openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177271 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.177262113 +0000 UTC m=+4.820577813 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : secret "metrics-daemon-secret" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177483 7465 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177498 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.177485269 +0000 UTC m=+4.820800969 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177349 7465 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177515 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls podName:b4291bfd-53d9-4c78-b7cb-d7eb46560528 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.17750796 +0000 UTC m=+4.820823680 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-fmhbq" (UID: "b4291bfd-53d9-4c78-b7cb-d7eb46560528") : secret "image-registry-operator-tls" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177540 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls podName:42df77ec-94aa-48ba-bb35-7b1f1e8b8e97 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.17753141 +0000 UTC m=+4.820847130 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls") pod "machine-config-operator-84d549f6d5-gm4qr" (UID: "42df77ec-94aa-48ba-bb35-7b1f1e8b8e97") : secret "mco-proxy-tls" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177565 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.177555231 +0000 UTC m=+4.820870961 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "performance-addon-operator-webhook-cert" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177580 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls podName:45e8b72b-564c-4bb1-b911-baff2d6c87ad nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.177572802 +0000 UTC m=+4.820888522 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-nljsr" (UID: "45e8b72b-564c-4bb1-b911-baff2d6c87ad") : secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: I0320 08:36:17.177606 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177762 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.177747927 +0000 UTC m=+4.821063637 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "node-tuning-operator-tls" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177783 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert podName:bbc0b783-28d5-4554-b49d-c66082546f44 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.177773128 +0000 UTC m=+4.821088848 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-2pg77" (UID: "bbc0b783-28d5-4554-b49d-c66082546f44") : secret "package-server-manager-serving-cert" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177797 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.177790028 +0000 UTC m=+4.821105738 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-operator-tls" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177903 7465 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 20 08:36:17.178573 master-0 kubenswrapper[7465]: E0320 08:36:17.177995 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics podName:acb704a9-6c8d-4378-ae93-e7095b1fce85 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.177973443 +0000 UTC m=+4.821288933 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-mvn4t" (UID: "acb704a9-6c8d-4378-ae93-e7095b1fce85") : secret "marketplace-operator-metrics" not found Mar 20 08:36:17.278910 master-0 kubenswrapper[7465]: I0320 08:36:17.278819 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:17.279220 master-0 kubenswrapper[7465]: E0320 08:36:17.279078 7465 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 20 08:36:17.279289 master-0 kubenswrapper[7465]: E0320 08:36:17.279223 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs podName:6a80bd6f-2263-4251-8197-5173193f8afc nodeName:}" failed. No retries permitted until 2026-03-20 08:36:19.279199927 +0000 UTC m=+4.922515407 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-5rrrh" (UID: "6a80bd6f-2263-4251-8197-5173193f8afc") : secret "multus-admission-controller-secret" not found Mar 20 08:36:17.472664 master-0 kubenswrapper[7465]: E0320 08:36:17.472574 7465 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302" Mar 20 08:36:17.472879 master-0 kubenswrapper[7465]: E0320 08:36:17.472818 7465 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-scheduler-operator-container,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302,Command:[cluster-kube-scheduler-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:1.31.14,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-kube-scheduler-operator-dddff6458-vx5d7_openshift-kube-scheduler-operator(a57854ac-809a-4745-aaa1-774f0a08a560): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:36:17.475280 master-0 kubenswrapper[7465]: E0320 08:36:17.475167 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" podUID="a57854ac-809a-4745-aaa1-774f0a08a560" Mar 20 08:36:18.123577 master-0 kubenswrapper[7465]: E0320 08:36:18.123051 7465 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310" Mar 20 08:36:18.124218 master-0 kubenswrapper[7465]: E0320 08:36:18.123555 7465 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cluster-storage-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310,Command:[cluster-storage-operator start],Args:[start -v=2 --terminate-on-files=/var/run/secrets/serving-cert/tls.crt --terminate-on-files=/var/run/secrets/serving-cert/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:AWS_EBS_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6f8be2ccd34c5347b290d853b5d7a8d746d13d2f5d2828da73c16a8eb6d5af67,ValueFrom:nil,},EnvVar{Name:AWS_EBS_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f9a5f406ad4ce6ecadd3e2590848bc4b5de5ab1cb5d0bb753b98188a28c44956,ValueFrom:nil,},EnvVar{Name:GCP_PD_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:78861d4efdfa2f7b109402745c586e7e0be2529fa1d9a26b0ad3ddf3e8020953,ValueFrom:nil,},EnvVar{Name:GCP_PD_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f422ffe6b712af5b5ee25b68e73cda9554f145a58ac26b02b5a750f5c5dd126d,ValueFrom:nil,},EnvVar{Name:OPENSTACK_CINDER_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9f0d505a150af2e1cc3a499a3ebfa2209ee91ad2dc51ae193947b6a5c594b206,ValueFrom:nil,},EnvVar{Name:OPENSTACK_CINDER_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8bd95b84c33750b6a5d68ff914c99418bcac9138d5b20a0465d95bfd6a16b86,Va
lueFrom:nil,},EnvVar{Name:OVIRT_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c4c2f0ac2d23a17393606070219485ba5d974d45328077daba15411925771795,ValueFrom:nil,},EnvVar{Name:OVIRT_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:78aa5be28be7d85f30d68230d193084a2ec6db6e8b67d91b99b9964b7832c3b5,ValueFrom:nil,},EnvVar{Name:MANILA_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f357238e2e91c79b804978401909536e3b9c657c994ab388d82ccc37406fa380,ValueFrom:nil,},EnvVar{Name:MANILA_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2f0bf8b84707646f962f446f9d8e27091796740abf15092d294625e6afb03c8,ValueFrom:nil,},EnvVar{Name:MANILA_NFS_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ed36d7e6dd64d1d4152dfc347cfe2c7a932541f66234887c2145fcf75ef3149,ValueFrom:nil,},EnvVar{Name:PROVISIONER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0de2b4097ae6231bf44e30b6724d34fd6ca3b075050479be4cabe5d5dc0847f5,ValueFrom:nil,},EnvVar{Name:ATTACHER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2af92f0e90f29d4329e3ad1e235e69bc2397e065bfbc6772d7e073701c7b1363,ValueFrom:nil,},EnvVar{Name:RESIZER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7f42a877d794962193d792163231e076195b1451f9d09260a4c833b7d587c217,ValueFrom:nil,},EnvVar{Name:SNAPSHOTTER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a7ce31e7a0bb2c38d29ced899704249479ce280f444d211b468b907d367f4f70,ValueFrom:nil,},EnvVar{Name:NODE_DRIVER_REGISTRAR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d8a31138bdae4975c69f6c9ad5fb30ce438f4bd6ae05eb8fd7db07924729855c,ValueFrom:nil,},EnvVar{Name:LIVENESS_PROBE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dca644664dcc5649c27b5b0d55102bcd46c0d25de3e63f96866b81f7cb1d90cc,ValueFrom:nil,},EnvVar{Name:VSPHERE_PROBLEM_DETECTOR_OPERATOR_IMAGE,Value:quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:ad103effb35bf1b15fb53ef8f3d77a563dee94e7a6703924b377b31ac7754ba2,ValueFrom:nil,},EnvVar{Name:AZURE_DISK_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe31f5b78d8ec974d4e30efa3524849ebdc534bd5e83b6b8789944322ee9b9ff,ValueFrom:nil,},EnvVar{Name:AZURE_DISK_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fc78ae6ff45a27c111fff14e9d15a2e9982f97577722fe519630a018ebd64a5e,ValueFrom:nil,},EnvVar{Name:AZURE_FILE_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c7ede995e9f063c14d14db7d70ee4ddb5e098b36033ca7479593abb1e34c1f0f,ValueFrom:nil,},EnvVar{Name:AZURE_FILE_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:13a3fe1b64974d4b2ea6bebddbc974b777556820de3dbd204e8a5b634e7a76a5,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed,ValueFrom:nil,},EnvVar{Name:VMWARE_VSPHERE_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f52b7d31de7697dd95b0addb28b5a270e2e2a8e37543a16696aaadcaf7a14756,ValueFrom:nil,},EnvVar{Name:VMWARE_VSPHERE_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b6029487b019751b36752e15a5afd5db73fe449798b0df7e7465fe47353b8271,ValueFrom:nil,},EnvVar{Name:VMWARE_VSPHERE_SYNCER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0509bd17634879a7e7c73a96a6cfe4be00f98e3ce7258733d0d6bb7f8a95b91f,ValueFrom:nil,},EnvVar{Name:CLUSTER_CLOUD_CONTROLLER_MANAGER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5,ValueFrom:nil,},EnvVar{Name:IBM_VPC_BLOCK_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53284e11f5db88ec68f5ac7fdd1d42b26e62fde221368f8a1b8f918ed6b38d4f,ValueFrom:nil,},EnvVar{Name:IBM_VPC_BLOCK_DRIVER_
IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101edd497d95ff956953bb01124b8f81d6d0691e2a44a76c88dd8260299ff382,ValueFrom:nil,},EnvVar{Name:POWERVS_BLOCK_CSI_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4dcc3f2648915ed6887ff9db0c8d45b5487e3acdd7eb832ff6e7d579846ed90b,ValueFrom:nil,},EnvVar{Name:POWERVS_BLOCK_CSI_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:284d17aa10d048eb7e39956681248cc31caa37aedde5edcd72181d12f1beaa43,ValueFrom:nil,},EnvVar{Name:TOOLS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:67c988e079558dc6b20232ebf9a7f7276fee60c756caed584c9715e0bec77a5a,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cluster-storage-operator-serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qmndg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
cluster-storage-operator-7d87854d6-vlq7h_openshift-cluster-storage-operator(aa16c3bf-2350-46d1-afa0-9477b3ec8877): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:36:18.125141 master-0 kubenswrapper[7465]: E0320 08:36:18.125054 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-storage-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" podUID="aa16c3bf-2350-46d1-afa0-9477b3ec8877" Mar 20 08:36:18.389662 master-0 kubenswrapper[7465]: I0320 08:36:18.389518 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:18.394783 master-0 kubenswrapper[7465]: I0320 08:36:18.394733 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:18.728605 master-0 kubenswrapper[7465]: E0320 08:36:18.728543 7465 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a" Mar 20 08:36:18.728847 master-0 kubenswrapper[7465]: E0320 08:36:18.728775 7465 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:etcd-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a,Command:[cluster-etcd-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml --terminate-on-files=/var/run/secrets/serving-cert/tls.crt --terminate-on-files=/var/run/secrets/serving-cert/tls.key --terminate-on-files=/var/run/secrets/etcd-client/tls.crt 
--terminate-on-files=/var/run/secrets/etcd-client/tls.key --terminate-on-files=/var/run/configmaps/etcd-ca/ca-bundle.crt --terminate-on-files=/var/run/configmaps/etcd-service-ca/service-ca.crt],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPENSHIFT_PROFILE,Value:web,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-service-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-service-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-client,ReadOnly:false,MountPath:/var/run/secrets/etcd-client,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xhkh7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:30,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
etcd-operator-8544cbcf9c-brhw4_openshift-etcd-operator(f046860d-2d54-4746-8ba2-f8e90fa55e38): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:36:18.730313 master-0 kubenswrapper[7465]: E0320 08:36:18.730269 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" podUID="f046860d-2d54-4746-8ba2-f8e90fa55e38" Mar 20 08:36:19.011738 master-0 kubenswrapper[7465]: I0320 08:36:19.011666 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xnrw6"] Mar 20 08:36:19.208990 master-0 kubenswrapper[7465]: I0320 08:36:19.208400 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:19.208990 master-0 kubenswrapper[7465]: I0320 08:36:19.208980 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: I0320 08:36:19.209022 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " 
pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: I0320 08:36:19.209074 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: I0320 08:36:19.209113 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: I0320 08:36:19.209144 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: I0320 08:36:19.209173 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: I0320 08:36:19.209235 7465 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: I0320 08:36:19.209272 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: I0320 08:36:19.209299 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: I0320 08:36:19.209337 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: I0320 08:36:19.209370 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " 
pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: I0320 08:36:19.209402 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: I0320 08:36:19.209436 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77"
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: I0320 08:36:19.209462 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28"
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.208632 7465 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.209722 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.209697165 +0000 UTC m=+8.853012665 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "performance-addon-operator-webhook-cert" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.209752 7465 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.209799 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics podName:acb704a9-6c8d-4378-ae93-e7095b1fce85 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.209784378 +0000 UTC m=+8.853099868 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-mvn4t" (UID: "acb704a9-6c8d-4378-ae93-e7095b1fce85") : secret "marketplace-operator-metrics" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.209856 7465 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.209881 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls podName:3f471ecc-922c-4cb1-9bdd-fdb5da08c592 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.20987234 +0000 UTC m=+8.853187840 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls") pod "dns-operator-9c5679d8f-r6dm8" (UID: "3f471ecc-922c-4cb1-9bdd-fdb5da08c592") : secret "metrics-tls" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.209934 7465 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.209964 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.209955393 +0000 UTC m=+8.853270893 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : secret "metrics-daemon-secret" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210017 7465 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210044 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls podName:b4291bfd-53d9-4c78-b7cb-d7eb46560528 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.210035645 +0000 UTC m=+8.853351135 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-fmhbq" (UID: "b4291bfd-53d9-4c78-b7cb-d7eb46560528") : secret "image-registry-operator-tls" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210104 7465 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210129 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.210121067 +0000 UTC m=+8.853436557 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "node-tuning-operator-tls" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210179 7465 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210223 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls podName:42df77ec-94aa-48ba-bb35-7b1f1e8b8e97 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.21021504 +0000 UTC m=+8.853530530 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls") pod "machine-config-operator-84d549f6d5-gm4qr" (UID: "42df77ec-94aa-48ba-bb35-7b1f1e8b8e97") : secret "mco-proxy-tls" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210273 7465 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210289 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210301 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert podName:dab97c35-fe60-4134-8715-a7c6dd085fb3 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.210290432 +0000 UTC m=+8.853605932 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert") pod "cluster-version-operator-56d8475767-9gfkg" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3") : secret "cluster-version-operator-serving-cert" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210325 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert podName:7b489385-2c96-4a97-8b31-362162de020e nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.210313553 +0000 UTC m=+8.853629063 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert") pod "olm-operator-5c9796789-tjm9l" (UID: "7b489385-2c96-4a97-8b31-362162de020e") : secret "olm-operator-serving-cert" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210353 7465 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210378 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls podName:45e8b72b-564c-4bb1-b911-baff2d6c87ad nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.210370285 +0000 UTC m=+8.853685775 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-nljsr" (UID: "45e8b72b-564c-4bb1-b911-baff2d6c87ad") : secret "cluster-monitoring-operator-tls" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.209640 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210406 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert podName:df428d5a-c722-4536-8e7f-cdd85c560481 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.210398186 +0000 UTC m=+8.853713686 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert") pod "catalog-operator-68f85b4d6c-fzm28" (UID: "df428d5a-c722-4536-8e7f-cdd85c560481") : secret "catalog-operator-serving-cert" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210379 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210438 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert podName:bbc0b783-28d5-4554-b49d-c66082546f44 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.210429996 +0000 UTC m=+8.853745496 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-2pg77" (UID: "bbc0b783-28d5-4554-b49d-c66082546f44") : secret "package-server-manager-serving-cert" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210485 7465 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210514 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.210505069 +0000 UTC m=+8.853820569 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-operator-tls" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210545 7465 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210632 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.210609862 +0000 UTC m=+8.853925352 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210636 7465 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 20 08:36:19.210789 master-0 kubenswrapper[7465]: E0320 08:36:19.210672 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls podName:ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.210663493 +0000 UTC m=+8.853979003 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls") pod "ingress-operator-66b84d69b-gzg9m" (UID: "ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02") : secret "metrics-tls" not found
Mar 20 08:36:19.312715 master-0 kubenswrapper[7465]: I0320 08:36:19.312249 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"
Mar 20 08:36:19.312715 master-0 kubenswrapper[7465]: E0320 08:36:19.312632 7465 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 20 08:36:19.312715 master-0 kubenswrapper[7465]: E0320 08:36:19.312690 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs podName:6a80bd6f-2263-4251-8197-5173193f8afc nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.31266687 +0000 UTC m=+8.955982350 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-5rrrh" (UID: "6a80bd6f-2263-4251-8197-5173193f8afc") : secret "multus-admission-controller-secret" not found
Mar 20 08:36:19.549063 master-0 kubenswrapper[7465]: I0320 08:36:19.548994 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:36:19.554870 master-0 kubenswrapper[7465]: I0320 08:36:19.554819 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:36:19.622924 master-0 kubenswrapper[7465]: I0320 08:36:19.622846 7465 generic.go:334] "Generic (PLEG): container finished" podID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerID="6fafac3f004d2f582e04bae9436b72da7fad6247504ddaf33a3c755f3641fa2c" exitCode=0
Mar 20 08:36:19.623257 master-0 kubenswrapper[7465]: I0320 08:36:19.622955 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" event={"ID":"ab175f7e-a5e8-4fda-98c9-6d052a221a83","Type":"ContainerDied","Data":"6fafac3f004d2f582e04bae9436b72da7fad6247504ddaf33a3c755f3641fa2c"}
Mar 20 08:36:19.649202 master-0 kubenswrapper[7465]: I0320 08:36:19.649129 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" event={"ID":"c2a23d24-9e09-431e-8c3b-8456ff51a8d0","Type":"ContainerStarted","Data":"4f3597589c3cef89a127e7d3fec06145ebc7ceb92ec5a686f505064f491c35a3"}
Mar 20 08:36:19.651002 master-0 kubenswrapper[7465]: I0320 08:36:19.650944 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" event={"ID":"75e3e2cc-aa56-41f3-8859-1c086f419d05","Type":"ContainerStarted","Data":"adad78c0e9b98fa52036392a08a46d2f94455f9ec32c4bacb460a8042eb8ee5e"}
Mar 20 08:36:19.653009 master-0 kubenswrapper[7465]: I0320 08:36:19.652947 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" event={"ID":"68252533-bd64-4fc5-838a-cc350cbe77f0","Type":"ContainerStarted","Data":"e3f8332f8ebf3d1a4f060a0c3657a0fbf7dd52dab57065dbe98fe23495c66678"}
Mar 20 08:36:19.654249 master-0 kubenswrapper[7465]: I0320 08:36:19.654218 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" event={"ID":"fa759777-de22-4440-a3d3-ad429a3b8e7b","Type":"ContainerStarted","Data":"9fc8932c5b9da1ed7961a714ed5b905683dbfe51746690c485563d67ef3b0b29"}
Mar 20 08:36:19.655595 master-0 kubenswrapper[7465]: I0320 08:36:19.655555 7465 generic.go:334] "Generic (PLEG): container finished" podID="86cb5d23-df7f-4f67-8086-1789d8e68544" containerID="eb336787da69f3659db83e9b59377f619a1bab475c9f6f4fe67e34d16e998717" exitCode=0
Mar 20 08:36:19.655649 master-0 kubenswrapper[7465]: I0320 08:36:19.655610 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" event={"ID":"86cb5d23-df7f-4f67-8086-1789d8e68544","Type":"ContainerDied","Data":"eb336787da69f3659db83e9b59377f619a1bab475c9f6f4fe67e34d16e998717"}
Mar 20 08:36:19.674885 master-0 kubenswrapper[7465]: I0320 08:36:19.673953 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh" event={"ID":"7c4e7e57-43be-4d31-b523-f7e4d316dce3","Type":"ContainerStarted","Data":"68bec1ef3f4454b1453d2de2db069e48c08d8a5c1a267f409f8da798126b9d46"}
Mar 20 08:36:19.704961 master-0 kubenswrapper[7465]: I0320 08:36:19.704885 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xnrw6" event={"ID":"c0142d4e-9fd4-4375-a773-bb89b38af654","Type":"ContainerStarted","Data":"1179771f4bb82559252cc032e9b6d619a03143a4bf62b3be7c0a1d8b8023730c"}
Mar 20 08:36:19.704961 master-0 kubenswrapper[7465]: I0320 08:36:19.704953 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xnrw6" event={"ID":"c0142d4e-9fd4-4375-a773-bb89b38af654","Type":"ContainerStarted","Data":"5702154693e32d84807189cf18ed2f8ceb28029864edaaaff188dc529b9551c9"}
Mar 20 08:36:19.731010 master-0 kubenswrapper[7465]: I0320 08:36:19.726459 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" event={"ID":"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6","Type":"ContainerStarted","Data":"b2c7cbe5708ed7a3530e1dc35eccab2ac0970444664ce50722925f65c5f61474"}
Mar 20 08:36:19.769590 master-0 kubenswrapper[7465]: I0320 08:36:19.760340 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 20 08:36:20.090048 master-0 kubenswrapper[7465]: I0320 08:36:20.089485 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr"]
Mar 20 08:36:20.090048 master-0 kubenswrapper[7465]: E0320 08:36:20.089669 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdfdabb8-83d6-4b38-a709-9e354062ba1a" containerName="assisted-installer-controller"
Mar 20 08:36:20.090048 master-0 kubenswrapper[7465]: I0320 08:36:20.089683 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfdabb8-83d6-4b38-a709-9e354062ba1a" containerName="assisted-installer-controller"
Mar 20 08:36:20.090048 master-0 kubenswrapper[7465]: E0320 08:36:20.089695 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412becc8-c1a7-422c-94d1-dd1849070ef1" containerName="prober"
Mar 20 08:36:20.090048 master-0 kubenswrapper[7465]: I0320 08:36:20.089700 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="412becc8-c1a7-422c-94d1-dd1849070ef1" containerName="prober"
Mar 20 08:36:20.090048 master-0 kubenswrapper[7465]: I0320 08:36:20.089802 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="412becc8-c1a7-422c-94d1-dd1849070ef1" containerName="prober"
Mar 20 08:36:20.090048 master-0 kubenswrapper[7465]: I0320 08:36:20.089813 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdfdabb8-83d6-4b38-a709-9e354062ba1a" containerName="assisted-installer-controller"
Mar 20 08:36:20.090440 master-0 kubenswrapper[7465]: I0320 08:36:20.090404 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr"
Mar 20 08:36:20.095560 master-0 kubenswrapper[7465]: I0320 08:36:20.092366 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 20 08:36:20.095560 master-0 kubenswrapper[7465]: I0320 08:36:20.094911 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr"]
Mar 20 08:36:20.095560 master-0 kubenswrapper[7465]: I0320 08:36:20.095029 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 20 08:36:20.247168 master-0 kubenswrapper[7465]: I0320 08:36:20.247087 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qns9g\" (UniqueName: \"kubernetes.io/projected/2bf90db0-f943-464c-8599-e36b4fc32e1c-kube-api-access-qns9g\") pod \"migrator-8487694857-w5tlr\" (UID: \"2bf90db0-f943-464c-8599-e36b4fc32e1c\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr"
Mar 20 08:36:20.348926 master-0 kubenswrapper[7465]: I0320 08:36:20.348782 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qns9g\" (UniqueName: \"kubernetes.io/projected/2bf90db0-f943-464c-8599-e36b4fc32e1c-kube-api-access-qns9g\") pod \"migrator-8487694857-w5tlr\" (UID: \"2bf90db0-f943-464c-8599-e36b4fc32e1c\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr"
Mar 20 08:36:20.378995 master-0 kubenswrapper[7465]: I0320 08:36:20.378928 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qns9g\" (UniqueName: \"kubernetes.io/projected/2bf90db0-f943-464c-8599-e36b4fc32e1c-kube-api-access-qns9g\") pod \"migrator-8487694857-w5tlr\" (UID: \"2bf90db0-f943-464c-8599-e36b4fc32e1c\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr"
Mar 20 08:36:20.430539 master-0 kubenswrapper[7465]: I0320 08:36:20.430447 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr"
Mar 20 08:36:20.433071 master-0 kubenswrapper[7465]: I0320 08:36:20.433017 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr"]
Mar 20 08:36:20.433754 master-0 kubenswrapper[7465]: I0320 08:36:20.433710 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr"
Mar 20 08:36:20.447441 master-0 kubenswrapper[7465]: I0320 08:36:20.447389 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr"]
Mar 20 08:36:20.562704 master-0 kubenswrapper[7465]: I0320 08:36:20.562625 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8bxz\" (UniqueName: \"kubernetes.io/projected/96de6024-e20f-4b52-9294-b330d65e4153-kube-api-access-z8bxz\") pod \"csi-snapshot-controller-64854d9cff-f44gr\" (UID: \"96de6024-e20f-4b52-9294-b330d65e4153\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr"
Mar 20 08:36:20.665815 master-0 kubenswrapper[7465]: I0320 08:36:20.663977 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8bxz\" (UniqueName: \"kubernetes.io/projected/96de6024-e20f-4b52-9294-b330d65e4153-kube-api-access-z8bxz\") pod \"csi-snapshot-controller-64854d9cff-f44gr\" (UID: \"96de6024-e20f-4b52-9294-b330d65e4153\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr"
Mar 20 08:36:20.685632 master-0 kubenswrapper[7465]: I0320 08:36:20.685572 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8bxz\" (UniqueName: \"kubernetes.io/projected/96de6024-e20f-4b52-9294-b330d65e4153-kube-api-access-z8bxz\") pod \"csi-snapshot-controller-64854d9cff-f44gr\" (UID: \"96de6024-e20f-4b52-9294-b330d65e4153\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr"
Mar 20 08:36:20.723600 master-0 kubenswrapper[7465]: I0320 08:36:20.722122 7465 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:36:20.836240 master-0 kubenswrapper[7465]: I0320 08:36:20.835653 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr"
Mar 20 08:36:20.893360 master-0 kubenswrapper[7465]: I0320 08:36:20.893307 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr"]
Mar 20 08:36:20.918035 master-0 kubenswrapper[7465]: W0320 08:36:20.917925 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bf90db0_f943_464c_8599_e36b4fc32e1c.slice/crio-3fd8857a2c2302ef6094cc86660ac7c76904e438bf13dc986cbdd24f1c5f29f3 WatchSource:0}: Error finding container 3fd8857a2c2302ef6094cc86660ac7c76904e438bf13dc986cbdd24f1c5f29f3: Status 404 returned error can't find the container with id 3fd8857a2c2302ef6094cc86660ac7c76904e438bf13dc986cbdd24f1c5f29f3
Mar 20 08:36:21.038742 master-0 kubenswrapper[7465]: I0320 08:36:21.038689 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr"]
Mar 20 08:36:21.055138 master-0 kubenswrapper[7465]: W0320 08:36:21.055066 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96de6024_e20f_4b52_9294_b330d65e4153.slice/crio-07c77bfc7ae9afd423082de8b582bc56bb7322d1ee441fcf749b3f543d9311b5 WatchSource:0}: Error finding container 07c77bfc7ae9afd423082de8b582bc56bb7322d1ee441fcf749b3f543d9311b5: Status 404 returned error can't find the container with id 07c77bfc7ae9afd423082de8b582bc56bb7322d1ee441fcf749b3f543d9311b5
Mar 20 08:36:21.587203 master-0 kubenswrapper[7465]: I0320 08:36:21.582431 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"]
Mar 20 08:36:21.587203 master-0 kubenswrapper[7465]: I0320 08:36:21.583168 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:21.587203 master-0 kubenswrapper[7465]: I0320 08:36:21.586711 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 08:36:21.587203 master-0 kubenswrapper[7465]: I0320 08:36:21.586724 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 08:36:21.587203 master-0 kubenswrapper[7465]: I0320 08:36:21.587112 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 08:36:21.596550 master-0 kubenswrapper[7465]: I0320 08:36:21.588341 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 08:36:21.596550 master-0 kubenswrapper[7465]: I0320 08:36:21.588163 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 08:36:21.596550 master-0 kubenswrapper[7465]: I0320 08:36:21.595924 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 08:36:21.604634 master-0 kubenswrapper[7465]: I0320 08:36:21.596939 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"]
Mar 20 08:36:21.609824 master-0 kubenswrapper[7465]: I0320 08:36:21.605582 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:36:21.609824 master-0 kubenswrapper[7465]: I0320 08:36:21.605765 7465 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:36:21.613028 master-0 kubenswrapper[7465]: I0320 08:36:21.612977 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:36:21.683397 master-0 kubenswrapper[7465]: I0320 08:36:21.683305 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xnrw6"
Mar 20 08:36:21.687690 master-0 kubenswrapper[7465]: I0320 08:36:21.687533 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-client-ca\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:21.687690 master-0 kubenswrapper[7465]: I0320 08:36:21.687571 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhn78\" (UniqueName: \"kubernetes.io/projected/b416a87b-7f06-403c-94b4-01f27442a000-kube-api-access-zhn78\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:21.687690 master-0 kubenswrapper[7465]: I0320 08:36:21.687613 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-proxy-ca-bundles\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:21.687690 master-0 kubenswrapper[7465]: I0320 08:36:21.687667 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b416a87b-7f06-403c-94b4-01f27442a000-serving-cert\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:21.687979 master-0 kubenswrapper[7465]: I0320 08:36:21.687727 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-config\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:21.737585 master-0 kubenswrapper[7465]: I0320 08:36:21.737530 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr" event={"ID":"2bf90db0-f943-464c-8599-e36b4fc32e1c","Type":"ContainerStarted","Data":"3fd8857a2c2302ef6094cc86660ac7c76904e438bf13dc986cbdd24f1c5f29f3"}
Mar 20 08:36:21.738848 master-0 kubenswrapper[7465]: I0320 08:36:21.738808 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" event={"ID":"96de6024-e20f-4b52-9294-b330d65e4153","Type":"ContainerStarted","Data":"07c77bfc7ae9afd423082de8b582bc56bb7322d1ee441fcf749b3f543d9311b5"}
Mar 20 08:36:21.741282 master-0 kubenswrapper[7465]: I0320 08:36:21.740745 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" event={"ID":"325f0a83-d56d-4b62-977b-088a7d5f0e00","Type":"ContainerStarted","Data":"d498d1943e73378963ddf1fcb87e4851f496c741db4b494c02cde7a06a7405b7"}
Mar 20 08:36:21.741282 master-0 kubenswrapper[7465]: I0320 08:36:21.740821 7465 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:36:21.792350 master-0 kubenswrapper[7465]: I0320 08:36:21.792081 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b416a87b-7f06-403c-94b4-01f27442a000-serving-cert\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:21.792350 master-0 kubenswrapper[7465]: I0320 08:36:21.792239 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-config\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:21.792350 master-0 kubenswrapper[7465]: I0320 08:36:21.792314 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-client-ca\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:21.792350 master-0 kubenswrapper[7465]: I0320 08:36:21.792338 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhn78\" (UniqueName: \"kubernetes.io/projected/b416a87b-7f06-403c-94b4-01f27442a000-kube-api-access-zhn78\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:21.792686 master-0 kubenswrapper[7465]: I0320 08:36:21.792386 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-proxy-ca-bundles\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:21.793121 master-0 kubenswrapper[7465]: E0320 08:36:21.793067 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 20 08:36:21.793234 master-0 kubenswrapper[7465]: E0320 08:36:21.793215 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-client-ca podName:b416a87b-7f06-403c-94b4-01f27442a000 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:22.293173328 +0000 UTC m=+7.936488818 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-client-ca") pod "controller-manager-797c5b47d8-xrgq6" (UID: "b416a87b-7f06-403c-94b4-01f27442a000") : configmap "client-ca" not found
Mar 20 08:36:21.793296 master-0 kubenswrapper[7465]: E0320 08:36:21.793227 7465 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 20 08:36:21.793374 master-0 kubenswrapper[7465]: E0320 08:36:21.793356 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b416a87b-7f06-403c-94b4-01f27442a000-serving-cert podName:b416a87b-7f06-403c-94b4-01f27442a000 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:22.293321612 +0000 UTC m=+7.936637282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b416a87b-7f06-403c-94b4-01f27442a000-serving-cert") pod "controller-manager-797c5b47d8-xrgq6" (UID: "b416a87b-7f06-403c-94b4-01f27442a000") : secret "serving-cert" not found
Mar 20 08:36:21.794679 master-0 kubenswrapper[7465]: E0320 08:36:21.794260 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found
Mar 20 08:36:21.794679 master-0 kubenswrapper[7465]: E0320 08:36:21.794312 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-config podName:b416a87b-7f06-403c-94b4-01f27442a000 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:22.294299931 +0000 UTC m=+7.937615621 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-config") pod "controller-manager-797c5b47d8-xrgq6" (UID: "b416a87b-7f06-403c-94b4-01f27442a000") : configmap "config" not found
Mar 20 08:36:21.798154 master-0 kubenswrapper[7465]: I0320 08:36:21.797796 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-proxy-ca-bundles\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:21.826320 master-0 kubenswrapper[7465]: I0320 08:36:21.823197 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhn78\" (UniqueName: \"kubernetes.io/projected/b416a87b-7f06-403c-94b4-01f27442a000-kube-api-access-zhn78\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:21.983031 master-0
kubenswrapper[7465]: I0320 08:36:21.982879 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh"] Mar 20 08:36:21.983617 master-0 kubenswrapper[7465]: I0320 08:36:21.983588 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" Mar 20 08:36:21.985972 master-0 kubenswrapper[7465]: I0320 08:36:21.985936 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 08:36:21.986087 master-0 kubenswrapper[7465]: I0320 08:36:21.986040 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 08:36:21.987896 master-0 kubenswrapper[7465]: I0320 08:36:21.987568 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 08:36:21.987896 master-0 kubenswrapper[7465]: I0320 08:36:21.987673 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 08:36:21.987896 master-0 kubenswrapper[7465]: I0320 08:36:21.987841 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 08:36:21.997272 master-0 kubenswrapper[7465]: I0320 08:36:21.997219 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-serving-cert\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" Mar 20 08:36:21.997511 master-0 kubenswrapper[7465]: I0320 08:36:21.997288 7465 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vpsg\" (UniqueName: \"kubernetes.io/projected/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-kube-api-access-8vpsg\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" Mar 20 08:36:21.997511 master-0 kubenswrapper[7465]: I0320 08:36:21.997367 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-client-ca\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" Mar 20 08:36:21.997511 master-0 kubenswrapper[7465]: I0320 08:36:21.997486 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-config\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" Mar 20 08:36:22.016981 master-0 kubenswrapper[7465]: I0320 08:36:22.016481 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh"] Mar 20 08:36:22.097858 master-0 kubenswrapper[7465]: I0320 08:36:22.097803 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-serving-cert\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" Mar 20 08:36:22.097858 master-0 
kubenswrapper[7465]: I0320 08:36:22.097868 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vpsg\" (UniqueName: \"kubernetes.io/projected/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-kube-api-access-8vpsg\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" Mar 20 08:36:22.098174 master-0 kubenswrapper[7465]: I0320 08:36:22.098046 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-client-ca\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" Mar 20 08:36:22.098174 master-0 kubenswrapper[7465]: E0320 08:36:22.098158 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:22.098940 master-0 kubenswrapper[7465]: E0320 08:36:22.098256 7465 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 20 08:36:22.098940 master-0 kubenswrapper[7465]: I0320 08:36:22.098278 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-config\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" Mar 20 08:36:22.098940 master-0 kubenswrapper[7465]: E0320 08:36:22.098315 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-client-ca podName:3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3 nodeName:}" failed. 
No retries permitted until 2026-03-20 08:36:22.598289462 +0000 UTC m=+8.241604952 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-client-ca") pod "route-controller-manager-6cd6978d68-4chhh" (UID: "3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3") : configmap "client-ca" not found Mar 20 08:36:22.099237 master-0 kubenswrapper[7465]: E0320 08:36:22.098351 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: configmap "config" not found Mar 20 08:36:22.099283 master-0 kubenswrapper[7465]: E0320 08:36:22.099247 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-config podName:3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:22.599218719 +0000 UTC m=+8.242534209 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-config") pod "route-controller-manager-6cd6978d68-4chhh" (UID: "3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3") : configmap "config" not found Mar 20 08:36:22.100846 master-0 kubenswrapper[7465]: E0320 08:36:22.100814 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-serving-cert podName:3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:22.600799155 +0000 UTC m=+8.244114645 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-serving-cert") pod "route-controller-manager-6cd6978d68-4chhh" (UID: "3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3") : secret "serving-cert" not found Mar 20 08:36:22.115880 master-0 kubenswrapper[7465]: I0320 08:36:22.115821 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vpsg\" (UniqueName: \"kubernetes.io/projected/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-kube-api-access-8vpsg\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" Mar 20 08:36:22.302025 master-0 kubenswrapper[7465]: I0320 08:36:22.301971 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-config\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6" Mar 20 08:36:22.302502 master-0 kubenswrapper[7465]: E0320 08:36:22.302225 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Mar 20 08:36:22.302502 master-0 kubenswrapper[7465]: I0320 08:36:22.302370 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-client-ca\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6" Mar 20 08:36:22.302696 master-0 kubenswrapper[7465]: E0320 08:36:22.302517 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-config 
podName:b416a87b-7f06-403c-94b4-01f27442a000 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.302484211 +0000 UTC m=+8.945799701 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-config") pod "controller-manager-797c5b47d8-xrgq6" (UID: "b416a87b-7f06-403c-94b4-01f27442a000") : configmap "config" not found Mar 20 08:36:22.302696 master-0 kubenswrapper[7465]: E0320 08:36:22.302558 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:22.302696 master-0 kubenswrapper[7465]: E0320 08:36:22.302628 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-client-ca podName:b416a87b-7f06-403c-94b4-01f27442a000 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.302609115 +0000 UTC m=+8.945924605 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-client-ca") pod "controller-manager-797c5b47d8-xrgq6" (UID: "b416a87b-7f06-403c-94b4-01f27442a000") : configmap "client-ca" not found Mar 20 08:36:22.302800 master-0 kubenswrapper[7465]: I0320 08:36:22.302703 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b416a87b-7f06-403c-94b4-01f27442a000-serving-cert\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6" Mar 20 08:36:22.302865 master-0 kubenswrapper[7465]: E0320 08:36:22.302846 7465 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 20 08:36:22.302937 master-0 kubenswrapper[7465]: E0320 08:36:22.302915 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b416a87b-7f06-403c-94b4-01f27442a000-serving-cert podName:b416a87b-7f06-403c-94b4-01f27442a000 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.302905553 +0000 UTC m=+8.946221043 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b416a87b-7f06-403c-94b4-01f27442a000-serving-cert") pod "controller-manager-797c5b47d8-xrgq6" (UID: "b416a87b-7f06-403c-94b4-01f27442a000") : secret "serving-cert" not found Mar 20 08:36:22.473085 master-0 kubenswrapper[7465]: I0320 08:36:22.471963 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-72j8t"] Mar 20 08:36:22.473085 master-0 kubenswrapper[7465]: I0320 08:36:22.472873 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:36:22.474784 master-0 kubenswrapper[7465]: I0320 08:36:22.474734 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 08:36:22.485218 master-0 kubenswrapper[7465]: I0320 08:36:22.484465 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-72j8t"] Mar 20 08:36:22.485218 master-0 kubenswrapper[7465]: I0320 08:36:22.484798 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 08:36:22.485218 master-0 kubenswrapper[7465]: I0320 08:36:22.485054 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 08:36:22.485218 master-0 kubenswrapper[7465]: I0320 08:36:22.485205 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 08:36:22.508139 master-0 kubenswrapper[7465]: I0320 08:36:22.508065 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-key\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:36:22.508511 master-0 kubenswrapper[7465]: I0320 08:36:22.508449 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-cabundle\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:36:22.508550 master-0 kubenswrapper[7465]: I0320 08:36:22.508528 7465 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdwvw\" (UniqueName: \"kubernetes.io/projected/2f844652-225b-4713-a9ad-cf9bcc348f47-kube-api-access-jdwvw\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:36:22.610528 master-0 kubenswrapper[7465]: I0320 08:36:22.610397 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-client-ca\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" Mar 20 08:36:22.610528 master-0 kubenswrapper[7465]: I0320 08:36:22.610460 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-cabundle\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:36:22.610528 master-0 kubenswrapper[7465]: I0320 08:36:22.610483 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdwvw\" (UniqueName: \"kubernetes.io/projected/2f844652-225b-4713-a9ad-cf9bcc348f47-kube-api-access-jdwvw\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:36:22.611599 master-0 kubenswrapper[7465]: E0320 08:36:22.610596 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:22.611599 master-0 kubenswrapper[7465]: I0320 08:36:22.610635 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-config\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" Mar 20 08:36:22.611599 master-0 kubenswrapper[7465]: E0320 08:36:22.610697 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-client-ca podName:3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.610678265 +0000 UTC m=+9.253993745 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-client-ca") pod "route-controller-manager-6cd6978d68-4chhh" (UID: "3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3") : configmap "client-ca" not found Mar 20 08:36:22.611599 master-0 kubenswrapper[7465]: I0320 08:36:22.610826 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-serving-cert\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" Mar 20 08:36:22.611599 master-0 kubenswrapper[7465]: E0320 08:36:22.610909 7465 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 20 08:36:22.611599 master-0 kubenswrapper[7465]: E0320 08:36:22.610933 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-serving-cert podName:3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.610927492 +0000 UTC m=+9.254242982 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-serving-cert") pod "route-controller-manager-6cd6978d68-4chhh" (UID: "3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3") : secret "serving-cert" not found Mar 20 08:36:22.611599 master-0 kubenswrapper[7465]: I0320 08:36:22.610978 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-key\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:36:22.611599 master-0 kubenswrapper[7465]: E0320 08:36:22.611048 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: configmap "config" not found Mar 20 08:36:22.611599 master-0 kubenswrapper[7465]: E0320 08:36:22.611147 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-config podName:3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:23.611124848 +0000 UTC m=+9.254440338 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-config") pod "route-controller-manager-6cd6978d68-4chhh" (UID: "3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3") : configmap "config" not found Mar 20 08:36:22.612724 master-0 kubenswrapper[7465]: I0320 08:36:22.612661 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-cabundle\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:36:22.614769 master-0 kubenswrapper[7465]: I0320 08:36:22.614739 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-key\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:36:22.628556 master-0 kubenswrapper[7465]: I0320 08:36:22.628521 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdwvw\" (UniqueName: \"kubernetes.io/projected/2f844652-225b-4713-a9ad-cf9bcc348f47-kube-api-access-jdwvw\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:36:22.747331 master-0 kubenswrapper[7465]: I0320 08:36:22.747268 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dd9wv" event={"ID":"ee3cc021-67d8-4b7f-b443-16f18228712e","Type":"ContainerStarted","Data":"89d9294562c55f84d7f5035d5fc91869611db748859a24623525e7ba4ce8193e"} Mar 20 08:36:22.837426 master-0 kubenswrapper[7465]: I0320 08:36:22.837340 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: I0320 08:36:23.222976 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: E0320 08:36:23.223152 7465 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: E0320 08:36:23.223253 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert podName:dab97c35-fe60-4134-8715-a7c6dd085fb3 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.223231531 +0000 UTC m=+16.866547021 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert") pod "cluster-version-operator-56d8475767-9gfkg" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3") : secret "cluster-version-operator-serving-cert" not found Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: I0320 08:36:23.223467 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: I0320 08:36:23.223539 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: I0320 08:36:23.223571 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: I0320 08:36:23.223605 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " 
pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: E0320 08:36:23.223653 7465 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: E0320 08:36:23.223677 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.223669634 +0000 UTC m=+16.866985124 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: E0320 08:36:23.224881 7465 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: E0320 08:36:23.224920 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls podName:45e8b72b-564c-4bb1-b911-baff2d6c87ad nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.22490894 +0000 UTC m=+16.868224610 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-nljsr" (UID: "45e8b72b-564c-4bb1-b911-baff2d6c87ad") : secret "cluster-monitoring-operator-tls" not found
Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: E0320 08:36:23.224974 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: E0320 08:36:23.224997 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert podName:7b489385-2c96-4a97-8b31-362162de020e nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.224989092 +0000 UTC m=+16.868304582 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert") pod "olm-operator-5c9796789-tjm9l" (UID: "7b489385-2c96-4a97-8b31-362162de020e") : secret "olm-operator-serving-cert" not found
Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: I0320 08:36:23.225038 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77"
Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: I0320 08:36:23.225066 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28"
Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: I0320 08:36:23.225103 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"
Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: I0320 08:36:23.225128 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8"
Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: I0320 08:36:23.225148 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm"
Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: I0320 08:36:23.225216 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: I0320 08:36:23.225266 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"
Mar 20 08:36:23.225295 master-0 kubenswrapper[7465]: I0320 08:36:23.225292 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: I0320 08:36:23.225502 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: I0320 08:36:23.225542 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: E0320 08:36:23.226238 7465 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: E0320 08:36:23.226278 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.2262671 +0000 UTC m=+16.869582590 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "node-tuning-operator-tls" not found
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: E0320 08:36:23.226294 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: E0320 08:36:23.226343 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert podName:bbc0b783-28d5-4554-b49d-c66082546f44 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.226330662 +0000 UTC m=+16.869646162 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-2pg77" (UID: "bbc0b783-28d5-4554-b49d-c66082546f44") : secret "package-server-manager-serving-cert" not found
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: E0320 08:36:23.226357 7465 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: E0320 08:36:23.226421 7465 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: E0320 08:36:23.226454 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls podName:b4291bfd-53d9-4c78-b7cb-d7eb46560528 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.226425644 +0000 UTC m=+16.869741314 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-fmhbq" (UID: "b4291bfd-53d9-4c78-b7cb-d7eb46560528") : secret "image-registry-operator-tls" not found
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: E0320 08:36:23.226461 7465 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: E0320 08:36:23.226468 7465 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: E0320 08:36:23.226532 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: E0320 08:36:23.226545 7465 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 20 08:36:23.226529 master-0 kubenswrapper[7465]: E0320 08:36:23.226485 7465 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 20 08:36:23.227307 master-0 kubenswrapper[7465]: E0320 08:36:23.226487 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.226470106 +0000 UTC m=+16.869785806 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : secret "metrics-daemon-secret" not found
Mar 20 08:36:23.227307 master-0 kubenswrapper[7465]: E0320 08:36:23.226491 7465 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 20 08:36:23.227307 master-0 kubenswrapper[7465]: E0320 08:36:23.226623 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics podName:acb704a9-6c8d-4378-ae93-e7095b1fce85 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.226598879 +0000 UTC m=+16.869914369 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-mvn4t" (UID: "acb704a9-6c8d-4378-ae93-e7095b1fce85") : secret "marketplace-operator-metrics" not found
Mar 20 08:36:23.227307 master-0 kubenswrapper[7465]: E0320 08:36:23.226642 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert podName:df428d5a-c722-4536-8e7f-cdd85c560481 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.22663396 +0000 UTC m=+16.869949450 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert") pod "catalog-operator-68f85b4d6c-fzm28" (UID: "df428d5a-c722-4536-8e7f-cdd85c560481") : secret "catalog-operator-serving-cert" not found
Mar 20 08:36:23.227307 master-0 kubenswrapper[7465]: E0320 08:36:23.226660 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls podName:42df77ec-94aa-48ba-bb35-7b1f1e8b8e97 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.226652661 +0000 UTC m=+16.869968151 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls") pod "machine-config-operator-84d549f6d5-gm4qr" (UID: "42df77ec-94aa-48ba-bb35-7b1f1e8b8e97") : secret "mco-proxy-tls" not found
Mar 20 08:36:23.227307 master-0 kubenswrapper[7465]: E0320 08:36:23.226676 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls podName:ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.226669231 +0000 UTC m=+16.869984721 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls") pod "ingress-operator-66b84d69b-gzg9m" (UID: "ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02") : secret "metrics-tls" not found
Mar 20 08:36:23.227307 master-0 kubenswrapper[7465]: E0320 08:36:23.226690 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert podName:ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.226683132 +0000 UTC m=+16.869998622 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vxzvg" (UID: "ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7") : secret "performance-addon-operator-webhook-cert" not found
Mar 20 08:36:23.227307 master-0 kubenswrapper[7465]: E0320 08:36:23.226604 7465 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 20 08:36:23.227307 master-0 kubenswrapper[7465]: E0320 08:36:23.226767 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.226721883 +0000 UTC m=+16.870037373 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-operator-tls" not found
Mar 20 08:36:23.227307 master-0 kubenswrapper[7465]: E0320 08:36:23.226801 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls podName:3f471ecc-922c-4cb1-9bdd-fdb5da08c592 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.226791005 +0000 UTC m=+16.870106495 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls") pod "dns-operator-9c5679d8f-r6dm8" (UID: "3f471ecc-922c-4cb1-9bdd-fdb5da08c592") : secret "metrics-tls" not found
Mar 20 08:36:23.329213 master-0 kubenswrapper[7465]: I0320 08:36:23.328527 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-client-ca\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:23.329213 master-0 kubenswrapper[7465]: I0320 08:36:23.329199 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b416a87b-7f06-403c-94b4-01f27442a000-serving-cert\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:23.329550 master-0 kubenswrapper[7465]: I0320 08:36:23.329302 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"
Mar 20 08:36:23.329550 master-0 kubenswrapper[7465]: I0320 08:36:23.329347 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-config\") pod \"controller-manager-797c5b47d8-xrgq6\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") " pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:23.329626 master-0 kubenswrapper[7465]: E0320 08:36:23.328911 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 20 08:36:23.329626 master-0 kubenswrapper[7465]: E0320 08:36:23.329587 7465 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 20 08:36:23.329696 master-0 kubenswrapper[7465]: E0320 08:36:23.329647 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-client-ca podName:b416a87b-7f06-403c-94b4-01f27442a000 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:25.329623076 +0000 UTC m=+10.972938746 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-client-ca") pod "controller-manager-797c5b47d8-xrgq6" (UID: "b416a87b-7f06-403c-94b4-01f27442a000") : configmap "client-ca" not found
Mar 20 08:36:23.329696 master-0 kubenswrapper[7465]: E0320 08:36:23.329674 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs podName:6a80bd6f-2263-4251-8197-5173193f8afc nodeName:}" failed. No retries permitted until 2026-03-20 08:36:31.329662947 +0000 UTC m=+16.972978647 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-5rrrh" (UID: "6a80bd6f-2263-4251-8197-5173193f8afc") : secret "multus-admission-controller-secret" not found
Mar 20 08:36:23.329696 master-0 kubenswrapper[7465]: E0320 08:36:23.329548 7465 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 20 08:36:23.329813 master-0 kubenswrapper[7465]: E0320 08:36:23.329703 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b416a87b-7f06-403c-94b4-01f27442a000-serving-cert podName:b416a87b-7f06-403c-94b4-01f27442a000 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:25.329697108 +0000 UTC m=+10.973012598 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b416a87b-7f06-403c-94b4-01f27442a000-serving-cert") pod "controller-manager-797c5b47d8-xrgq6" (UID: "b416a87b-7f06-403c-94b4-01f27442a000") : secret "serving-cert" not found
Mar 20 08:36:23.340203 master-0 kubenswrapper[7465]: E0320 08:36:23.330162 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found
Mar 20 08:36:23.340203 master-0 kubenswrapper[7465]: E0320 08:36:23.330315 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-config podName:b416a87b-7f06-403c-94b4-01f27442a000 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:25.330288135 +0000 UTC m=+10.973603785 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-config") pod "controller-manager-797c5b47d8-xrgq6" (UID: "b416a87b-7f06-403c-94b4-01f27442a000") : configmap "config" not found
Mar 20 08:36:23.558489 master-0 kubenswrapper[7465]: I0320 08:36:23.558439 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"]
Mar 20 08:36:23.558869 master-0 kubenswrapper[7465]: E0320 08:36:23.558830 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6" podUID="b416a87b-7f06-403c-94b4-01f27442a000"
Mar 20 08:36:23.583845 master-0 kubenswrapper[7465]: I0320 08:36:23.583770 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh"]
Mar 20 08:36:23.584757 master-0 kubenswrapper[7465]: E0320 08:36:23.584714 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh" podUID="3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3"
Mar 20 08:36:23.633322 master-0 kubenswrapper[7465]: I0320 08:36:23.633254 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-client-ca\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh"
Mar 20 08:36:23.633875 master-0 kubenswrapper[7465]: I0320 08:36:23.633359 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-config\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh"
Mar 20 08:36:23.633875 master-0 kubenswrapper[7465]: I0320 08:36:23.633487 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-serving-cert\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh"
Mar 20 08:36:23.633875 master-0 kubenswrapper[7465]: E0320 08:36:23.633632 7465 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 20 08:36:23.633875 master-0 kubenswrapper[7465]: E0320 08:36:23.633686 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-serving-cert podName:3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:25.633669148 +0000 UTC m=+11.276984638 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-serving-cert") pod "route-controller-manager-6cd6978d68-4chhh" (UID: "3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3") : secret "serving-cert" not found
Mar 20 08:36:23.634074 master-0 kubenswrapper[7465]: E0320 08:36:23.634047 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 20 08:36:23.634114 master-0 kubenswrapper[7465]: E0320 08:36:23.634081 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-client-ca podName:3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:25.63407405 +0000 UTC m=+11.277389540 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-client-ca") pod "route-controller-manager-6cd6978d68-4chhh" (UID: "3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3") : configmap "client-ca" not found
Mar 20 08:36:23.636359 master-0 kubenswrapper[7465]: I0320 08:36:23.635460 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-config\") pod \"route-controller-manager-6cd6978d68-4chhh\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") " pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh"
Mar 20 08:36:23.727102 master-0 kubenswrapper[7465]: I0320 08:36:23.726980 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:36:23.727407 master-0 kubenswrapper[7465]: I0320 08:36:23.727145 7465 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:36:23.727407 master-0 kubenswrapper[7465]: I0320 08:36:23.727159 7465 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:36:23.730801 master-0 kubenswrapper[7465]: I0320 08:36:23.730754 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:36:23.750865 master-0 kubenswrapper[7465]: I0320 08:36:23.750814 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh"
Mar 20 08:36:23.750865 master-0 kubenswrapper[7465]: I0320 08:36:23.750851 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:23.751409 master-0 kubenswrapper[7465]: I0320 08:36:23.751265 7465 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:36:23.758564 master-0 kubenswrapper[7465]: I0320 08:36:23.758524 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh"
Mar 20 08:36:23.761464 master-0 kubenswrapper[7465]: I0320 08:36:23.761430 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:23.835266 master-0 kubenswrapper[7465]: I0320 08:36:23.835033 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-proxy-ca-bundles\") pod \"b416a87b-7f06-403c-94b4-01f27442a000\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") "
Mar 20 08:36:23.835266 master-0 kubenswrapper[7465]: I0320 08:36:23.835125 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-config\") pod \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") "
Mar 20 08:36:23.835266 master-0 kubenswrapper[7465]: I0320 08:36:23.835176 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhn78\" (UniqueName: \"kubernetes.io/projected/b416a87b-7f06-403c-94b4-01f27442a000-kube-api-access-zhn78\") pod \"b416a87b-7f06-403c-94b4-01f27442a000\" (UID: \"b416a87b-7f06-403c-94b4-01f27442a000\") "
Mar 20 08:36:23.835266 master-0 kubenswrapper[7465]: I0320 08:36:23.835272 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vpsg\" (UniqueName: \"kubernetes.io/projected/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-kube-api-access-8vpsg\") pod \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\" (UID: \"3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3\") "
Mar 20 08:36:23.835966 master-0 kubenswrapper[7465]: I0320 08:36:23.835852 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-config" (OuterVolumeSpecName: "config") pod "3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3" (UID: "3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:36:23.836222 master-0 kubenswrapper[7465]: I0320 08:36:23.836161 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b416a87b-7f06-403c-94b4-01f27442a000" (UID: "b416a87b-7f06-403c-94b4-01f27442a000"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:36:23.839584 master-0 kubenswrapper[7465]: I0320 08:36:23.839535 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b416a87b-7f06-403c-94b4-01f27442a000-kube-api-access-zhn78" (OuterVolumeSpecName: "kube-api-access-zhn78") pod "b416a87b-7f06-403c-94b4-01f27442a000" (UID: "b416a87b-7f06-403c-94b4-01f27442a000"). InnerVolumeSpecName "kube-api-access-zhn78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:36:23.841408 master-0 kubenswrapper[7465]: I0320 08:36:23.841342 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-kube-api-access-8vpsg" (OuterVolumeSpecName: "kube-api-access-8vpsg") pod "3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3" (UID: "3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3"). InnerVolumeSpecName "kube-api-access-8vpsg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:36:23.937067 master-0 kubenswrapper[7465]: I0320 08:36:23.936971 7465 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-config\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:23.937067 master-0 kubenswrapper[7465]: I0320 08:36:23.937032 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhn78\" (UniqueName: \"kubernetes.io/projected/b416a87b-7f06-403c-94b4-01f27442a000-kube-api-access-zhn78\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:23.937067 master-0 kubenswrapper[7465]: I0320 08:36:23.937052 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vpsg\" (UniqueName: \"kubernetes.io/projected/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-kube-api-access-8vpsg\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:23.937067 master-0 kubenswrapper[7465]: I0320 08:36:23.937067 7465 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:24.752972 master-0 kubenswrapper[7465]: I0320 08:36:24.752932 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"
Mar 20 08:36:24.753786 master-0 kubenswrapper[7465]: I0320 08:36:24.752946 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh"
Mar 20 08:36:24.878590 master-0 kubenswrapper[7465]: I0320 08:36:24.877878 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"]
Mar 20 08:36:24.891981 master-0 kubenswrapper[7465]: I0320 08:36:24.887152 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"
Mar 20 08:36:24.891981 master-0 kubenswrapper[7465]: I0320 08:36:24.888922 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"]
Mar 20 08:36:24.892744 master-0 kubenswrapper[7465]: I0320 08:36:24.892624 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"]
Mar 20 08:36:24.895335 master-0 kubenswrapper[7465]: I0320 08:36:24.894558 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-797c5b47d8-xrgq6"]
Mar 20 08:36:24.903705 master-0 kubenswrapper[7465]: I0320 08:36:24.901315 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 08:36:24.903705 master-0 kubenswrapper[7465]: I0320 08:36:24.901374 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 08:36:24.903705 master-0 kubenswrapper[7465]: I0320 08:36:24.902393 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 08:36:24.903705 master-0 kubenswrapper[7465]: I0320 08:36:24.903474 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 08:36:24.903705 master-0 kubenswrapper[7465]: I0320 08:36:24.903684 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 08:36:24.904049 master-0 kubenswrapper[7465]: I0320 08:36:24.903762 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh"]
Mar 20 08:36:24.911018 master-0 kubenswrapper[7465]: I0320 08:36:24.909989 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 08:36:24.920680 master-0 kubenswrapper[7465]: I0320 08:36:24.920166 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cd6978d68-4chhh"]
Mar 20 08:36:24.934276 master-0 kubenswrapper[7465]: I0320 08:36:24.930866 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:36:24.942547 master-0 kubenswrapper[7465]: I0320 08:36:24.940947 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-72j8t"]
Mar 20 08:36:24.961278 master-0 kubenswrapper[7465]: I0320 08:36:24.956923 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-config\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"
Mar 20 08:36:24.961278 master-0 kubenswrapper[7465]: I0320 08:36:24.957460 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"
Mar 20 08:36:24.961278 master-0 kubenswrapper[7465]: I0320 08:36:24.957488 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9plc\" (UniqueName: \"kubernetes.io/projected/39288b15-7841-4449-8814-4250f6ba5db0-kube-api-access-z9plc\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"
Mar 20 08:36:24.961278 master-0 kubenswrapper[7465]: I0320 08:36:24.957608 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-proxy-ca-bundles\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"
Mar 20 08:36:24.961278 master-0 kubenswrapper[7465]: I0320 08:36:24.957656 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"
Mar 20 08:36:24.961278 master-0 kubenswrapper[7465]: I0320 08:36:24.957719 7465 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:24.961278 master-0 kubenswrapper[7465]: I0320 08:36:24.957732 7465 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:24.961278 master-0 kubenswrapper[7465]: I0320 08:36:24.957743 7465 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b416a87b-7f06-403c-94b4-01f27442a000-config\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:24.961278 master-0 kubenswrapper[7465]: I0320 08:36:24.957754 7465 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b416a87b-7f06-403c-94b4-01f27442a000-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:24.961278 master-0 kubenswrapper[7465]: I0320 08:36:24.957766 7465 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:24.984609 master-0 kubenswrapper[7465]: I0320 08:36:24.984505 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:36:25.026541 master-0 kubenswrapper[7465]: E0320 08:36:25.026480 7465 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86cb5d23_df7f_4f67_8086_1789d8e68544.slice/crio-e5ff2b8e0eaf0ec4027bec63b2223736cf7655c761ce84b8c07f00889c293502.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86cb5d23_df7f_4f67_8086_1789d8e68544.slice/crio-conmon-e5ff2b8e0eaf0ec4027bec63b2223736cf7655c761ce84b8c07f00889c293502.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 08:36:25.059835 master-0 kubenswrapper[7465]: I0320 08:36:25.059772 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-proxy-ca-bundles\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"
Mar 20 08:36:25.059968 master-0 kubenswrapper[7465]: I0320 08:36:25.059869 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert\") pod 
\"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:25.059968 master-0 kubenswrapper[7465]: I0320 08:36:25.059953 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-config\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:25.060046 master-0 kubenswrapper[7465]: I0320 08:36:25.059996 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:25.060046 master-0 kubenswrapper[7465]: I0320 08:36:25.060022 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9plc\" (UniqueName: \"kubernetes.io/projected/39288b15-7841-4449-8814-4250f6ba5db0-kube-api-access-z9plc\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:25.060392 master-0 kubenswrapper[7465]: E0320 08:36:25.060334 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:25.060623 master-0 kubenswrapper[7465]: E0320 08:36:25.060601 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca podName:39288b15-7841-4449-8814-4250f6ba5db0 nodeName:}" failed. 
No retries permitted until 2026-03-20 08:36:25.560558029 +0000 UTC m=+11.203873519 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca") pod "controller-manager-f9d74d9cf-ks2l5" (UID: "39288b15-7841-4449-8814-4250f6ba5db0") : configmap "client-ca" not found Mar 20 08:36:25.061134 master-0 kubenswrapper[7465]: E0320 08:36:25.060741 7465 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 20 08:36:25.061254 master-0 kubenswrapper[7465]: E0320 08:36:25.061242 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert podName:39288b15-7841-4449-8814-4250f6ba5db0 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:25.561232268 +0000 UTC m=+11.204547758 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert") pod "controller-manager-f9d74d9cf-ks2l5" (UID: "39288b15-7841-4449-8814-4250f6ba5db0") : secret "serving-cert" not found Mar 20 08:36:25.062629 master-0 kubenswrapper[7465]: I0320 08:36:25.062588 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-proxy-ca-bundles\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:25.063472 master-0 kubenswrapper[7465]: I0320 08:36:25.063413 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-config\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " 
pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:25.068311 master-0 kubenswrapper[7465]: I0320 08:36:25.068282 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:25.068431 master-0 kubenswrapper[7465]: I0320 08:36:25.068409 7465 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:36:25.077051 master-0 kubenswrapper[7465]: I0320 08:36:25.077004 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:36:25.109163 master-0 kubenswrapper[7465]: I0320 08:36:25.100115 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9plc\" (UniqueName: \"kubernetes.io/projected/39288b15-7841-4449-8814-4250f6ba5db0-kube-api-access-z9plc\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:25.174011 master-0 kubenswrapper[7465]: I0320 08:36:25.173929 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:25.200045 master-0 kubenswrapper[7465]: I0320 08:36:25.199993 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:25.567062 master-0 kubenswrapper[7465]: I0320 08:36:25.566983 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:25.567444 master-0 kubenswrapper[7465]: E0320 08:36:25.567273 7465 configmap.go:193] Couldn't 
get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:25.567444 master-0 kubenswrapper[7465]: E0320 08:36:25.567434 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca podName:39288b15-7841-4449-8814-4250f6ba5db0 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:26.56739833 +0000 UTC m=+12.210713840 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca") pod "controller-manager-f9d74d9cf-ks2l5" (UID: "39288b15-7841-4449-8814-4250f6ba5db0") : configmap "client-ca" not found Mar 20 08:36:25.567643 master-0 kubenswrapper[7465]: I0320 08:36:25.567616 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:25.567944 master-0 kubenswrapper[7465]: E0320 08:36:25.567874 7465 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 20 08:36:25.568112 master-0 kubenswrapper[7465]: E0320 08:36:25.568028 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert podName:39288b15-7841-4449-8814-4250f6ba5db0 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:26.567997967 +0000 UTC m=+12.211313657 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert") pod "controller-manager-f9d74d9cf-ks2l5" (UID: "39288b15-7841-4449-8814-4250f6ba5db0") : secret "serving-cert" not found Mar 20 08:36:25.760817 master-0 kubenswrapper[7465]: I0320 08:36:25.760742 7465 generic.go:334] "Generic (PLEG): container finished" podID="86cb5d23-df7f-4f67-8086-1789d8e68544" containerID="e5ff2b8e0eaf0ec4027bec63b2223736cf7655c761ce84b8c07f00889c293502" exitCode=0 Mar 20 08:36:25.760817 master-0 kubenswrapper[7465]: I0320 08:36:25.760815 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" event={"ID":"86cb5d23-df7f-4f67-8086-1789d8e68544","Type":"ContainerDied","Data":"e5ff2b8e0eaf0ec4027bec63b2223736cf7655c761ce84b8c07f00889c293502"} Mar 20 08:36:25.766386 master-0 kubenswrapper[7465]: I0320 08:36:25.766302 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr" event={"ID":"2bf90db0-f943-464c-8599-e36b4fc32e1c","Type":"ContainerStarted","Data":"f56cbcb6003f86134027b553b740ce400f8478f47bcb39227381ffc5427ea999"} Mar 20 08:36:25.766386 master-0 kubenswrapper[7465]: I0320 08:36:25.766375 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr" event={"ID":"2bf90db0-f943-464c-8599-e36b4fc32e1c","Type":"ContainerStarted","Data":"205e4e19a489854c9e21dadf12222f24c4ca924c96c05925ba16193713f47edd"} Mar 20 08:36:25.769083 master-0 kubenswrapper[7465]: I0320 08:36:25.769018 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" event={"ID":"96de6024-e20f-4b52-9294-b330d65e4153","Type":"ContainerStarted","Data":"08ddad0132f1249a0926348017f97a2ace0dd09f6ef4f6ff1405a91cd7359f8f"} Mar 20 08:36:25.771314 master-0 
kubenswrapper[7465]: I0320 08:36:25.771278 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" event={"ID":"ab175f7e-a5e8-4fda-98c9-6d052a221a83","Type":"ContainerStarted","Data":"32a6a1f6b15d34dd1d0099bb33c1df2bcbe7798374f91764aaa7bb5b94a96471"} Mar 20 08:36:25.771752 master-0 kubenswrapper[7465]: I0320 08:36:25.771677 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:36:25.772924 master-0 kubenswrapper[7465]: I0320 08:36:25.772839 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" event={"ID":"2f844652-225b-4713-a9ad-cf9bcc348f47","Type":"ContainerStarted","Data":"cdc09fd6c3bb18aaf3523f814928e0e85e0c65581ea0a2f8e18d09f87a8cff20"} Mar 20 08:36:25.772924 master-0 kubenswrapper[7465]: I0320 08:36:25.772907 7465 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:36:25.773080 master-0 kubenswrapper[7465]: I0320 08:36:25.772932 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" event={"ID":"2f844652-225b-4713-a9ad-cf9bcc348f47","Type":"ContainerStarted","Data":"fee33178d398a85728734b8702eecb787d89c780d680fd9fa904a7591c14e420"} Mar 20 08:36:25.812307 master-0 kubenswrapper[7465]: I0320 08:36:25.812175 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" podStartSLOduration=2.199628139 podStartE2EDuration="5.812148052s" podCreationTimestamp="2026-03-20 08:36:20 +0000 UTC" firstStartedPulling="2026-03-20 08:36:21.061075003 +0000 UTC m=+6.704390493" lastFinishedPulling="2026-03-20 08:36:24.673594916 +0000 UTC m=+10.316910406" observedRunningTime="2026-03-20 08:36:25.81070575 +0000 UTC m=+11.454021240" watchObservedRunningTime="2026-03-20 
08:36:25.812148052 +0000 UTC m=+11.455463542" Mar 20 08:36:25.838007 master-0 kubenswrapper[7465]: I0320 08:36:25.837782 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr" podStartSLOduration=2.071302655 podStartE2EDuration="5.83775509s" podCreationTimestamp="2026-03-20 08:36:20 +0000 UTC" firstStartedPulling="2026-03-20 08:36:20.926889397 +0000 UTC m=+6.570204887" lastFinishedPulling="2026-03-20 08:36:24.693341832 +0000 UTC m=+10.336657322" observedRunningTime="2026-03-20 08:36:25.835237726 +0000 UTC m=+11.478553216" watchObservedRunningTime="2026-03-20 08:36:25.83775509 +0000 UTC m=+11.481070590" Mar 20 08:36:25.855412 master-0 kubenswrapper[7465]: I0320 08:36:25.855267 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" podStartSLOduration=3.85523991 podStartE2EDuration="3.85523991s" podCreationTimestamp="2026-03-20 08:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:36:25.853807278 +0000 UTC m=+11.497122788" watchObservedRunningTime="2026-03-20 08:36:25.85523991 +0000 UTC m=+11.498555400" Mar 20 08:36:26.548069 master-0 kubenswrapper[7465]: I0320 08:36:26.547174 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3" path="/var/lib/kubelet/pods/3dd3caf0-4503-4bce-8d8b-b9b4eed4d8d3/volumes" Mar 20 08:36:26.548806 master-0 kubenswrapper[7465]: I0320 08:36:26.548786 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b416a87b-7f06-403c-94b4-01f27442a000" path="/var/lib/kubelet/pods/b416a87b-7f06-403c-94b4-01f27442a000/volumes" Mar 20 08:36:26.585211 master-0 kubenswrapper[7465]: I0320 08:36:26.582238 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:26.585211 master-0 kubenswrapper[7465]: I0320 08:36:26.582469 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:26.585211 master-0 kubenswrapper[7465]: E0320 08:36:26.582689 7465 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 20 08:36:26.585211 master-0 kubenswrapper[7465]: E0320 08:36:26.582770 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert podName:39288b15-7841-4449-8814-4250f6ba5db0 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:28.582746731 +0000 UTC m=+14.226062231 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert") pod "controller-manager-f9d74d9cf-ks2l5" (UID: "39288b15-7841-4449-8814-4250f6ba5db0") : secret "serving-cert" not found Mar 20 08:36:26.585211 master-0 kubenswrapper[7465]: E0320 08:36:26.583340 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:26.585211 master-0 kubenswrapper[7465]: E0320 08:36:26.583397 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca podName:39288b15-7841-4449-8814-4250f6ba5db0 nodeName:}" failed. 
No retries permitted until 2026-03-20 08:36:28.58338356 +0000 UTC m=+14.226699050 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca") pod "controller-manager-f9d74d9cf-ks2l5" (UID: "39288b15-7841-4449-8814-4250f6ba5db0") : configmap "client-ca" not found Mar 20 08:36:26.778562 master-0 kubenswrapper[7465]: I0320 08:36:26.778491 7465 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:36:26.895659 master-0 kubenswrapper[7465]: I0320 08:36:26.895506 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499"] Mar 20 08:36:26.896219 master-0 kubenswrapper[7465]: I0320 08:36:26.896158 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:26.900339 master-0 kubenswrapper[7465]: I0320 08:36:26.900298 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 08:36:26.900646 master-0 kubenswrapper[7465]: I0320 08:36:26.900620 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 08:36:26.900772 master-0 kubenswrapper[7465]: I0320 08:36:26.900751 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 08:36:26.900888 master-0 kubenswrapper[7465]: I0320 08:36:26.900866 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 08:36:26.901054 master-0 kubenswrapper[7465]: I0320 08:36:26.901030 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 08:36:26.912264 master-0 
kubenswrapper[7465]: I0320 08:36:26.912217 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499"] Mar 20 08:36:26.987396 master-0 kubenswrapper[7465]: I0320 08:36:26.986985 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:26.987396 master-0 kubenswrapper[7465]: I0320 08:36:26.987085 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmsks\" (UniqueName: \"kubernetes.io/projected/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-kube-api-access-wmsks\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:26.987396 master-0 kubenswrapper[7465]: I0320 08:36:26.987194 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:26.987396 master-0 kubenswrapper[7465]: I0320 08:36:26.987278 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-config\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " 
pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:27.088425 master-0 kubenswrapper[7465]: I0320 08:36:27.088278 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-config\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:27.088425 master-0 kubenswrapper[7465]: I0320 08:36:27.088352 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:27.089739 master-0 kubenswrapper[7465]: I0320 08:36:27.088833 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmsks\" (UniqueName: \"kubernetes.io/projected/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-kube-api-access-wmsks\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:27.089739 master-0 kubenswrapper[7465]: I0320 08:36:27.088918 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:27.089739 master-0 kubenswrapper[7465]: E0320 08:36:27.089072 7465 secret.go:189] Couldn't get secret 
openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 20 08:36:27.089739 master-0 kubenswrapper[7465]: E0320 08:36:27.089133 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert podName:14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:27.589112027 +0000 UTC m=+13.232427517 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert") pod "route-controller-manager-5ff6bdb8cf-b6499" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6") : secret "serving-cert" not found Mar 20 08:36:27.089739 master-0 kubenswrapper[7465]: E0320 08:36:27.089537 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:27.089739 master-0 kubenswrapper[7465]: E0320 08:36:27.089571 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca podName:14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:27.58955893 +0000 UTC m=+13.232874420 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca") pod "route-controller-manager-5ff6bdb8cf-b6499" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6") : configmap "client-ca" not found Mar 20 08:36:27.089739 master-0 kubenswrapper[7465]: I0320 08:36:27.089593 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-config\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:27.105987 master-0 kubenswrapper[7465]: I0320 08:36:27.105897 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmsks\" (UniqueName: \"kubernetes.io/projected/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-kube-api-access-wmsks\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:27.595472 master-0 kubenswrapper[7465]: I0320 08:36:27.595364 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:27.595757 master-0 kubenswrapper[7465]: I0320 08:36:27.595554 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " 
pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:27.595757 master-0 kubenswrapper[7465]: E0320 08:36:27.595563 7465 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 20 08:36:27.595757 master-0 kubenswrapper[7465]: E0320 08:36:27.595636 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert podName:14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:28.595617629 +0000 UTC m=+14.238933119 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert") pod "route-controller-manager-5ff6bdb8cf-b6499" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6") : secret "serving-cert" not found Mar 20 08:36:27.595757 master-0 kubenswrapper[7465]: E0320 08:36:27.595671 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:27.595757 master-0 kubenswrapper[7465]: E0320 08:36:27.595704 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca podName:14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:28.595694151 +0000 UTC m=+14.239009641 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca") pod "route-controller-manager-5ff6bdb8cf-b6499" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6") : configmap "client-ca" not found Mar 20 08:36:28.608129 master-0 kubenswrapper[7465]: I0320 08:36:28.608089 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:28.608904 master-0 kubenswrapper[7465]: I0320 08:36:28.608876 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:28.609036 master-0 kubenswrapper[7465]: I0320 08:36:28.609013 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:28.609223 master-0 kubenswrapper[7465]: I0320 08:36:28.609209 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " 
pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:28.609497 master-0 kubenswrapper[7465]: E0320 08:36:28.609447 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:28.609603 master-0 kubenswrapper[7465]: E0320 08:36:28.609579 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca podName:39288b15-7841-4449-8814-4250f6ba5db0 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:32.609543798 +0000 UTC m=+18.252859328 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca") pod "controller-manager-f9d74d9cf-ks2l5" (UID: "39288b15-7841-4449-8814-4250f6ba5db0") : configmap "client-ca" not found Mar 20 08:36:28.609838 master-0 kubenswrapper[7465]: E0320 08:36:28.609775 7465 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 20 08:36:28.609911 master-0 kubenswrapper[7465]: E0320 08:36:28.609889 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert podName:39288b15-7841-4449-8814-4250f6ba5db0 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:32.609863417 +0000 UTC m=+18.253178907 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert") pod "controller-manager-f9d74d9cf-ks2l5" (UID: "39288b15-7841-4449-8814-4250f6ba5db0") : secret "serving-cert" not found Mar 20 08:36:28.610001 master-0 kubenswrapper[7465]: E0320 08:36:28.609945 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:28.610061 master-0 kubenswrapper[7465]: E0320 08:36:28.610038 7465 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 20 08:36:28.610098 master-0 kubenswrapper[7465]: E0320 08:36:28.610074 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert podName:14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:30.610065653 +0000 UTC m=+16.253381143 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert") pod "route-controller-manager-5ff6bdb8cf-b6499" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6") : secret "serving-cert" not found Mar 20 08:36:28.610098 master-0 kubenswrapper[7465]: E0320 08:36:28.610094 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca podName:14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:30.610085344 +0000 UTC m=+16.253400834 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca") pod "route-controller-manager-5ff6bdb8cf-b6499" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6") : configmap "client-ca" not found Mar 20 08:36:28.794152 master-0 kubenswrapper[7465]: I0320 08:36:28.793742 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" event={"ID":"86cb5d23-df7f-4f67-8086-1789d8e68544","Type":"ContainerStarted","Data":"aba047bf62fdafe49170c75e4151358bb7421fff8f12793b7c505a061479cedc"} Mar 20 08:36:28.949074 master-0 kubenswrapper[7465]: I0320 08:36:28.949004 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:36:30.368094 master-0 kubenswrapper[7465]: I0320 08:36:30.368045 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-67dcd4998-c5742_86cb5d23-df7f-4f67-8086-1789d8e68544/cluster-olm-operator/0.log" Mar 20 08:36:30.370722 master-0 kubenswrapper[7465]: I0320 08:36:30.370684 7465 generic.go:334] "Generic (PLEG): container finished" podID="86cb5d23-df7f-4f67-8086-1789d8e68544" containerID="aba047bf62fdafe49170c75e4151358bb7421fff8f12793b7c505a061479cedc" exitCode=255 Mar 20 08:36:30.370857 master-0 kubenswrapper[7465]: I0320 08:36:30.370829 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" event={"ID":"86cb5d23-df7f-4f67-8086-1789d8e68544","Type":"ContainerDied","Data":"aba047bf62fdafe49170c75e4151358bb7421fff8f12793b7c505a061479cedc"} Mar 20 08:36:30.371621 master-0 kubenswrapper[7465]: I0320 08:36:30.371600 7465 scope.go:117] "RemoveContainer" containerID="aba047bf62fdafe49170c75e4151358bb7421fff8f12793b7c505a061479cedc" Mar 20 08:36:30.649793 master-0 kubenswrapper[7465]: I0320 
08:36:30.649598 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:30.650075 master-0 kubenswrapper[7465]: E0320 08:36:30.649988 7465 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 20 08:36:30.650740 master-0 kubenswrapper[7465]: E0320 08:36:30.650715 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert podName:14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:34.650698744 +0000 UTC m=+20.294014234 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert") pod "route-controller-manager-5ff6bdb8cf-b6499" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6") : secret "serving-cert" not found Mar 20 08:36:30.651811 master-0 kubenswrapper[7465]: I0320 08:36:30.651776 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:30.651881 master-0 kubenswrapper[7465]: E0320 08:36:30.651847 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:30.651921 master-0 kubenswrapper[7465]: E0320 08:36:30.651884 7465 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca podName:14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:34.651874308 +0000 UTC m=+20.295189798 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca") pod "route-controller-manager-5ff6bdb8cf-b6499" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6") : configmap "client-ca" not found Mar 20 08:36:31.261503 master-0 kubenswrapper[7465]: I0320 08:36:31.261433 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:31.261503 master-0 kubenswrapper[7465]: I0320 08:36:31.261503 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:31.261503 master-0 kubenswrapper[7465]: I0320 08:36:31.261533 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:31.261907 master-0 kubenswrapper[7465]: I0320 
08:36:31.261568 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:31.261907 master-0 kubenswrapper[7465]: I0320 08:36:31.261596 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:36:31.261907 master-0 kubenswrapper[7465]: I0320 08:36:31.261614 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:31.261907 master-0 kubenswrapper[7465]: I0320 08:36:31.261639 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:31.262064 master-0 kubenswrapper[7465]: E0320 08:36:31.261919 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 20 08:36:31.262064 master-0 kubenswrapper[7465]: E0320 08:36:31.261988 7465 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert podName:7b489385-2c96-4a97-8b31-362162de020e nodeName:}" failed. No retries permitted until 2026-03-20 08:36:47.261967943 +0000 UTC m=+32.905283433 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert") pod "olm-operator-5c9796789-tjm9l" (UID: "7b489385-2c96-4a97-8b31-362162de020e") : secret "olm-operator-serving-cert" not found Mar 20 08:36:31.262582 master-0 kubenswrapper[7465]: E0320 08:36:31.262544 7465 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:31.262661 master-0 kubenswrapper[7465]: E0320 08:36:31.262587 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:47.262576961 +0000 UTC m=+32.905892451 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-webhook-server-cert" not found Mar 20 08:36:31.262661 master-0 kubenswrapper[7465]: I0320 08:36:31.262617 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:31.262661 master-0 kubenswrapper[7465]: I0320 08:36:31.262654 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:31.262812 master-0 kubenswrapper[7465]: I0320 08:36:31.262685 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:31.262812 master-0 kubenswrapper[7465]: I0320 08:36:31.262707 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: 
\"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:31.262812 master-0 kubenswrapper[7465]: I0320 08:36:31.262737 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:31.262812 master-0 kubenswrapper[7465]: I0320 08:36:31.262758 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:31.262812 master-0 kubenswrapper[7465]: I0320 08:36:31.262773 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:31.262812 master-0 kubenswrapper[7465]: I0320 08:36:31.262797 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:31.263083 master-0 kubenswrapper[7465]: E0320 08:36:31.263032 7465 secret.go:189] Couldn't get secret 
openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 20 08:36:31.263208 master-0 kubenswrapper[7465]: E0320 08:36:31.263165 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert podName:df428d5a-c722-4536-8e7f-cdd85c560481 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:47.263132257 +0000 UTC m=+32.906447787 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert") pod "catalog-operator-68f85b4d6c-fzm28" (UID: "df428d5a-c722-4536-8e7f-cdd85c560481") : secret "catalog-operator-serving-cert" not found Mar 20 08:36:31.263325 master-0 kubenswrapper[7465]: E0320 08:36:31.263298 7465 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 20 08:36:31.263378 master-0 kubenswrapper[7465]: E0320 08:36:31.263355 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics podName:acb704a9-6c8d-4378-ae93-e7095b1fce85 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:47.263339373 +0000 UTC m=+32.906654903 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-mvn4t" (UID: "acb704a9-6c8d-4378-ae93-e7095b1fce85") : secret "marketplace-operator-metrics" not found Mar 20 08:36:31.263475 master-0 kubenswrapper[7465]: E0320 08:36:31.263450 7465 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 20 08:36:31.263524 master-0 kubenswrapper[7465]: E0320 08:36:31.263499 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls podName:42df77ec-94aa-48ba-bb35-7b1f1e8b8e97 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:47.263487297 +0000 UTC m=+32.906802827 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls") pod "machine-config-operator-84d549f6d5-gm4qr" (UID: "42df77ec-94aa-48ba-bb35-7b1f1e8b8e97") : secret "mco-proxy-tls" not found Mar 20 08:36:31.263603 master-0 kubenswrapper[7465]: E0320 08:36:31.263581 7465 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 20 08:36:31.263651 master-0 kubenswrapper[7465]: E0320 08:36:31.263628 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert podName:bbc0b783-28d5-4554-b49d-c66082546f44 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:47.263613921 +0000 UTC m=+32.906929411 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-2pg77" (UID: "bbc0b783-28d5-4554-b49d-c66082546f44") : secret "package-server-manager-serving-cert" not found Mar 20 08:36:31.267525 master-0 kubenswrapper[7465]: E0320 08:36:31.267461 7465 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 20 08:36:31.267643 master-0 kubenswrapper[7465]: E0320 08:36:31.267611 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs podName:813f91c2-2b37-4681-968d-4217e286e22f nodeName:}" failed. No retries permitted until 2026-03-20 08:36:47.267576286 +0000 UTC m=+32.910891786 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs") pod "network-metrics-daemon-srdjm" (UID: "813f91c2-2b37-4681-968d-4217e286e22f") : secret "metrics-daemon-secret" not found Mar 20 08:36:31.267760 master-0 kubenswrapper[7465]: E0320 08:36:31.267701 7465 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 20 08:36:31.267820 master-0 kubenswrapper[7465]: E0320 08:36:31.267760 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls podName:f53bc282-5937-49ac-ac98-2ee37ccb268d nodeName:}" failed. No retries permitted until 2026-03-20 08:36:47.267745941 +0000 UTC m=+32.911061631 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-dv6cd" (UID: "f53bc282-5937-49ac-ac98-2ee37ccb268d") : secret "cluster-baremetal-operator-tls" not found Mar 20 08:36:31.267869 master-0 kubenswrapper[7465]: E0320 08:36:31.267843 7465 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:31.267913 master-0 kubenswrapper[7465]: E0320 08:36:31.267889 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls podName:45e8b72b-564c-4bb1-b911-baff2d6c87ad nodeName:}" failed. No retries permitted until 2026-03-20 08:36:47.267873285 +0000 UTC m=+32.911188875 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-nljsr" (UID: "45e8b72b-564c-4bb1-b911-baff2d6c87ad") : secret "cluster-monitoring-operator-tls" not found Mar 20 08:36:31.268893 master-0 kubenswrapper[7465]: I0320 08:36:31.268857 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:31.269067 master-0 kubenswrapper[7465]: I0320 08:36:31.269020 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod 
\"cluster-version-operator-56d8475767-9gfkg\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:36:31.269258 master-0 kubenswrapper[7465]: I0320 08:36:31.269236 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:31.269635 master-0 kubenswrapper[7465]: I0320 08:36:31.269580 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:31.272486 master-0 kubenswrapper[7465]: I0320 08:36:31.272445 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:31.274968 master-0 kubenswrapper[7465]: I0320 08:36:31.274912 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:31.363757 master-0 kubenswrapper[7465]: I0320 08:36:31.363691 7465 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:31.364054 master-0 kubenswrapper[7465]: E0320 08:36:31.363954 7465 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 20 08:36:31.364102 master-0 kubenswrapper[7465]: E0320 08:36:31.364057 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs podName:6a80bd6f-2263-4251-8197-5173193f8afc nodeName:}" failed. No retries permitted until 2026-03-20 08:36:47.364031731 +0000 UTC m=+33.007347321 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-5rrrh" (UID: "6a80bd6f-2263-4251-8197-5173193f8afc") : secret "multus-admission-controller-secret" not found Mar 20 08:36:31.376704 master-0 kubenswrapper[7465]: I0320 08:36:31.376631 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" event={"ID":"aa16c3bf-2350-46d1-afa0-9477b3ec8877","Type":"ContainerStarted","Data":"222440b4a2f7299de95ce041a034d3160fcac83fac650064e342b5c86cfa35c1"} Mar 20 08:36:31.378464 master-0 kubenswrapper[7465]: I0320 08:36:31.378421 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" event={"ID":"f046860d-2d54-4746-8ba2-f8e90fa55e38","Type":"ContainerStarted","Data":"0bd5b879aadefbf9a44c6c6dcf866736c73041c10aa61588b2570b45b21eb4d2"} Mar 20 08:36:31.380283 
master-0 kubenswrapper[7465]: I0320 08:36:31.380228 7465 generic.go:334] "Generic (PLEG): container finished" podID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerID="32a6a1f6b15d34dd1d0099bb33c1df2bcbe7798374f91764aaa7bb5b94a96471" exitCode=0 Mar 20 08:36:31.380377 master-0 kubenswrapper[7465]: I0320 08:36:31.380310 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" event={"ID":"ab175f7e-a5e8-4fda-98c9-6d052a221a83","Type":"ContainerDied","Data":"32a6a1f6b15d34dd1d0099bb33c1df2bcbe7798374f91764aaa7bb5b94a96471"} Mar 20 08:36:31.380925 master-0 kubenswrapper[7465]: I0320 08:36:31.380883 7465 scope.go:117] "RemoveContainer" containerID="32a6a1f6b15d34dd1d0099bb33c1df2bcbe7798374f91764aaa7bb5b94a96471" Mar 20 08:36:31.384242 master-0 kubenswrapper[7465]: I0320 08:36:31.382733 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-67dcd4998-c5742_86cb5d23-df7f-4f67-8086-1789d8e68544/cluster-olm-operator/0.log" Mar 20 08:36:31.384242 master-0 kubenswrapper[7465]: I0320 08:36:31.383732 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" event={"ID":"86cb5d23-df7f-4f67-8086-1789d8e68544","Type":"ContainerStarted","Data":"517434b092860d80f200ad453a8ab960ca389e8d7a3ffc04820cc51b48ee30fe"} Mar 20 08:36:31.398393 master-0 kubenswrapper[7465]: I0320 08:36:31.395793 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:36:31.398393 master-0 kubenswrapper[7465]: I0320 08:36:31.396696 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:36:31.399558 master-0 kubenswrapper[7465]: I0320 08:36:31.399518 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:36:31.399634 master-0 kubenswrapper[7465]: I0320 08:36:31.399605 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:36:31.399831 master-0 kubenswrapper[7465]: I0320 08:36:31.399811 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:36:31.794156 master-0 kubenswrapper[7465]: I0320 08:36:31.793611 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"] Mar 20 08:36:31.848738 master-0 kubenswrapper[7465]: I0320 08:36:31.848604 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"] Mar 20 08:36:31.869587 master-0 kubenswrapper[7465]: W0320 08:36:31.868167 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebab0b0b_6cc3_490a_944b_0f8b4e2d5ae7.slice/crio-b8d362586f6fb451267fc09fb3ebee6bcd03ae4f33abd963a5893c4d677a7fc5 WatchSource:0}: Error finding container b8d362586f6fb451267fc09fb3ebee6bcd03ae4f33abd963a5893c4d677a7fc5: Status 404 returned error can't find the container with id b8d362586f6fb451267fc09fb3ebee6bcd03ae4f33abd963a5893c4d677a7fc5 Mar 20 08:36:31.884427 master-0 kubenswrapper[7465]: I0320 08:36:31.882655 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-r6dm8"] Mar 20 08:36:31.918423 master-0 kubenswrapper[7465]: 
I0320 08:36:31.917985 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"]
Mar 20 08:36:31.954389 master-0 kubenswrapper[7465]: I0320 08:36:31.949807 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:36:32.419290 master-0 kubenswrapper[7465]: I0320 08:36:32.418525 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" event={"ID":"ab175f7e-a5e8-4fda-98c9-6d052a221a83","Type":"ContainerStarted","Data":"c27171e28b2189b6b8d129f565f467a424f94c7f7677d037a89da262ce118c31"}
Mar 20 08:36:32.420131 master-0 kubenswrapper[7465]: I0320 08:36:32.419552 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:36:32.422341 master-0 kubenswrapper[7465]: I0320 08:36:32.422050 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" event={"ID":"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02","Type":"ContainerStarted","Data":"0262d134f60647f6e04ff950df203ce5bc3f1656b20c1e15f442731269c3be76"}
Mar 20 08:36:32.428587 master-0 kubenswrapper[7465]: I0320 08:36:32.427468 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" event={"ID":"3f471ecc-922c-4cb1-9bdd-fdb5da08c592","Type":"ContainerStarted","Data":"3092eb7a16220393b74c3ca8c6aedf7058f62f9313af91e571c5d2e31d050e35"}
Mar 20 08:36:32.430888 master-0 kubenswrapper[7465]: I0320 08:36:32.429907 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" event={"ID":"dab97c35-fe60-4134-8715-a7c6dd085fb3","Type":"ContainerStarted","Data":"37362c35c821b752ac43451f735aec18a90fc1833fb68a65ce307045393dec48"}
Mar 20 08:36:32.431778 master-0 kubenswrapper[7465]: I0320 08:36:32.431745 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" event={"ID":"b4291bfd-53d9-4c78-b7cb-d7eb46560528","Type":"ContainerStarted","Data":"3f9c2dbd6bdf8182b597345f8c7fea11c09d5e650fe0f55bf00a3c9f8887aa52"}
Mar 20 08:36:32.433449 master-0 kubenswrapper[7465]: I0320 08:36:32.433371 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" event={"ID":"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7","Type":"ContainerStarted","Data":"b8d362586f6fb451267fc09fb3ebee6bcd03ae4f33abd963a5893c4d677a7fc5"}
Mar 20 08:36:32.696308 master-0 kubenswrapper[7465]: I0320 08:36:32.696119 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"
Mar 20 08:36:32.696932 master-0 kubenswrapper[7465]: I0320 08:36:32.696897 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"
Mar 20 08:36:32.697597 master-0 kubenswrapper[7465]: E0320 08:36:32.697033 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 20 08:36:32.697804 master-0 kubenswrapper[7465]: E0320 08:36:32.697780 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca podName:39288b15-7841-4449-8814-4250f6ba5db0 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:40.697619679 +0000 UTC m=+26.340935169 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca") pod "controller-manager-f9d74d9cf-ks2l5" (UID: "39288b15-7841-4449-8814-4250f6ba5db0") : configmap "client-ca" not found
Mar 20 08:36:32.711690 master-0 kubenswrapper[7465]: I0320 08:36:32.711645 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"
Mar 20 08:36:34.452499 master-0 kubenswrapper[7465]: I0320 08:36:34.452419 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:36:34.728303 master-0 kubenswrapper[7465]: I0320 08:36:34.727588 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499"
Mar 20 08:36:34.728303 master-0 kubenswrapper[7465]: E0320 08:36:34.727787 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 20 08:36:34.728303 master-0 kubenswrapper[7465]: I0320 08:36:34.727868 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499"
Mar 20 08:36:34.728303 master-0 kubenswrapper[7465]: E0320 08:36:34.727895 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca podName:14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:42.727872507 +0000 UTC m=+28.371187997 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca") pod "route-controller-manager-5ff6bdb8cf-b6499" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6") : configmap "client-ca" not found
Mar 20 08:36:34.728303 master-0 kubenswrapper[7465]: E0320 08:36:34.727995 7465 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 20 08:36:34.728303 master-0 kubenswrapper[7465]: E0320 08:36:34.728054 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert podName:14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:42.728036042 +0000 UTC m=+28.371351532 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert") pod "route-controller-manager-5ff6bdb8cf-b6499" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6") : secret "serving-cert" not found
Mar 20 08:36:36.689059 master-0 kubenswrapper[7465]: I0320 08:36:36.688602 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-xddmb"]
Mar 20 08:36:36.690845 master-0 kubenswrapper[7465]: I0320 08:36:36.690067 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.692795 master-0 kubenswrapper[7465]: I0320 08:36:36.692121 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0"
Mar 20 08:36:36.692795 master-0 kubenswrapper[7465]: I0320 08:36:36.692410 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0"
Mar 20 08:36:36.692795 master-0 kubenswrapper[7465]: I0320 08:36:36.692611 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 20 08:36:36.696090 master-0 kubenswrapper[7465]: I0320 08:36:36.695598 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 08:36:36.696090 master-0 kubenswrapper[7465]: I0320 08:36:36.695961 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 08:36:36.697567 master-0 kubenswrapper[7465]: I0320 08:36:36.696455 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 08:36:36.697567 master-0 kubenswrapper[7465]: I0320 08:36:36.696622 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 08:36:36.697567 master-0 kubenswrapper[7465]: I0320 08:36:36.696782 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 20 08:36:36.697567 master-0 kubenswrapper[7465]: I0320 08:36:36.696909 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 08:36:36.707455 master-0 kubenswrapper[7465]: I0320 08:36:36.707072 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 20 08:36:36.707455 master-0 kubenswrapper[7465]: I0320 08:36:36.707399 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-xddmb"]
Mar 20 08:36:36.762012 master-0 kubenswrapper[7465]: I0320 08:36:36.761467 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-config\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.762012 master-0 kubenswrapper[7465]: I0320 08:36:36.761514 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-encryption-config\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.762012 master-0 kubenswrapper[7465]: I0320 08:36:36.761539 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-image-import-ca\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.762012 master-0 kubenswrapper[7465]: I0320 08:36:36.761575 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f5535dc-722d-4948-8c71-eac713e57af5-node-pullsecrets\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.762012 master-0 kubenswrapper[7465]: I0320 08:36:36.761591 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-serving-ca\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.762012 master-0 kubenswrapper[7465]: I0320 08:36:36.761616 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.762012 master-0 kubenswrapper[7465]: I0320 08:36:36.761632 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.762012 master-0 kubenswrapper[7465]: I0320 08:36:36.761656 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-trusted-ca-bundle\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.762012 master-0 kubenswrapper[7465]: I0320 08:36:36.761693 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.762012 master-0 kubenswrapper[7465]: I0320 08:36:36.761709 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f5535dc-722d-4948-8c71-eac713e57af5-audit-dir\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.762012 master-0 kubenswrapper[7465]: I0320 08:36:36.761739 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw58s\" (UniqueName: \"kubernetes.io/projected/0f5535dc-722d-4948-8c71-eac713e57af5-kube-api-access-pw58s\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.864356 master-0 kubenswrapper[7465]: I0320 08:36:36.863394 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-config\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.864356 master-0 kubenswrapper[7465]: I0320 08:36:36.863481 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-encryption-config\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.864356 master-0 kubenswrapper[7465]: I0320 08:36:36.863517 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-image-import-ca\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.864356 master-0 kubenswrapper[7465]: I0320 08:36:36.863582 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f5535dc-722d-4948-8c71-eac713e57af5-node-pullsecrets\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.864356 master-0 kubenswrapper[7465]: I0320 08:36:36.863605 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-serving-ca\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.864356 master-0 kubenswrapper[7465]: I0320 08:36:36.863645 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.864356 master-0 kubenswrapper[7465]: I0320 08:36:36.863670 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.864356 master-0 kubenswrapper[7465]: I0320 08:36:36.863708 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-trusted-ca-bundle\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.864356 master-0 kubenswrapper[7465]: I0320 08:36:36.863773 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.864356 master-0 kubenswrapper[7465]: I0320 08:36:36.863801 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f5535dc-722d-4948-8c71-eac713e57af5-audit-dir\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.864356 master-0 kubenswrapper[7465]: I0320 08:36:36.863841 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw58s\" (UniqueName: \"kubernetes.io/projected/0f5535dc-722d-4948-8c71-eac713e57af5-kube-api-access-pw58s\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.865695 master-0 kubenswrapper[7465]: I0320 08:36:36.864597 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f5535dc-722d-4948-8c71-eac713e57af5-node-pullsecrets\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.865695 master-0 kubenswrapper[7465]: I0320 08:36:36.865329 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-serving-ca\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.865808 master-0 kubenswrapper[7465]: I0320 08:36:36.865749 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-trusted-ca-bundle\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.866860 master-0 kubenswrapper[7465]: E0320 08:36:36.866814 7465 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 20 08:36:36.866933 master-0 kubenswrapper[7465]: E0320 08:36:36.866896 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit podName:0f5535dc-722d-4948-8c71-eac713e57af5 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:37.36687626 +0000 UTC m=+23.010191750 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit") pod "apiserver-c765cd67b-xddmb" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5") : configmap "audit-0" not found
Mar 20 08:36:36.866990 master-0 kubenswrapper[7465]: E0320 08:36:36.866957 7465 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 20 08:36:36.866990 master-0 kubenswrapper[7465]: E0320 08:36:36.866982 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client podName:0f5535dc-722d-4948-8c71-eac713e57af5 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:37.366975323 +0000 UTC m=+23.010290813 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client") pod "apiserver-c765cd67b-xddmb" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5") : secret "etcd-client" not found
Mar 20 08:36:36.867140 master-0 kubenswrapper[7465]: I0320 08:36:36.867122 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f5535dc-722d-4948-8c71-eac713e57af5-audit-dir\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.867814 master-0 kubenswrapper[7465]: I0320 08:36:36.867773 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-config\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.868200 master-0 kubenswrapper[7465]: E0320 08:36:36.868122 7465 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 20 08:36:36.868278 master-0 kubenswrapper[7465]: I0320 08:36:36.868232 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-image-import-ca\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.868278 master-0 kubenswrapper[7465]: E0320 08:36:36.868244 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert podName:0f5535dc-722d-4948-8c71-eac713e57af5 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:37.368213539 +0000 UTC m=+23.011529219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert") pod "apiserver-c765cd67b-xddmb" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5") : secret "serving-cert" not found
Mar 20 08:36:36.875836 master-0 kubenswrapper[7465]: I0320 08:36:36.875777 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-encryption-config\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:36.885363 master-0 kubenswrapper[7465]: I0320 08:36:36.883600 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw58s\" (UniqueName: \"kubernetes.io/projected/0f5535dc-722d-4948-8c71-eac713e57af5-kube-api-access-pw58s\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:37.370425 master-0 kubenswrapper[7465]: I0320 08:36:37.370350 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:37.370425 master-0 kubenswrapper[7465]: I0320 08:36:37.370409 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:37.370814 master-0 kubenswrapper[7465]: I0320 08:36:37.370464 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:37.370814 master-0 kubenswrapper[7465]: E0320 08:36:37.370682 7465 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 20 08:36:37.370814 master-0 kubenswrapper[7465]: E0320 08:36:37.370748 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client podName:0f5535dc-722d-4948-8c71-eac713e57af5 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:38.370728274 +0000 UTC m=+24.014043764 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client") pod "apiserver-c765cd67b-xddmb" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5") : secret "etcd-client" not found
Mar 20 08:36:37.372815 master-0 kubenswrapper[7465]: E0320 08:36:37.372141 7465 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 20 08:36:37.372815 master-0 kubenswrapper[7465]: E0320 08:36:37.372268 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert podName:0f5535dc-722d-4948-8c71-eac713e57af5 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:38.372197587 +0000 UTC m=+24.015513077 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert") pod "apiserver-c765cd67b-xddmb" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5") : secret "serving-cert" not found
Mar 20 08:36:37.372815 master-0 kubenswrapper[7465]: E0320 08:36:37.372329 7465 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 20 08:36:37.372815 master-0 kubenswrapper[7465]: E0320 08:36:37.372435 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit podName:0f5535dc-722d-4948-8c71-eac713e57af5 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:38.372412233 +0000 UTC m=+24.015727723 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit") pod "apiserver-c765cd67b-xddmb" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5") : configmap "audit-0" not found
Mar 20 08:36:38.399436 master-0 kubenswrapper[7465]: I0320 08:36:38.399374 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:38.399436 master-0 kubenswrapper[7465]: I0320 08:36:38.399441 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:38.400466 master-0 kubenswrapper[7465]: E0320 08:36:38.399708 7465 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 20 08:36:38.400466 master-0 kubenswrapper[7465]: E0320 08:36:38.399830 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert podName:0f5535dc-722d-4948-8c71-eac713e57af5 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:40.399800514 +0000 UTC m=+26.043116004 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert") pod "apiserver-c765cd67b-xddmb" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5") : secret "serving-cert" not found
Mar 20 08:36:38.400466 master-0 kubenswrapper[7465]: E0320 08:36:38.400102 7465 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 20 08:36:38.400466 master-0 kubenswrapper[7465]: E0320 08:36:38.400304 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit podName:0f5535dc-722d-4948-8c71-eac713e57af5 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:40.400255778 +0000 UTC m=+26.043571268 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit") pod "apiserver-c765cd67b-xddmb" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5") : configmap "audit-0" not found
Mar 20 08:36:38.400466 master-0 kubenswrapper[7465]: I0320 08:36:38.400348 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:38.401489 master-0 kubenswrapper[7465]: E0320 08:36:38.401446 7465 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 20 08:36:38.401625 master-0 kubenswrapper[7465]: E0320 08:36:38.401603 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client podName:0f5535dc-722d-4948-8c71-eac713e57af5 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:40.401589586 +0000 UTC m=+26.044905076 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client") pod "apiserver-c765cd67b-xddmb" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5") : secret "etcd-client" not found
Mar 20 08:36:40.434518 master-0 kubenswrapper[7465]: I0320 08:36:40.432060 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:40.434518 master-0 kubenswrapper[7465]: I0320 08:36:40.432497 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:40.434518 master-0 kubenswrapper[7465]: I0320 08:36:40.432577 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:40.434518 master-0 kubenswrapper[7465]: E0320 08:36:40.432952 7465 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 20 08:36:40.434518 master-0 kubenswrapper[7465]: E0320 08:36:40.433039 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit podName:0f5535dc-722d-4948-8c71-eac713e57af5 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:44.43301756 +0000 UTC m=+30.076333050 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit") pod "apiserver-c765cd67b-xddmb" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5") : configmap "audit-0" not found
Mar 20 08:36:40.436586 master-0 kubenswrapper[7465]: I0320 08:36:40.435981 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:40.436586 master-0 kubenswrapper[7465]: I0320 08:36:40.436564 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client\") pod \"apiserver-c765cd67b-xddmb\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:40.695129 master-0 kubenswrapper[7465]: I0320 08:36:40.694688 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-xddmb"]
Mar 20 08:36:40.695437 master-0 kubenswrapper[7465]: E0320 08:36:40.695388 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-c765cd67b-xddmb" podUID="0f5535dc-722d-4948-8c71-eac713e57af5"
Mar 20 08:36:40.739743 master-0 kubenswrapper[7465]: I0320 08:36:40.739657 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca\") pod \"controller-manager-f9d74d9cf-ks2l5\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"
Mar 20 08:36:40.740015 master-0 kubenswrapper[7465]: E0320 08:36:40.739936 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 20 08:36:40.740054 master-0 kubenswrapper[7465]: E0320 08:36:40.740013 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca podName:39288b15-7841-4449-8814-4250f6ba5db0 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:56.739991508 +0000 UTC m=+42.383306998 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca") pod "controller-manager-f9d74d9cf-ks2l5" (UID: "39288b15-7841-4449-8814-4250f6ba5db0") : configmap "client-ca" not found
Mar 20 08:36:41.019877 master-0 kubenswrapper[7465]: I0320 08:36:41.016930 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"]
Mar 20 08:36:41.019877 master-0 kubenswrapper[7465]: E0320 08:36:41.018819 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" podUID="39288b15-7841-4449-8814-4250f6ba5db0"
Mar 20 08:36:41.111527 master-0 kubenswrapper[7465]: I0320 08:36:41.111472 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499"]
Mar 20 08:36:41.111861 master-0 kubenswrapper[7465]: E0320 08:36:41.111822 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" podUID="14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6"
Mar 20 08:36:41.494101 master-0 kubenswrapper[7465]: I0320 08:36:41.493524 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" event={"ID":"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02","Type":"ContainerStarted","Data":"0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79"}
Mar 20 08:36:41.507228 master-0 kubenswrapper[7465]: I0320 08:36:41.507153 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" event={"ID":"dab97c35-fe60-4134-8715-a7c6dd085fb3","Type":"ContainerStarted","Data":"120e4b0ea2b4cfd16661b5116aa5a9d09dd784f31b0c1dd992cc3cf6ddc463ef"}
Mar 20 08:36:41.519949 master-0 kubenswrapper[7465]: I0320 08:36:41.519877 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" event={"ID":"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7","Type":"ContainerStarted","Data":"68df10b3a72fc3b0c353b5fc70a166a2be68d78636e2ecc68d4b89aecbe60781"}
Mar 20 08:36:41.535359 master-0 kubenswrapper[7465]: I0320 08:36:41.535309 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" event={"ID":"b4291bfd-53d9-4c78-b7cb-d7eb46560528","Type":"ContainerStarted","Data":"7d9ef09c05c17f91e19a7e2b31b502d477af56141dfbd1c2fd48a2cadd1f3194"}
Mar 20 08:36:41.552156 master-0 kubenswrapper[7465]: I0320 08:36:41.550215 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf"]
Mar 20 08:36:41.552156 master-0 kubenswrapper[7465]: I0320 08:36:41.551345 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf"
Mar 20 08:36:41.552780 master-0 kubenswrapper[7465]: I0320 08:36:41.552750 7465 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:41.553000 master-0 kubenswrapper[7465]: I0320 08:36:41.552970 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" event={"ID":"a57854ac-809a-4745-aaa1-774f0a08a560","Type":"ContainerStarted","Data":"4d0d97d44af51af5156c718231836b8527e98e8ee5a7d3079503faf5682e5428"} Mar 20 08:36:41.553690 master-0 kubenswrapper[7465]: I0320 08:36:41.553675 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:41.553820 master-0 kubenswrapper[7465]: I0320 08:36:41.553805 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-c765cd67b-xddmb" Mar 20 08:36:41.559213 master-0 kubenswrapper[7465]: I0320 08:36:41.557770 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 20 08:36:41.559213 master-0 kubenswrapper[7465]: I0320 08:36:41.558017 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 20 08:36:41.565005 master-0 kubenswrapper[7465]: I0320 08:36:41.564966 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf"] Mar 20 08:36:41.597363 master-0 kubenswrapper[7465]: I0320 08:36:41.572732 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 20 08:36:41.635970 master-0 kubenswrapper[7465]: I0320 08:36:41.635915 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:41.655197 master-0 kubenswrapper[7465]: I0320 08:36:41.655080 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7e451189-850e-4d19-a40c-40f642d08511-cache\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.657359 master-0 kubenswrapper[7465]: I0320 08:36:41.657157 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5" Mar 20 08:36:41.657529 master-0 kubenswrapper[7465]: I0320 08:36:41.657510 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/7e451189-850e-4d19-a40c-40f642d08511-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.657676 master-0 kubenswrapper[7465]: I0320 08:36:41.657647 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/7e451189-850e-4d19-a40c-40f642d08511-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.657931 master-0 kubenswrapper[7465]: I0320 08:36:41.657916 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snmpq\" (UniqueName: 
\"kubernetes.io/projected/7e451189-850e-4d19-a40c-40f642d08511-kube-api-access-snmpq\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.658049 master-0 kubenswrapper[7465]: I0320 08:36:41.658036 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/7e451189-850e-4d19-a40c-40f642d08511-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.712767 master-0 kubenswrapper[7465]: I0320 08:36:41.706265 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-c765cd67b-xddmb" Mar 20 08:36:41.729840 master-0 kubenswrapper[7465]: I0320 08:36:41.726809 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"] Mar 20 08:36:41.729840 master-0 kubenswrapper[7465]: I0320 08:36:41.727628 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:36:41.730822 master-0 kubenswrapper[7465]: I0320 08:36:41.730796 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 20 08:36:41.732813 master-0 kubenswrapper[7465]: I0320 08:36:41.732766 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 20 08:36:41.732974 master-0 kubenswrapper[7465]: I0320 08:36:41.732925 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 20 08:36:41.747373 master-0 kubenswrapper[7465]: I0320 08:36:41.741830 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 20 08:36:41.751500 master-0 kubenswrapper[7465]: I0320 08:36:41.749942 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"] Mar 20 08:36:41.772590 master-0 kubenswrapper[7465]: I0320 08:36:41.770876 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmsks\" (UniqueName: \"kubernetes.io/projected/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-kube-api-access-wmsks\") pod \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " Mar 20 08:36:41.772590 master-0 kubenswrapper[7465]: I0320 08:36:41.770936 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert\") pod \"39288b15-7841-4449-8814-4250f6ba5db0\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " Mar 20 08:36:41.772590 master-0 kubenswrapper[7465]: I0320 08:36:41.770982 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-proxy-ca-bundles\") pod \"39288b15-7841-4449-8814-4250f6ba5db0\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " Mar 20 08:36:41.772590 master-0 kubenswrapper[7465]: I0320 08:36:41.771019 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9plc\" (UniqueName: \"kubernetes.io/projected/39288b15-7841-4449-8814-4250f6ba5db0-kube-api-access-z9plc\") pod \"39288b15-7841-4449-8814-4250f6ba5db0\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " Mar 20 08:36:41.772590 master-0 kubenswrapper[7465]: I0320 08:36:41.771038 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-config\") pod \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " Mar 20 08:36:41.772590 master-0 kubenswrapper[7465]: I0320 08:36:41.771081 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-config\") pod \"39288b15-7841-4449-8814-4250f6ba5db0\" (UID: \"39288b15-7841-4449-8814-4250f6ba5db0\") " Mar 20 08:36:41.772590 master-0 kubenswrapper[7465]: I0320 08:36:41.771290 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/7e451189-850e-4d19-a40c-40f642d08511-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.772590 master-0 kubenswrapper[7465]: I0320 08:36:41.771314 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snmpq\" (UniqueName: 
\"kubernetes.io/projected/7e451189-850e-4d19-a40c-40f642d08511-kube-api-access-snmpq\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.772590 master-0 kubenswrapper[7465]: I0320 08:36:41.772038 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "39288b15-7841-4449-8814-4250f6ba5db0" (UID: "39288b15-7841-4449-8814-4250f6ba5db0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:36:41.772590 master-0 kubenswrapper[7465]: I0320 08:36:41.772269 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7e451189-850e-4d19-a40c-40f642d08511-cache\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.784541 master-0 kubenswrapper[7465]: I0320 08:36:41.772772 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-config" (OuterVolumeSpecName: "config") pod "39288b15-7841-4449-8814-4250f6ba5db0" (UID: "39288b15-7841-4449-8814-4250f6ba5db0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:36:41.784541 master-0 kubenswrapper[7465]: I0320 08:36:41.772966 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/7e451189-850e-4d19-a40c-40f642d08511-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.784541 master-0 kubenswrapper[7465]: I0320 08:36:41.773054 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/7e451189-850e-4d19-a40c-40f642d08511-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.784541 master-0 kubenswrapper[7465]: I0320 08:36:41.773139 7465 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:41.784541 master-0 kubenswrapper[7465]: I0320 08:36:41.773156 7465 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:41.784541 master-0 kubenswrapper[7465]: I0320 08:36:41.775477 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/7e451189-850e-4d19-a40c-40f642d08511-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.784541 master-0 kubenswrapper[7465]: I0320 08:36:41.775857 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/7e451189-850e-4d19-a40c-40f642d08511-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.784541 master-0 kubenswrapper[7465]: I0320 08:36:41.776019 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7e451189-850e-4d19-a40c-40f642d08511-cache\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.784541 master-0 kubenswrapper[7465]: I0320 08:36:41.779612 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-config" (OuterVolumeSpecName: "config") pod "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:36:41.784541 master-0 kubenswrapper[7465]: I0320 08:36:41.784122 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "39288b15-7841-4449-8814-4250f6ba5db0" (UID: "39288b15-7841-4449-8814-4250f6ba5db0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:36:41.791651 master-0 kubenswrapper[7465]: I0320 08:36:41.787084 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/7e451189-850e-4d19-a40c-40f642d08511-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.797205 master-0 kubenswrapper[7465]: I0320 08:36:41.795419 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39288b15-7841-4449-8814-4250f6ba5db0-kube-api-access-z9plc" (OuterVolumeSpecName: "kube-api-access-z9plc") pod "39288b15-7841-4449-8814-4250f6ba5db0" (UID: "39288b15-7841-4449-8814-4250f6ba5db0"). InnerVolumeSpecName "kube-api-access-z9plc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:36:41.815553 master-0 kubenswrapper[7465]: I0320 08:36:41.815475 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-kube-api-access-wmsks" (OuterVolumeSpecName: "kube-api-access-wmsks") pod "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6"). InnerVolumeSpecName "kube-api-access-wmsks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:36:41.832016 master-0 kubenswrapper[7465]: I0320 08:36:41.826965 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snmpq\" (UniqueName: \"kubernetes.io/projected/7e451189-850e-4d19-a40c-40f642d08511-kube-api-access-snmpq\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:41.851179 master-0 kubenswrapper[7465]: I0320 08:36:41.842038 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-hb77b"] Mar 20 08:36:41.851179 master-0 kubenswrapper[7465]: I0320 08:36:41.842709 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:36:41.874526 master-0 kubenswrapper[7465]: I0320 08:36:41.874470 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client\") pod \"0f5535dc-722d-4948-8c71-eac713e57af5\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " Mar 20 08:36:41.874526 master-0 kubenswrapper[7465]: I0320 08:36:41.874528 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f5535dc-722d-4948-8c71-eac713e57af5-audit-dir\") pod \"0f5535dc-722d-4948-8c71-eac713e57af5\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " Mar 20 08:36:41.874867 master-0 kubenswrapper[7465]: I0320 08:36:41.874572 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-image-import-ca\") pod \"0f5535dc-722d-4948-8c71-eac713e57af5\" (UID: 
\"0f5535dc-722d-4948-8c71-eac713e57af5\") " Mar 20 08:36:41.874867 master-0 kubenswrapper[7465]: I0320 08:36:41.874617 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw58s\" (UniqueName: \"kubernetes.io/projected/0f5535dc-722d-4948-8c71-eac713e57af5-kube-api-access-pw58s\") pod \"0f5535dc-722d-4948-8c71-eac713e57af5\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " Mar 20 08:36:41.874867 master-0 kubenswrapper[7465]: I0320 08:36:41.874639 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-config\") pod \"0f5535dc-722d-4948-8c71-eac713e57af5\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " Mar 20 08:36:41.874867 master-0 kubenswrapper[7465]: I0320 08:36:41.874656 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-encryption-config\") pod \"0f5535dc-722d-4948-8c71-eac713e57af5\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " Mar 20 08:36:41.874867 master-0 kubenswrapper[7465]: I0320 08:36:41.874684 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-trusted-ca-bundle\") pod \"0f5535dc-722d-4948-8c71-eac713e57af5\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " Mar 20 08:36:41.874867 master-0 kubenswrapper[7465]: I0320 08:36:41.874702 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-serving-ca\") pod \"0f5535dc-722d-4948-8c71-eac713e57af5\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " Mar 20 08:36:41.874867 master-0 kubenswrapper[7465]: I0320 08:36:41.874734 7465 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f5535dc-722d-4948-8c71-eac713e57af5-node-pullsecrets\") pod \"0f5535dc-722d-4948-8c71-eac713e57af5\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " Mar 20 08:36:41.874867 master-0 kubenswrapper[7465]: I0320 08:36:41.874821 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert\") pod \"0f5535dc-722d-4948-8c71-eac713e57af5\" (UID: \"0f5535dc-722d-4948-8c71-eac713e57af5\") " Mar 20 08:36:41.875117 master-0 kubenswrapper[7465]: I0320 08:36:41.875067 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:36:41.875117 master-0 kubenswrapper[7465]: I0320 08:36:41.875113 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:36:41.875296 master-0 kubenswrapper[7465]: I0320 08:36:41.875153 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qstvb\" (UniqueName: \"kubernetes.io/projected/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-kube-api-access-qstvb\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " 
pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:36:41.875296 master-0 kubenswrapper[7465]: I0320 08:36:41.875289 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-cache\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:36:41.875362 master-0 kubenswrapper[7465]: I0320 08:36:41.875319 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:36:41.875362 master-0 kubenswrapper[7465]: I0320 08:36:41.875359 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:36:41.875442 master-0 kubenswrapper[7465]: I0320 08:36:41.875422 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9plc\" (UniqueName: \"kubernetes.io/projected/39288b15-7841-4449-8814-4250f6ba5db0-kube-api-access-z9plc\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:41.875442 master-0 kubenswrapper[7465]: I0320 08:36:41.875438 7465 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-config\") on node \"master-0\" DevicePath \"\"" Mar 20 
08:36:41.875513 master-0 kubenswrapper[7465]: I0320 08:36:41.875450 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmsks\" (UniqueName: \"kubernetes.io/projected/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-kube-api-access-wmsks\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:41.875513 master-0 kubenswrapper[7465]: I0320 08:36:41.875461 7465 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39288b15-7841-4449-8814-4250f6ba5db0-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:41.877063 master-0 kubenswrapper[7465]: I0320 08:36:41.876749 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "0f5535dc-722d-4948-8c71-eac713e57af5" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:36:41.877063 master-0 kubenswrapper[7465]: I0320 08:36:41.876808 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f5535dc-722d-4948-8c71-eac713e57af5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "0f5535dc-722d-4948-8c71-eac713e57af5" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:36:41.877063 master-0 kubenswrapper[7465]: I0320 08:36:41.877018 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "0f5535dc-722d-4948-8c71-eac713e57af5" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:36:41.878630 master-0 kubenswrapper[7465]: I0320 08:36:41.878577 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f5535dc-722d-4948-8c71-eac713e57af5-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "0f5535dc-722d-4948-8c71-eac713e57af5" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:36:41.879149 master-0 kubenswrapper[7465]: I0320 08:36:41.879118 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "0f5535dc-722d-4948-8c71-eac713e57af5" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:36:41.880981 master-0 kubenswrapper[7465]: I0320 08:36:41.880938 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-config" (OuterVolumeSpecName: "config") pod "0f5535dc-722d-4948-8c71-eac713e57af5" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:36:41.881494 master-0 kubenswrapper[7465]: I0320 08:36:41.881465 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "0f5535dc-722d-4948-8c71-eac713e57af5" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:36:41.881819 master-0 kubenswrapper[7465]: I0320 08:36:41.881769 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f5535dc-722d-4948-8c71-eac713e57af5-kube-api-access-pw58s" (OuterVolumeSpecName: "kube-api-access-pw58s") pod "0f5535dc-722d-4948-8c71-eac713e57af5" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5"). InnerVolumeSpecName "kube-api-access-pw58s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:36:41.891210 master-0 kubenswrapper[7465]: I0320 08:36:41.887039 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0f5535dc-722d-4948-8c71-eac713e57af5" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:36:41.891210 master-0 kubenswrapper[7465]: I0320 08:36:41.890995 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0f5535dc-722d-4948-8c71-eac713e57af5" (UID: "0f5535dc-722d-4948-8c71-eac713e57af5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:36:41.922616 master-0 kubenswrapper[7465]: I0320 08:36:41.922530 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf"
Mar 20 08:36:41.977603 master-0 kubenswrapper[7465]: I0320 08:36:41.976884 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-var-lib-kubelet\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.977603 master-0 kubenswrapper[7465]: I0320 08:36:41.977597 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-run\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.977740 master-0 kubenswrapper[7465]: I0320 08:36:41.977654 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:36:41.977740 master-0 kubenswrapper[7465]: I0320 08:36:41.977688 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:36:41.977740 master-0 kubenswrapper[7465]: I0320 08:36:41.977708 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysctl-conf\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.977740 master-0 kubenswrapper[7465]: I0320 08:36:41.977724 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-sys\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.977740 master-0 kubenswrapper[7465]: I0320 08:36:41.977741 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysctl-d\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.978229 master-0 kubenswrapper[7465]: I0320 08:36:41.977762 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-kubernetes\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.978229 master-0 kubenswrapper[7465]: I0320 08:36:41.977779 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-host\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.978229 master-0 kubenswrapper[7465]: I0320 08:36:41.977807 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-lib-modules\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.978229 master-0 kubenswrapper[7465]: I0320 08:36:41.977831 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qstvb\" (UniqueName: \"kubernetes.io/projected/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-kube-api-access-qstvb\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:36:41.978229 master-0 kubenswrapper[7465]: I0320 08:36:41.977860 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-tuned\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.978229 master-0 kubenswrapper[7465]: I0320 08:36:41.977876 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a18b9230-de78-41b8-a61e-361b8bb1fbb3-tmp\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.978229 master-0 kubenswrapper[7465]: I0320 08:36:41.977896 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-cache\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:36:41.978229 master-0 kubenswrapper[7465]: I0320 08:36:41.977933 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8crkc\" (UniqueName: \"kubernetes.io/projected/a18b9230-de78-41b8-a61e-361b8bb1fbb3-kube-api-access-8crkc\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.978229 master-0 kubenswrapper[7465]: I0320 08:36:41.978082 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:36:41.978987 master-0 kubenswrapper[7465]: I0320 08:36:41.978647 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:36:41.978987 master-0 kubenswrapper[7465]: I0320 08:36:41.978679 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-modprobe-d\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.978987 master-0 kubenswrapper[7465]: I0320 08:36:41.978713 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-systemd\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.978987 master-0 kubenswrapper[7465]: I0320 08:36:41.978742 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:36:41.979224 master-0 kubenswrapper[7465]: I0320 08:36:41.979199 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysconfig\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:41.979268 master-0 kubenswrapper[7465]: I0320 08:36:41.979261 7465 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f5535dc-722d-4948-8c71-eac713e57af5-node-pullsecrets\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:41.979311 master-0 kubenswrapper[7465]: I0320 08:36:41.979278 7465 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:41.979311 master-0 kubenswrapper[7465]: I0320 08:36:41.979290 7465 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-client\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:41.979311 master-0 kubenswrapper[7465]: I0320 08:36:41.979300 7465 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f5535dc-722d-4948-8c71-eac713e57af5-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:41.979311 master-0 kubenswrapper[7465]: I0320 08:36:41.979311 7465 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-image-import-ca\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:41.979429 master-0 kubenswrapper[7465]: I0320 08:36:41.979327 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw58s\" (UniqueName: \"kubernetes.io/projected/0f5535dc-722d-4948-8c71-eac713e57af5-kube-api-access-pw58s\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:41.979429 master-0 kubenswrapper[7465]: I0320 08:36:41.979339 7465 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-config\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:41.979429 master-0 kubenswrapper[7465]: I0320 08:36:41.979348 7465 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f5535dc-722d-4948-8c71-eac713e57af5-encryption-config\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:41.979429 master-0 kubenswrapper[7465]: I0320 08:36:41.979359 7465 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:41.979429 master-0 kubenswrapper[7465]: I0320 08:36:41.979369 7465 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-etcd-serving-ca\") on node \"master-0\" DevicePath \"\""
Mar 20 08:36:41.979576 master-0 kubenswrapper[7465]: I0320 08:36:41.979487 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:36:41.979576 master-0 kubenswrapper[7465]: I0320 08:36:41.979490 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-cache\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:36:41.982135 master-0 kubenswrapper[7465]: I0320 08:36:41.982103 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:36:41.983445 master-0 kubenswrapper[7465]: I0320 08:36:41.983340 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:36:42.000811 master-0 kubenswrapper[7465]: I0320 08:36:41.999237 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qstvb\" (UniqueName: \"kubernetes.io/projected/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-kube-api-access-qstvb\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:36:42.074281 master-0 kubenswrapper[7465]: I0320 08:36:42.071859 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:36:42.081153 master-0 kubenswrapper[7465]: I0320 08:36:42.081093 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysconfig\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081325 master-0 kubenswrapper[7465]: I0320 08:36:42.081196 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-var-lib-kubelet\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081325 master-0 kubenswrapper[7465]: I0320 08:36:42.081217 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-run\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081325 master-0 kubenswrapper[7465]: I0320 08:36:42.081284 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysctl-d\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081325 master-0 kubenswrapper[7465]: I0320 08:36:42.081304 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysctl-conf\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081325 master-0 kubenswrapper[7465]: I0320 08:36:42.081319 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-sys\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081503 master-0 kubenswrapper[7465]: I0320 08:36:42.081334 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-kubernetes\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081503 master-0 kubenswrapper[7465]: I0320 08:36:42.081349 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-host\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081503 master-0 kubenswrapper[7465]: I0320 08:36:42.081372 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-lib-modules\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081503 master-0 kubenswrapper[7465]: I0320 08:36:42.081390 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a18b9230-de78-41b8-a61e-361b8bb1fbb3-tmp\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081503 master-0 kubenswrapper[7465]: I0320 08:36:42.081405 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-tuned\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081503 master-0 kubenswrapper[7465]: I0320 08:36:42.081429 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8crkc\" (UniqueName: \"kubernetes.io/projected/a18b9230-de78-41b8-a61e-361b8bb1fbb3-kube-api-access-8crkc\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081503 master-0 kubenswrapper[7465]: I0320 08:36:42.081462 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-modprobe-d\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081503 master-0 kubenswrapper[7465]: I0320 08:36:42.081479 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-systemd\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081789 master-0 kubenswrapper[7465]: I0320 08:36:42.081569 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-systemd\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081789 master-0 kubenswrapper[7465]: I0320 08:36:42.081613 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysconfig\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081789 master-0 kubenswrapper[7465]: I0320 08:36:42.081643 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-var-lib-kubelet\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081789 master-0 kubenswrapper[7465]: I0320 08:36:42.081787 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-run\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.081903 master-0 kubenswrapper[7465]: I0320 08:36:42.081819 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysctl-d\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.083220 master-0 kubenswrapper[7465]: I0320 08:36:42.081958 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysctl-conf\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.083220 master-0 kubenswrapper[7465]: I0320 08:36:42.082122 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-kubernetes\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.083220 master-0 kubenswrapper[7465]: I0320 08:36:42.082155 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-lib-modules\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.083220 master-0 kubenswrapper[7465]: I0320 08:36:42.082217 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-host\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.083220 master-0 kubenswrapper[7465]: I0320 08:36:42.082172 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-sys\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.083220 master-0 kubenswrapper[7465]: I0320 08:36:42.082316 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-modprobe-d\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.085825 master-0 kubenswrapper[7465]: I0320 08:36:42.085793 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a18b9230-de78-41b8-a61e-361b8bb1fbb3-tmp\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.087050 master-0 kubenswrapper[7465]: I0320 08:36:42.086917 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-tuned\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.105519 master-0 kubenswrapper[7465]: I0320 08:36:42.102014 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8crkc\" (UniqueName: \"kubernetes.io/projected/a18b9230-de78-41b8-a61e-361b8bb1fbb3-kube-api-access-8crkc\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.182770 master-0 kubenswrapper[7465]: I0320 08:36:42.182716 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf"]
Mar 20 08:36:42.185201 master-0 kubenswrapper[7465]: I0320 08:36:42.185161 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:36:42.213359 master-0 kubenswrapper[7465]: W0320 08:36:42.210631 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e451189_850e_4d19_a40c_40f642d08511.slice/crio-fd67ef0d5263721af2c9563f40007bfa8118e9b8e56c96c99f48f239bd0b7044 WatchSource:0}: Error finding container fd67ef0d5263721af2c9563f40007bfa8118e9b8e56c96c99f48f239bd0b7044: Status 404 returned error can't find the container with id fd67ef0d5263721af2c9563f40007bfa8118e9b8e56c96c99f48f239bd0b7044
Mar 20 08:36:42.327232 master-0 kubenswrapper[7465]: I0320 08:36:42.322097 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"]
Mar 20 08:36:42.459987 master-0 kubenswrapper[7465]: I0320 08:36:42.459942 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-v5h69"]
Mar 20 08:36:42.460641 master-0 kubenswrapper[7465]: I0320 08:36:42.460613 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v5h69"
Mar 20 08:36:42.468160 master-0 kubenswrapper[7465]: I0320 08:36:42.463413 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 20 08:36:42.468160 master-0 kubenswrapper[7465]: I0320 08:36:42.463751 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 20 08:36:42.468160 master-0 kubenswrapper[7465]: I0320 08:36:42.463918 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 08:36:42.468160 master-0 kubenswrapper[7465]: I0320 08:36:42.464147 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 20 08:36:42.474736 master-0 kubenswrapper[7465]: I0320 08:36:42.471889 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v5h69"]
Mar 20 08:36:42.591901 master-0 kubenswrapper[7465]: I0320 08:36:42.591829 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" event={"ID":"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02","Type":"ContainerStarted","Data":"afe5be40b772a3679b289a32d738409fbbd0267f6b546d1fa3b047d53cf456bb"}
Mar 20 08:36:42.595175 master-0 kubenswrapper[7465]: I0320 08:36:42.595140 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2qf7\" (UniqueName: \"kubernetes.io/projected/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-kube-api-access-g2qf7\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69"
Mar 20 08:36:42.595260 master-0 kubenswrapper[7465]: I0320 08:36:42.595236 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-config-volume\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69"
Mar 20 08:36:42.598869 master-0 kubenswrapper[7465]: I0320 08:36:42.598824 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-metrics-tls\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69"
Mar 20 08:36:42.599235 master-0 kubenswrapper[7465]: I0320 08:36:42.599173 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" event={"ID":"4fea9b05-222e-4b58-95c8-735fc1cf3a8b","Type":"ContainerStarted","Data":"6cc2d27a03b36826decc5cc4343612194df412f00fd1e83d62bd9da95cdaba5c"}
Mar 20 08:36:42.623399 master-0 kubenswrapper[7465]: I0320 08:36:42.623342 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" event={"ID":"3f471ecc-922c-4cb1-9bdd-fdb5da08c592","Type":"ContainerStarted","Data":"b7f59c60792cc7ff5e71be447612403a3bb4cc5643976d4b99c5a00201eb0b72"}
Mar 20 08:36:42.623536 master-0 kubenswrapper[7465]: I0320 08:36:42.623409 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" event={"ID":"3f471ecc-922c-4cb1-9bdd-fdb5da08c592","Type":"ContainerStarted","Data":"cebc013f8d48a631981ef28b781b77dd8f3f5a8d8bf87e9f117a4185f222c73e"}
Mar 20 08:36:42.636712 master-0 kubenswrapper[7465]: I0320 08:36:42.636639 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hb77b" event={"ID":"a18b9230-de78-41b8-a61e-361b8bb1fbb3","Type":"ContainerStarted","Data":"5a144517ab4145de856f7fc1dbaa9248dc8d50b14986e3a4c4c4e8525dca2fdd"}
Mar 20 08:36:42.636930 master-0 kubenswrapper[7465]: I0320 08:36:42.636720 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hb77b" event={"ID":"a18b9230-de78-41b8-a61e-361b8bb1fbb3","Type":"ContainerStarted","Data":"1e3599d0789edeb8eec1a5568ce03b8378093e7082af3b48ff3c7ee7e6273252"}
Mar 20 08:36:42.647645 master-0 kubenswrapper[7465]: I0320 08:36:42.644790 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"
Mar 20 08:36:42.647645 master-0 kubenswrapper[7465]: I0320 08:36:42.645115 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-c765cd67b-xddmb"
Mar 20 08:36:42.647645 master-0 kubenswrapper[7465]: I0320 08:36:42.646217 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" event={"ID":"7e451189-850e-4d19-a40c-40f642d08511","Type":"ContainerStarted","Data":"fd67ef0d5263721af2c9563f40007bfa8118e9b8e56c96c99f48f239bd0b7044"}
Mar 20 08:36:42.647645 master-0 kubenswrapper[7465]: I0320 08:36:42.646280 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499"
Mar 20 08:36:42.704070 master-0 kubenswrapper[7465]: I0320 08:36:42.704027 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-metrics-tls\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69"
Mar 20 08:36:42.710832 master-0 kubenswrapper[7465]: I0320 08:36:42.707130 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2qf7\" (UniqueName: \"kubernetes.io/projected/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-kube-api-access-g2qf7\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69"
Mar 20 08:36:42.710832 master-0 kubenswrapper[7465]: I0320 08:36:42.707487 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-config-volume\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69"
Mar 20 08:36:42.710832 master-0 kubenswrapper[7465]: I0320 08:36:42.709139 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-config-volume\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69"
Mar 20 08:36:42.727014 master-0 kubenswrapper[7465]: I0320 08:36:42.726898 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-metrics-tls\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69"
Mar 20 08:36:42.737760 master-0 kubenswrapper[7465]: I0320 08:36:42.737483 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2qf7\" (UniqueName: \"kubernetes.io/projected/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-kube-api-access-g2qf7\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69"
Mar 20 08:36:42.754550 master-0 kubenswrapper[7465]: I0320 08:36:42.754330 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hb77b" podStartSLOduration=1.754301041 podStartE2EDuration="1.754301041s" podCreationTimestamp="2026-03-20 08:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:36:42.676086428 +0000 UTC m=+28.319401918" watchObservedRunningTime="2026-03-20 08:36:42.754301041 +0000 UTC m=+28.397616531"
Mar 20 08:36:42.772178 master-0 kubenswrapper[7465]: I0320 08:36:42.771541 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"]
Mar 20 08:36:42.772178 master-0 kubenswrapper[7465]: I0320 08:36:42.771624 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq"]
Mar 20 08:36:42.773978 master-0 kubenswrapper[7465]: I0320 08:36:42.772632 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq"
Mar 20 08:36:42.781118 master-0 kubenswrapper[7465]: I0320 08:36:42.781077 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 08:36:42.781334 master-0 kubenswrapper[7465]: I0320 08:36:42.781304 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 08:36:42.781546 master-0 kubenswrapper[7465]: I0320 08:36:42.781520 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 08:36:42.781625 master-0 kubenswrapper[7465]: I0320 08:36:42.781345 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 08:36:42.781822 master-0 kubenswrapper[7465]: I0320 08:36:42.781391 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 08:36:42.783171 master-0 kubenswrapper[7465]: I0320 08:36:42.782400 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f9d74d9cf-ks2l5"]
Mar 20 08:36:42.792717 master-0 kubenswrapper[7465]: I0320 08:36:42.792628 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq"]
Mar 20 08:36:42.795852 master-0 kubenswrapper[7465]: I0320 08:36:42.795823 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 08:36:42.820708 master-0 kubenswrapper[7465]: I0320 08:36:42.820640 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-xddmb"]
Mar 20 08:36:42.839775 master-0 kubenswrapper[7465]: I0320 08:36:42.830017 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-serving-cert\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:42.839775 master-0 kubenswrapper[7465]: I0320 08:36:42.830128 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:42.839775 master-0 kubenswrapper[7465]: I0320 08:36:42.830164 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:42.839775 master-0 kubenswrapper[7465]: I0320 08:36:42.830220 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca\") pod \"route-controller-manager-5ff6bdb8cf-b6499\" (UID: \"14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6\") " pod="openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499" Mar 20 08:36:42.839775 master-0 kubenswrapper[7465]: I0320 08:36:42.830280 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-config\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " 
pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:42.839775 master-0 kubenswrapper[7465]: I0320 08:36:42.830299 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn9xv\" (UniqueName: \"kubernetes.io/projected/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-kube-api-access-cn9xv\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:42.839775 master-0 kubenswrapper[7465]: I0320 08:36:42.830401 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-proxy-ca-bundles\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:42.839775 master-0 kubenswrapper[7465]: I0320 08:36:42.830465 7465 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/39288b15-7841-4449-8814-4250f6ba5db0-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:42.839775 master-0 kubenswrapper[7465]: E0320 08:36:42.832967 7465 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: object "openshift-route-controller-manager"/"serving-cert" not registered Mar 20 08:36:42.839775 master-0 kubenswrapper[7465]: E0320 08:36:42.833025 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert podName:14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:58.833008128 +0000 UTC m=+44.476323618 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert") pod "route-controller-manager-5ff6bdb8cf-b6499" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6") : object "openshift-route-controller-manager"/"serving-cert" not registered Mar 20 08:36:42.839775 master-0 kubenswrapper[7465]: E0320 08:36:42.833198 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: object "openshift-route-controller-manager"/"client-ca" not registered Mar 20 08:36:42.839775 master-0 kubenswrapper[7465]: E0320 08:36:42.833223 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca podName:14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6 nodeName:}" failed. No retries permitted until 2026-03-20 08:36:58.833215324 +0000 UTC m=+44.476530814 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca") pod "route-controller-manager-5ff6bdb8cf-b6499" (UID: "14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6") : object "openshift-route-controller-manager"/"client-ca" not registered Mar 20 08:36:42.839775 master-0 kubenswrapper[7465]: I0320 08:36:42.837147 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-xddmb"] Mar 20 08:36:42.866622 master-0 kubenswrapper[7465]: I0320 08:36:42.864523 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499"] Mar 20 08:36:42.868366 master-0 kubenswrapper[7465]: I0320 08:36:42.868336 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ff6bdb8cf-b6499"] Mar 20 08:36:42.875794 master-0 kubenswrapper[7465]: I0320 08:36:42.875740 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-v5h69" Mar 20 08:36:42.966246 master-0 kubenswrapper[7465]: I0320 08:36:42.957838 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-proxy-ca-bundles\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:42.966246 master-0 kubenswrapper[7465]: I0320 08:36:42.957908 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-serving-cert\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:42.966246 master-0 kubenswrapper[7465]: I0320 08:36:42.959140 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:42.966246 master-0 kubenswrapper[7465]: I0320 08:36:42.959298 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-config\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:42.966246 master-0 kubenswrapper[7465]: I0320 08:36:42.959332 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn9xv\" (UniqueName: 
\"kubernetes.io/projected/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-kube-api-access-cn9xv\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:42.966246 master-0 kubenswrapper[7465]: I0320 08:36:42.959400 7465 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:42.966246 master-0 kubenswrapper[7465]: I0320 08:36:42.959413 7465 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f5535dc-722d-4948-8c71-eac713e57af5-audit\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:42.966246 master-0 kubenswrapper[7465]: I0320 08:36:42.959426 7465 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:42.966246 master-0 kubenswrapper[7465]: I0320 08:36:42.959898 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-proxy-ca-bundles\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:42.966246 master-0 kubenswrapper[7465]: E0320 08:36:42.960007 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:42.966246 master-0 kubenswrapper[7465]: E0320 08:36:42.960065 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca podName:da399c95-96f3-4ea1-bd47-1d6fba7aae8f nodeName:}" failed. 
No retries permitted until 2026-03-20 08:36:43.460046395 +0000 UTC m=+29.103361875 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca") pod "controller-manager-5dcd9ffc84-qh6pq" (UID: "da399c95-96f3-4ea1-bd47-1d6fba7aae8f") : configmap "client-ca" not found Mar 20 08:36:42.966246 master-0 kubenswrapper[7465]: I0320 08:36:42.961129 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-config\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:42.980851 master-0 kubenswrapper[7465]: I0320 08:36:42.980009 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-serving-cert\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:42.980851 master-0 kubenswrapper[7465]: I0320 08:36:42.980772 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-qnp9w"] Mar 20 08:36:42.981693 master-0 kubenswrapper[7465]: I0320 08:36:42.981449 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qnp9w" Mar 20 08:36:42.992374 master-0 kubenswrapper[7465]: I0320 08:36:42.992324 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn9xv\" (UniqueName: \"kubernetes.io/projected/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-kube-api-access-cn9xv\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:43.064072 master-0 kubenswrapper[7465]: I0320 08:36:43.060741 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s2tb\" (UniqueName: \"kubernetes.io/projected/70020125-af49-47d7-8853-fb951c561dc4-kube-api-access-9s2tb\") pod \"node-resolver-qnp9w\" (UID: \"70020125-af49-47d7-8853-fb951c561dc4\") " pod="openshift-dns/node-resolver-qnp9w" Mar 20 08:36:43.064072 master-0 kubenswrapper[7465]: I0320 08:36:43.060989 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70020125-af49-47d7-8853-fb951c561dc4-hosts-file\") pod \"node-resolver-qnp9w\" (UID: \"70020125-af49-47d7-8853-fb951c561dc4\") " pod="openshift-dns/node-resolver-qnp9w" Mar 20 08:36:43.163628 master-0 kubenswrapper[7465]: I0320 08:36:43.163535 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s2tb\" (UniqueName: \"kubernetes.io/projected/70020125-af49-47d7-8853-fb951c561dc4-kube-api-access-9s2tb\") pod \"node-resolver-qnp9w\" (UID: \"70020125-af49-47d7-8853-fb951c561dc4\") " pod="openshift-dns/node-resolver-qnp9w" Mar 20 08:36:43.163872 master-0 kubenswrapper[7465]: I0320 08:36:43.163702 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70020125-af49-47d7-8853-fb951c561dc4-hosts-file\") pod 
\"node-resolver-qnp9w\" (UID: \"70020125-af49-47d7-8853-fb951c561dc4\") " pod="openshift-dns/node-resolver-qnp9w" Mar 20 08:36:43.164021 master-0 kubenswrapper[7465]: I0320 08:36:43.163978 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70020125-af49-47d7-8853-fb951c561dc4-hosts-file\") pod \"node-resolver-qnp9w\" (UID: \"70020125-af49-47d7-8853-fb951c561dc4\") " pod="openshift-dns/node-resolver-qnp9w" Mar 20 08:36:43.190248 master-0 kubenswrapper[7465]: I0320 08:36:43.188025 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s2tb\" (UniqueName: \"kubernetes.io/projected/70020125-af49-47d7-8853-fb951c561dc4-kube-api-access-9s2tb\") pod \"node-resolver-qnp9w\" (UID: \"70020125-af49-47d7-8853-fb951c561dc4\") " pod="openshift-dns/node-resolver-qnp9w" Mar 20 08:36:43.196359 master-0 kubenswrapper[7465]: I0320 08:36:43.195920 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v5h69"] Mar 20 08:36:43.318821 master-0 kubenswrapper[7465]: I0320 08:36:43.318608 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qnp9w" Mar 20 08:36:43.338858 master-0 kubenswrapper[7465]: W0320 08:36:43.338820 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70020125_af49_47d7_8853_fb951c561dc4.slice/crio-e78425b2d05a4bc50aa45d6948d8f1a6996398b351024eb248c91be99ea577d1 WatchSource:0}: Error finding container e78425b2d05a4bc50aa45d6948d8f1a6996398b351024eb248c91be99ea577d1: Status 404 returned error can't find the container with id e78425b2d05a4bc50aa45d6948d8f1a6996398b351024eb248c91be99ea577d1 Mar 20 08:36:43.468504 master-0 kubenswrapper[7465]: I0320 08:36:43.468040 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:43.468842 master-0 kubenswrapper[7465]: E0320 08:36:43.468312 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:43.469049 master-0 kubenswrapper[7465]: E0320 08:36:43.469034 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca podName:da399c95-96f3-4ea1-bd47-1d6fba7aae8f nodeName:}" failed. No retries permitted until 2026-03-20 08:36:44.469009378 +0000 UTC m=+30.112324868 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca") pod "controller-manager-5dcd9ffc84-qh6pq" (UID: "da399c95-96f3-4ea1-bd47-1d6fba7aae8f") : configmap "client-ca" not found Mar 20 08:36:43.664212 master-0 kubenswrapper[7465]: I0320 08:36:43.657469 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qnp9w" event={"ID":"70020125-af49-47d7-8853-fb951c561dc4","Type":"ContainerStarted","Data":"e78425b2d05a4bc50aa45d6948d8f1a6996398b351024eb248c91be99ea577d1"} Mar 20 08:36:43.682211 master-0 kubenswrapper[7465]: I0320 08:36:43.676997 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v5h69" event={"ID":"12e1d9e5-96b5-4367-81a5-d87b3f93d8da","Type":"ContainerStarted","Data":"989d132822ac99b97c52492bc7539dcc4d25a3a8fbced6fed73e66c9b3f74f8d"} Mar 20 08:36:43.682211 master-0 kubenswrapper[7465]: I0320 08:36:43.678472 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" event={"ID":"4fea9b05-222e-4b58-95c8-735fc1cf3a8b","Type":"ContainerStarted","Data":"92b79031f76eecd271206da72b0a3408ff8ea5659094905a6bd063d6847591cb"} Mar 20 08:36:43.682211 master-0 kubenswrapper[7465]: I0320 08:36:43.678497 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" event={"ID":"4fea9b05-222e-4b58-95c8-735fc1cf3a8b","Type":"ContainerStarted","Data":"02aff2f4c34f3ffc5f01d06a5769735a5d3c6b81311638c6d9a8ab1333acabbf"} Mar 20 08:36:43.682211 master-0 kubenswrapper[7465]: I0320 08:36:43.679502 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:36:43.711214 master-0 kubenswrapper[7465]: I0320 08:36:43.709615 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" event={"ID":"7e451189-850e-4d19-a40c-40f642d08511","Type":"ContainerStarted","Data":"546c7b24cfc5ec09edc9a677851e1c9898e06c55218cbc617714bc25cf6c07e6"} Mar 20 08:36:43.711214 master-0 kubenswrapper[7465]: I0320 08:36:43.709677 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:43.711214 master-0 kubenswrapper[7465]: I0320 08:36:43.709692 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" event={"ID":"7e451189-850e-4d19-a40c-40f642d08511","Type":"ContainerStarted","Data":"270e7ed792fece0ff9d9a6dbda1ff1ab238d9c5aab177de687ac26e9f4d69fcc"} Mar 20 08:36:43.726212 master-0 kubenswrapper[7465]: I0320 08:36:43.718434 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" podStartSLOduration=2.7183960860000003 podStartE2EDuration="2.718396086s" podCreationTimestamp="2026-03-20 08:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:36:43.716928643 +0000 UTC m=+29.360244153" watchObservedRunningTime="2026-03-20 08:36:43.718396086 +0000 UTC m=+29.361711576" Mar 20 08:36:43.745218 master-0 kubenswrapper[7465]: I0320 08:36:43.743447 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" podStartSLOduration=2.743398165 podStartE2EDuration="2.743398165s" podCreationTimestamp="2026-03-20 08:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:36:43.742289203 +0000 UTC 
m=+29.385604693" watchObservedRunningTime="2026-03-20 08:36:43.743398165 +0000 UTC m=+29.386713655" Mar 20 08:36:44.496738 master-0 kubenswrapper[7465]: I0320 08:36:44.496659 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:44.497133 master-0 kubenswrapper[7465]: E0320 08:36:44.496871 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:44.497133 master-0 kubenswrapper[7465]: E0320 08:36:44.496988 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca podName:da399c95-96f3-4ea1-bd47-1d6fba7aae8f nodeName:}" failed. No retries permitted until 2026-03-20 08:36:46.496957536 +0000 UTC m=+32.140273016 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca") pod "controller-manager-5dcd9ffc84-qh6pq" (UID: "da399c95-96f3-4ea1-bd47-1d6fba7aae8f") : configmap "client-ca" not found Mar 20 08:36:44.545111 master-0 kubenswrapper[7465]: I0320 08:36:44.544008 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f5535dc-722d-4948-8c71-eac713e57af5" path="/var/lib/kubelet/pods/0f5535dc-722d-4948-8c71-eac713e57af5/volumes" Mar 20 08:36:44.545111 master-0 kubenswrapper[7465]: I0320 08:36:44.544518 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6" path="/var/lib/kubelet/pods/14a3a13e-0fa4-4059-9aa0-3e167eb8e2f6/volumes" Mar 20 08:36:44.545111 master-0 kubenswrapper[7465]: I0320 08:36:44.544983 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39288b15-7841-4449-8814-4250f6ba5db0" path="/var/lib/kubelet/pods/39288b15-7841-4449-8814-4250f6ba5db0/volumes" Mar 20 08:36:44.719015 master-0 kubenswrapper[7465]: I0320 08:36:44.718205 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qnp9w" event={"ID":"70020125-af49-47d7-8853-fb951c561dc4","Type":"ContainerStarted","Data":"332fd47a2a02eef67172c0fb87a227f35ffda45b1b4e74194c1d5e85f7d71a60"} Mar 20 08:36:44.792340 master-0 kubenswrapper[7465]: I0320 08:36:44.792178 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qnp9w" podStartSLOduration=2.792152191 podStartE2EDuration="2.792152191s" podCreationTimestamp="2026-03-20 08:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:36:44.736033433 +0000 UTC m=+30.379348923" watchObservedRunningTime="2026-03-20 08:36:44.792152191 +0000 UTC m=+30.435467681" Mar 20 08:36:44.793205 master-0 kubenswrapper[7465]: 
I0320 08:36:44.793149 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-6fccff6869-f98nk"] Mar 20 08:36:44.794228 master-0 kubenswrapper[7465]: I0320 08:36:44.794206 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:44.798210 master-0 kubenswrapper[7465]: I0320 08:36:44.797540 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 08:36:44.798210 master-0 kubenswrapper[7465]: I0320 08:36:44.797867 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 08:36:44.799477 master-0 kubenswrapper[7465]: I0320 08:36:44.799439 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 08:36:44.799943 master-0 kubenswrapper[7465]: I0320 08:36:44.799696 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 08:36:44.801374 master-0 kubenswrapper[7465]: I0320 08:36:44.801307 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 08:36:44.801506 master-0 kubenswrapper[7465]: I0320 08:36:44.801479 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 08:36:44.801923 master-0 kubenswrapper[7465]: I0320 08:36:44.801881 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 08:36:44.802216 master-0 kubenswrapper[7465]: I0320 08:36:44.802177 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 08:36:44.889478 master-0 kubenswrapper[7465]: I0320 08:36:44.885445 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-6fccff6869-f98nk"] Mar 20 08:36:44.901394 master-0 kubenswrapper[7465]: I0320 08:36:44.901334 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-encryption-config\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:44.901394 master-0 kubenswrapper[7465]: I0320 08:36:44.901394 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-etcd-serving-ca\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:44.901719 master-0 kubenswrapper[7465]: I0320 08:36:44.901419 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-audit-policies\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:44.901719 master-0 kubenswrapper[7465]: I0320 08:36:44.901537 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr8gw\" (UniqueName: \"kubernetes.io/projected/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-kube-api-access-fr8gw\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:44.901861 master-0 kubenswrapper[7465]: I0320 08:36:44.901712 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-etcd-client\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:44.901923 master-0 kubenswrapper[7465]: I0320 08:36:44.901894 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-audit-dir\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:44.901963 master-0 kubenswrapper[7465]: I0320 08:36:44.901948 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-serving-cert\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:44.902134 master-0 kubenswrapper[7465]: I0320 08:36:44.902101 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-trusted-ca-bundle\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.003597 master-0 kubenswrapper[7465]: I0320 08:36:45.003530 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr8gw\" (UniqueName: \"kubernetes.io/projected/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-kube-api-access-fr8gw\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.003597 master-0 kubenswrapper[7465]: 
I0320 08:36:45.003607 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-etcd-client\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.004164 master-0 kubenswrapper[7465]: I0320 08:36:45.004113 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-audit-dir\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.004239 master-0 kubenswrapper[7465]: I0320 08:36:45.004202 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-serving-cert\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.004300 master-0 kubenswrapper[7465]: I0320 08:36:45.004279 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-trusted-ca-bundle\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.004381 master-0 kubenswrapper[7465]: I0320 08:36:45.004360 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-encryption-config\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 
20 08:36:45.004422 master-0 kubenswrapper[7465]: I0320 08:36:45.004400 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-etcd-serving-ca\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.004456 master-0 kubenswrapper[7465]: I0320 08:36:45.004426 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-audit-policies\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.005171 master-0 kubenswrapper[7465]: I0320 08:36:45.005139 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-audit-dir\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.005346 master-0 kubenswrapper[7465]: I0320 08:36:45.005313 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-audit-policies\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.005836 master-0 kubenswrapper[7465]: I0320 08:36:45.005805 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-etcd-serving-ca\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " 
pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.006157 master-0 kubenswrapper[7465]: I0320 08:36:45.006103 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-trusted-ca-bundle\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.009307 master-0 kubenswrapper[7465]: I0320 08:36:45.009265 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-serving-cert\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.009307 master-0 kubenswrapper[7465]: I0320 08:36:45.009268 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-etcd-client\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.009466 master-0 kubenswrapper[7465]: I0320 08:36:45.009413 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-encryption-config\") pod \"apiserver-6fccff6869-f98nk\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.022902 master-0 kubenswrapper[7465]: I0320 08:36:45.022835 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr8gw\" (UniqueName: \"kubernetes.io/projected/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-kube-api-access-fr8gw\") pod \"apiserver-6fccff6869-f98nk\" (UID: 
\"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.116249 master-0 kubenswrapper[7465]: I0320 08:36:45.116138 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:45.372334 master-0 kubenswrapper[7465]: I0320 08:36:45.371161 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-779f85678d-lrzfz"] Mar 20 08:36:45.372549 master-0 kubenswrapper[7465]: I0320 08:36:45.372408 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.373247 master-0 kubenswrapper[7465]: I0320 08:36:45.373163 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm"] Mar 20 08:36:45.374393 master-0 kubenswrapper[7465]: I0320 08:36:45.374306 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 08:36:45.376248 master-0 kubenswrapper[7465]: I0320 08:36:45.375474 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:45.376248 master-0 kubenswrapper[7465]: I0320 08:36:45.375499 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 08:36:45.376248 master-0 kubenswrapper[7465]: I0320 08:36:45.375754 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 08:36:45.376248 master-0 kubenswrapper[7465]: I0320 08:36:45.375890 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 08:36:45.382052 master-0 kubenswrapper[7465]: I0320 08:36:45.382000 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm"] Mar 20 08:36:45.389374 master-0 kubenswrapper[7465]: I0320 08:36:45.385167 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 08:36:45.389374 master-0 kubenswrapper[7465]: I0320 08:36:45.385437 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 08:36:45.389374 master-0 kubenswrapper[7465]: I0320 08:36:45.388087 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 08:36:45.389598 master-0 kubenswrapper[7465]: I0320 08:36:45.389500 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 08:36:45.391903 master-0 kubenswrapper[7465]: I0320 08:36:45.389904 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 08:36:45.391903 master-0 kubenswrapper[7465]: I0320 08:36:45.390095 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" 
Mar 20 08:36:45.391903 master-0 kubenswrapper[7465]: I0320 08:36:45.390289 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 08:36:45.391903 master-0 kubenswrapper[7465]: I0320 08:36:45.390415 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 08:36:45.391903 master-0 kubenswrapper[7465]: I0320 08:36:45.390482 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 08:36:45.391903 master-0 kubenswrapper[7465]: I0320 08:36:45.390661 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 08:36:45.391903 master-0 kubenswrapper[7465]: I0320 08:36:45.390774 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-779f85678d-lrzfz"] Mar 20 08:36:45.391903 master-0 kubenswrapper[7465]: I0320 08:36:45.391478 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 08:36:45.511833 master-0 kubenswrapper[7465]: I0320 08:36:45.511700 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lk55\" (UniqueName: \"kubernetes.io/projected/0b4262e0-2454-43e1-a9f8-57981354b35b-kube-api-access-7lk55\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:45.511833 master-0 kubenswrapper[7465]: I0320 08:36:45.511769 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-etcd-serving-ca\") pod \"apiserver-779f85678d-lrzfz\" (UID: 
\"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.511833 master-0 kubenswrapper[7465]: I0320 08:36:45.511793 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:45.511833 master-0 kubenswrapper[7465]: I0320 08:36:45.511818 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46de2acc-9f5d-4ecf-befe-a480f86466f5-node-pullsecrets\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.512178 master-0 kubenswrapper[7465]: I0320 08:36:45.511876 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-trusted-ca-bundle\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.512178 master-0 kubenswrapper[7465]: I0320 08:36:45.511902 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-config\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:45.512178 master-0 kubenswrapper[7465]: I0320 08:36:45.512034 7465 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f9vt\" (UniqueName: \"kubernetes.io/projected/46de2acc-9f5d-4ecf-befe-a480f86466f5-kube-api-access-4f9vt\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.512178 master-0 kubenswrapper[7465]: I0320 08:36:45.512157 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46de2acc-9f5d-4ecf-befe-a480f86466f5-audit-dir\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.512305 master-0 kubenswrapper[7465]: I0320 08:36:45.512201 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-audit\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.512305 master-0 kubenswrapper[7465]: I0320 08:36:45.512242 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-image-import-ca\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.512378 master-0 kubenswrapper[7465]: I0320 08:36:45.512342 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-etcd-client\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " 
pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.512471 master-0 kubenswrapper[7465]: I0320 08:36:45.512434 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-config\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.512550 master-0 kubenswrapper[7465]: I0320 08:36:45.512519 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b4262e0-2454-43e1-a9f8-57981354b35b-serving-cert\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:45.512589 master-0 kubenswrapper[7465]: I0320 08:36:45.512560 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-encryption-config\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.512637 master-0 kubenswrapper[7465]: I0320 08:36:45.512599 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-serving-cert\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.614490 master-0 kubenswrapper[7465]: I0320 08:36:45.614430 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lk55\" (UniqueName: 
\"kubernetes.io/projected/0b4262e0-2454-43e1-a9f8-57981354b35b-kube-api-access-7lk55\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:45.614490 master-0 kubenswrapper[7465]: I0320 08:36:45.614485 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:45.614776 master-0 kubenswrapper[7465]: I0320 08:36:45.614510 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-etcd-serving-ca\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.614776 master-0 kubenswrapper[7465]: I0320 08:36:45.614681 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46de2acc-9f5d-4ecf-befe-a480f86466f5-node-pullsecrets\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.614901 master-0 kubenswrapper[7465]: I0320 08:36:45.614882 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-trusted-ca-bundle\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.614952 master-0 kubenswrapper[7465]: 
I0320 08:36:45.614935 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-config\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:45.614984 master-0 kubenswrapper[7465]: I0320 08:36:45.614964 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f9vt\" (UniqueName: \"kubernetes.io/projected/46de2acc-9f5d-4ecf-befe-a480f86466f5-kube-api-access-4f9vt\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.615017 master-0 kubenswrapper[7465]: I0320 08:36:45.614996 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46de2acc-9f5d-4ecf-befe-a480f86466f5-audit-dir\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.615045 master-0 kubenswrapper[7465]: I0320 08:36:45.615023 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-audit\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.615075 master-0 kubenswrapper[7465]: I0320 08:36:45.615042 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-image-import-ca\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" 
Mar 20 08:36:45.615255 master-0 kubenswrapper[7465]: I0320 08:36:45.615177 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-etcd-client\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.615295 master-0 kubenswrapper[7465]: I0320 08:36:45.615252 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46de2acc-9f5d-4ecf-befe-a480f86466f5-audit-dir\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.615334 master-0 kubenswrapper[7465]: I0320 08:36:45.615322 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-config\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.615376 master-0 kubenswrapper[7465]: I0320 08:36:45.615364 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b4262e0-2454-43e1-a9f8-57981354b35b-serving-cert\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:45.615412 master-0 kubenswrapper[7465]: I0320 08:36:45.615383 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-encryption-config\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " 
pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.615412 master-0 kubenswrapper[7465]: I0320 08:36:45.615405 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-serving-cert\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.615778 master-0 kubenswrapper[7465]: I0320 08:36:45.615718 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-etcd-serving-ca\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.615871 master-0 kubenswrapper[7465]: E0320 08:36:45.615843 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:45.615932 master-0 kubenswrapper[7465]: E0320 08:36:45.615919 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca podName:0b4262e0-2454-43e1-a9f8-57981354b35b nodeName:}" failed. No retries permitted until 2026-03-20 08:36:46.115896959 +0000 UTC m=+31.759212449 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca") pod "route-controller-manager-74cf48bcc6-kwngm" (UID: "0b4262e0-2454-43e1-a9f8-57981354b35b") : configmap "client-ca" not found Mar 20 08:36:45.615980 master-0 kubenswrapper[7465]: I0320 08:36:45.615933 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-image-import-ca\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.616774 master-0 kubenswrapper[7465]: I0320 08:36:45.616749 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-audit\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.616774 master-0 kubenswrapper[7465]: I0320 08:36:45.616762 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-trusted-ca-bundle\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.617503 master-0 kubenswrapper[7465]: I0320 08:36:45.614801 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46de2acc-9f5d-4ecf-befe-a480f86466f5-node-pullsecrets\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.617647 master-0 kubenswrapper[7465]: I0320 08:36:45.617621 7465 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-config\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.618829 master-0 kubenswrapper[7465]: I0320 08:36:45.618670 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-config\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:45.620657 master-0 kubenswrapper[7465]: I0320 08:36:45.620621 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b4262e0-2454-43e1-a9f8-57981354b35b-serving-cert\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:45.620818 master-0 kubenswrapper[7465]: I0320 08:36:45.620764 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-encryption-config\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.621602 master-0 kubenswrapper[7465]: I0320 08:36:45.621566 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-serving-cert\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.625550 master-0 kubenswrapper[7465]: I0320 08:36:45.623538 7465 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-etcd-client\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.667242 master-0 kubenswrapper[7465]: I0320 08:36:45.667163 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f9vt\" (UniqueName: \"kubernetes.io/projected/46de2acc-9f5d-4ecf-befe-a480f86466f5-kube-api-access-4f9vt\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:45.678278 master-0 kubenswrapper[7465]: I0320 08:36:45.677967 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lk55\" (UniqueName: \"kubernetes.io/projected/0b4262e0-2454-43e1-a9f8-57981354b35b-kube-api-access-7lk55\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:45.707286 master-0 kubenswrapper[7465]: I0320 08:36:45.707177 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:36:46.128753 master-0 kubenswrapper[7465]: I0320 08:36:46.128695 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:46.129292 master-0 kubenswrapper[7465]: E0320 08:36:46.128893 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:46.129292 master-0 kubenswrapper[7465]: E0320 08:36:46.128996 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca podName:0b4262e0-2454-43e1-a9f8-57981354b35b nodeName:}" failed. No retries permitted until 2026-03-20 08:36:47.128972052 +0000 UTC m=+32.772287542 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca") pod "route-controller-manager-74cf48bcc6-kwngm" (UID: "0b4262e0-2454-43e1-a9f8-57981354b35b") : configmap "client-ca" not found Mar 20 08:36:46.224366 master-0 kubenswrapper[7465]: I0320 08:36:46.224307 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:46.224559 master-0 kubenswrapper[7465]: I0320 08:36:46.224533 7465 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:36:46.252123 master-0 kubenswrapper[7465]: I0320 08:36:46.252078 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:36:46.259630 master-0 kubenswrapper[7465]: I0320 08:36:46.259580 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-6fccff6869-f98nk"] Mar 20 08:36:46.322625 master-0 kubenswrapper[7465]: I0320 08:36:46.322563 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-779f85678d-lrzfz"] Mar 20 08:36:46.540327 master-0 kubenswrapper[7465]: I0320 08:36:46.540242 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:46.540637 master-0 kubenswrapper[7465]: E0320 08:36:46.540523 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:46.540637 master-0 kubenswrapper[7465]: E0320 08:36:46.540593 7465 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca podName:da399c95-96f3-4ea1-bd47-1d6fba7aae8f nodeName:}" failed. No retries permitted until 2026-03-20 08:36:50.540572194 +0000 UTC m=+36.183887684 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca") pod "controller-manager-5dcd9ffc84-qh6pq" (UID: "da399c95-96f3-4ea1-bd47-1d6fba7aae8f") : configmap "client-ca" not found Mar 20 08:36:46.730492 master-0 kubenswrapper[7465]: I0320 08:36:46.730297 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" event={"ID":"e8105468-9ff4-4b94-aa60-90ad9ebbbba1","Type":"ContainerStarted","Data":"6540b4c8b779998910bcc3e1f2484c1d0a29dc61611857e8b378e134a5d31c38"} Mar 20 08:36:46.732711 master-0 kubenswrapper[7465]: I0320 08:36:46.732686 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v5h69" event={"ID":"12e1d9e5-96b5-4367-81a5-d87b3f93d8da","Type":"ContainerStarted","Data":"2cc9c806c560b46d56c6690ddba6f9750c6827399375c05fca98bbd78728f7b8"} Mar 20 08:36:46.732786 master-0 kubenswrapper[7465]: I0320 08:36:46.732717 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v5h69" event={"ID":"12e1d9e5-96b5-4367-81a5-d87b3f93d8da","Type":"ContainerStarted","Data":"4d3b35b91bdc99e207b007c75a667fa870de699431a48d9ff6499d3e08d2063c"} Mar 20 08:36:46.733415 master-0 kubenswrapper[7465]: I0320 08:36:46.733385 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-v5h69" Mar 20 08:36:46.734733 master-0 kubenswrapper[7465]: I0320 08:36:46.734653 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-779f85678d-lrzfz" 
event={"ID":"46de2acc-9f5d-4ecf-befe-a480f86466f5","Type":"ContainerStarted","Data":"bc3668412459475b58df22c5952b6fe210803ae27cac46ab11b8236701860e95"} Mar 20 08:36:46.754351 master-0 kubenswrapper[7465]: I0320 08:36:46.752706 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-v5h69" podStartSLOduration=2.078023371 podStartE2EDuration="4.752677514s" podCreationTimestamp="2026-03-20 08:36:42 +0000 UTC" firstStartedPulling="2026-03-20 08:36:43.223151383 +0000 UTC m=+28.866466873" lastFinishedPulling="2026-03-20 08:36:45.897805526 +0000 UTC m=+31.541121016" observedRunningTime="2026-03-20 08:36:46.750844061 +0000 UTC m=+32.394159561" watchObservedRunningTime="2026-03-20 08:36:46.752677514 +0000 UTC m=+32.395993014" Mar 20 08:36:47.149349 master-0 kubenswrapper[7465]: I0320 08:36:47.148921 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:47.149349 master-0 kubenswrapper[7465]: E0320 08:36:47.149297 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:47.150025 master-0 kubenswrapper[7465]: E0320 08:36:47.149378 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca podName:0b4262e0-2454-43e1-a9f8-57981354b35b nodeName:}" failed. No retries permitted until 2026-03-20 08:36:49.14935345 +0000 UTC m=+34.792668940 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca") pod "route-controller-manager-74cf48bcc6-kwngm" (UID: "0b4262e0-2454-43e1-a9f8-57981354b35b") : configmap "client-ca" not found Mar 20 08:36:47.352341 master-0 kubenswrapper[7465]: I0320 08:36:47.352279 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:47.352341 master-0 kubenswrapper[7465]: I0320 08:36:47.352345 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:47.353178 master-0 kubenswrapper[7465]: I0320 08:36:47.352838 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:47.353178 master-0 kubenswrapper[7465]: I0320 08:36:47.353029 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " 
pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:47.354088 master-0 kubenswrapper[7465]: I0320 08:36:47.353481 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:47.354088 master-0 kubenswrapper[7465]: I0320 08:36:47.353584 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:47.354088 master-0 kubenswrapper[7465]: I0320 08:36:47.353641 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:47.354088 master-0 kubenswrapper[7465]: I0320 08:36:47.353705 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:47.354088 master-0 kubenswrapper[7465]: I0320 08:36:47.353731 7465 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:47.356372 master-0 kubenswrapper[7465]: I0320 08:36:47.356266 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:47.357221 master-0 kubenswrapper[7465]: I0320 08:36:47.357107 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:47.357950 master-0 kubenswrapper[7465]: I0320 08:36:47.357909 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:47.362702 master-0 kubenswrapper[7465]: I0320 08:36:47.362652 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " 
pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:47.362805 master-0 kubenswrapper[7465]: I0320 08:36:47.362769 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:47.365233 master-0 kubenswrapper[7465]: I0320 08:36:47.365176 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:47.365531 master-0 kubenswrapper[7465]: I0320 08:36:47.365460 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:47.365833 master-0 kubenswrapper[7465]: I0320 08:36:47.365800 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:47.369632 master-0 kubenswrapper[7465]: I0320 08:36:47.369526 7465 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:47.455264 master-0 kubenswrapper[7465]: I0320 08:36:47.455083 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:47.458885 master-0 kubenswrapper[7465]: I0320 08:36:47.458831 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:47.588818 master-0 kubenswrapper[7465]: I0320 08:36:47.588742 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:36:47.589818 master-0 kubenswrapper[7465]: I0320 08:36:47.589790 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:36:47.596475 master-0 kubenswrapper[7465]: I0320 08:36:47.596427 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:36:47.599677 master-0 kubenswrapper[7465]: I0320 08:36:47.599580 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:36:47.600162 master-0 kubenswrapper[7465]: I0320 08:36:47.600127 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:36:47.600450 master-0 kubenswrapper[7465]: I0320 08:36:47.600388 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:36:47.600577 master-0 kubenswrapper[7465]: I0320 08:36:47.600548 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:36:47.612501 master-0 kubenswrapper[7465]: I0320 08:36:47.612460 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:36:47.612617 master-0 kubenswrapper[7465]: I0320 08:36:47.612501 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:36:47.861020 master-0 kubenswrapper[7465]: I0320 08:36:47.860950 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77"] Mar 20 08:36:48.129593 master-0 kubenswrapper[7465]: I0320 08:36:48.129158 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l"] Mar 20 08:36:48.334765 master-0 kubenswrapper[7465]: I0320 08:36:48.334705 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"] Mar 20 08:36:48.346532 master-0 kubenswrapper[7465]: I0320 08:36:48.343787 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"] Mar 20 08:36:48.346532 master-0 kubenswrapper[7465]: I0320 08:36:48.343855 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"] Mar 20 08:36:48.465947 master-0 kubenswrapper[7465]: I0320 08:36:48.465845 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"] Mar 20 08:36:48.468015 master-0 kubenswrapper[7465]: I0320 08:36:48.467990 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-srdjm"] Mar 20 08:36:48.555223 master-0 kubenswrapper[7465]: I0320 08:36:48.555152 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28"] Mar 20 08:36:48.591414 master-0 kubenswrapper[7465]: I0320 08:36:48.591335 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"] Mar 20 08:36:48.750725 master-0 kubenswrapper[7465]: I0320 08:36:48.750655 7465 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" event={"ID":"bbc0b783-28d5-4554-b49d-c66082546f44","Type":"ContainerStarted","Data":"86d25298a72dddc328867a1ea8164a3314dce7f2eff3eb07267e45c914c0415e"} Mar 20 08:36:48.750725 master-0 kubenswrapper[7465]: I0320 08:36:48.750729 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" event={"ID":"bbc0b783-28d5-4554-b49d-c66082546f44","Type":"ContainerStarted","Data":"21791b9b344da8b052097bc3f6be11ec8238d51625fab3e6901854f679a950ba"} Mar 20 08:36:49.057482 master-0 kubenswrapper[7465]: W0320 08:36:49.054064 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf53bc282_5937_49ac_ac98_2ee37ccb268d.slice/crio-c1a730f8ab11fc1de19f12e0bc73ab4daa97d0d1b64ae152032167057be533ed WatchSource:0}: Error finding container c1a730f8ab11fc1de19f12e0bc73ab4daa97d0d1b64ae152032167057be533ed: Status 404 returned error can't find the container with id c1a730f8ab11fc1de19f12e0bc73ab4daa97d0d1b64ae152032167057be533ed Mar 20 08:36:49.205553 master-0 kubenswrapper[7465]: I0320 08:36:49.205426 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:49.205935 master-0 kubenswrapper[7465]: E0320 08:36:49.205592 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:49.205935 master-0 kubenswrapper[7465]: E0320 08:36:49.205667 7465 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca podName:0b4262e0-2454-43e1-a9f8-57981354b35b nodeName:}" failed. No retries permitted until 2026-03-20 08:36:53.205645318 +0000 UTC m=+38.848960808 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca") pod "route-controller-manager-74cf48bcc6-kwngm" (UID: "0b4262e0-2454-43e1-a9f8-57981354b35b") : configmap "client-ca" not found Mar 20 08:36:49.756420 master-0 kubenswrapper[7465]: I0320 08:36:49.756323 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" event={"ID":"df428d5a-c722-4536-8e7f-cdd85c560481","Type":"ContainerStarted","Data":"edc62dc83d0212adeb196aa9fb63d28b17a6054a019750eef25f143d8b2816f1"} Mar 20 08:36:49.758050 master-0 kubenswrapper[7465]: I0320 08:36:49.758004 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" event={"ID":"7b489385-2c96-4a97-8b31-362162de020e","Type":"ContainerStarted","Data":"8a61c21711f690cdda83fe881555e8ad64b01a2f6d1c312d8da79d83d36082f5"} Mar 20 08:36:49.759547 master-0 kubenswrapper[7465]: I0320 08:36:49.759490 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" event={"ID":"f53bc282-5937-49ac-ac98-2ee37ccb268d","Type":"ContainerStarted","Data":"c1a730f8ab11fc1de19f12e0bc73ab4daa97d0d1b64ae152032167057be533ed"} Mar 20 08:36:49.760572 master-0 kubenswrapper[7465]: I0320 08:36:49.760463 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" event={"ID":"6a80bd6f-2263-4251-8197-5173193f8afc","Type":"ContainerStarted","Data":"4d57d4740bcb6c82be46e50156208d72579c0c7e7b87226d99fe9e3e9e1ff1bb"} Mar 20 08:36:49.761726 master-0 kubenswrapper[7465]: 
I0320 08:36:49.761685 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" event={"ID":"45e8b72b-564c-4bb1-b911-baff2d6c87ad","Type":"ContainerStarted","Data":"948f733f9e7fc399ff3028ac75f39dbd9ac2f6622b269cc750e23eb9c88dedb1"} Mar 20 08:36:49.763046 master-0 kubenswrapper[7465]: I0320 08:36:49.762943 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" event={"ID":"acb704a9-6c8d-4378-ae93-e7095b1fce85","Type":"ContainerStarted","Data":"d1f4c3462eb562d7885b549a3182d1636527f9d646efb4fbbe9ff562004c787d"} Mar 20 08:36:49.764165 master-0 kubenswrapper[7465]: I0320 08:36:49.764108 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-srdjm" event={"ID":"813f91c2-2b37-4681-968d-4217e286e22f","Type":"ContainerStarted","Data":"48ea9b1e1ed051eaf5386ce4d24d2d55f57d357f51f1c79f94723fc2aed83c0f"} Mar 20 08:36:50.223780 master-0 kubenswrapper[7465]: I0320 08:36:50.223623 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-6fccff6869-f98nk"] Mar 20 08:36:50.636482 master-0 kubenswrapper[7465]: I0320 08:36:50.636341 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:50.636801 master-0 kubenswrapper[7465]: E0320 08:36:50.636518 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:50.636801 master-0 kubenswrapper[7465]: E0320 08:36:50.636627 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca 
podName:da399c95-96f3-4ea1-bd47-1d6fba7aae8f nodeName:}" failed. No retries permitted until 2026-03-20 08:36:58.636594307 +0000 UTC m=+44.279909967 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca") pod "controller-manager-5dcd9ffc84-qh6pq" (UID: "da399c95-96f3-4ea1-bd47-1d6fba7aae8f") : configmap "client-ca" not found Mar 20 08:36:51.209217 master-0 kubenswrapper[7465]: W0320 08:36:51.207039 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42df77ec_94aa_48ba_bb35_7b1f1e8b8e97.slice/crio-ac63789abc1163c5f9db4788ba7a388635d218ac761e65be8043709a0019d115 WatchSource:0}: Error finding container ac63789abc1163c5f9db4788ba7a388635d218ac761e65be8043709a0019d115: Status 404 returned error can't find the container with id ac63789abc1163c5f9db4788ba7a388635d218ac761e65be8043709a0019d115 Mar 20 08:36:51.693257 master-0 kubenswrapper[7465]: I0320 08:36:51.691021 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:36:51.930739 master-0 kubenswrapper[7465]: I0320 08:36:51.929438 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:36:52.075965 master-0 kubenswrapper[7465]: I0320 08:36:52.075786 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:36:52.084200 master-0 kubenswrapper[7465]: I0320 08:36:52.084109 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" 
event={"ID":"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97","Type":"ContainerStarted","Data":"ba0427f5fde4559006d8e5e0960edad997c51f4ea55994718ca7aa91f3b87a5b"} Mar 20 08:36:52.084200 master-0 kubenswrapper[7465]: I0320 08:36:52.084151 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" event={"ID":"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97","Type":"ContainerStarted","Data":"becf6a9468ee5d2197c4916442372c9501293c27732b04e68b431411779a05c6"} Mar 20 08:36:52.084200 master-0 kubenswrapper[7465]: I0320 08:36:52.084161 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" event={"ID":"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97","Type":"ContainerStarted","Data":"ac63789abc1163c5f9db4788ba7a388635d218ac761e65be8043709a0019d115"} Mar 20 08:36:52.086284 master-0 kubenswrapper[7465]: I0320 08:36:52.086254 7465 generic.go:334] "Generic (PLEG): container finished" podID="e8105468-9ff4-4b94-aa60-90ad9ebbbba1" containerID="e8ffd81691ae32cf9c4d42f6843b3a0ae25e007353a2a7b2169f4c90ee9786f0" exitCode=0 Mar 20 08:36:52.086346 master-0 kubenswrapper[7465]: I0320 08:36:52.086329 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" event={"ID":"e8105468-9ff4-4b94-aa60-90ad9ebbbba1","Type":"ContainerDied","Data":"e8ffd81691ae32cf9c4d42f6843b3a0ae25e007353a2a7b2169f4c90ee9786f0"} Mar 20 08:36:52.089829 master-0 kubenswrapper[7465]: I0320 08:36:52.089742 7465 generic.go:334] "Generic (PLEG): container finished" podID="46de2acc-9f5d-4ecf-befe-a480f86466f5" containerID="a5663c5e028603466a885bf8e6c2930eae2c60da4e4cb920e82d9e29e7d29f42" exitCode=0 Mar 20 08:36:52.089829 master-0 kubenswrapper[7465]: I0320 08:36:52.089817 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-779f85678d-lrzfz" 
event={"ID":"46de2acc-9f5d-4ecf-befe-a480f86466f5","Type":"ContainerDied","Data":"a5663c5e028603466a885bf8e6c2930eae2c60da4e4cb920e82d9e29e7d29f42"} Mar 20 08:36:52.494621 master-0 kubenswrapper[7465]: I0320 08:36:52.494580 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:52.618460 master-0 kubenswrapper[7465]: I0320 08:36:52.618247 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-etcd-client\") pod \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " Mar 20 08:36:52.618460 master-0 kubenswrapper[7465]: I0320 08:36:52.618306 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-encryption-config\") pod \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " Mar 20 08:36:52.618460 master-0 kubenswrapper[7465]: I0320 08:36:52.618337 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-audit-policies\") pod \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " Mar 20 08:36:52.618460 master-0 kubenswrapper[7465]: I0320 08:36:52.618373 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-etcd-serving-ca\") pod \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " Mar 20 08:36:52.618460 master-0 kubenswrapper[7465]: I0320 08:36:52.618427 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fr8gw\" (UniqueName: \"kubernetes.io/projected/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-kube-api-access-fr8gw\") pod \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " Mar 20 08:36:52.619421 master-0 kubenswrapper[7465]: I0320 08:36:52.619395 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e8105468-9ff4-4b94-aa60-90ad9ebbbba1" (UID: "e8105468-9ff4-4b94-aa60-90ad9ebbbba1"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:36:52.619510 master-0 kubenswrapper[7465]: I0320 08:36:52.619418 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "e8105468-9ff4-4b94-aa60-90ad9ebbbba1" (UID: "e8105468-9ff4-4b94-aa60-90ad9ebbbba1"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:36:52.620177 master-0 kubenswrapper[7465]: I0320 08:36:52.620080 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-trusted-ca-bundle\") pod \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " Mar 20 08:36:52.620177 master-0 kubenswrapper[7465]: I0320 08:36:52.620123 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-serving-cert\") pod \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " Mar 20 08:36:52.620177 master-0 kubenswrapper[7465]: I0320 08:36:52.620167 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-audit-dir\") pod \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\" (UID: \"e8105468-9ff4-4b94-aa60-90ad9ebbbba1\") " Mar 20 08:36:52.620550 master-0 kubenswrapper[7465]: I0320 08:36:52.620515 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e8105468-9ff4-4b94-aa60-90ad9ebbbba1" (UID: "e8105468-9ff4-4b94-aa60-90ad9ebbbba1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:36:52.620718 master-0 kubenswrapper[7465]: I0320 08:36:52.620647 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e8105468-9ff4-4b94-aa60-90ad9ebbbba1" (UID: "e8105468-9ff4-4b94-aa60-90ad9ebbbba1"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:36:52.620854 master-0 kubenswrapper[7465]: I0320 08:36:52.620702 7465 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:52.620854 master-0 kubenswrapper[7465]: I0320 08:36:52.620785 7465 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:52.620854 master-0 kubenswrapper[7465]: I0320 08:36:52.620806 7465 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:52.624447 master-0 kubenswrapper[7465]: I0320 08:36:52.624296 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "e8105468-9ff4-4b94-aa60-90ad9ebbbba1" (UID: "e8105468-9ff4-4b94-aa60-90ad9ebbbba1"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:36:52.624447 master-0 kubenswrapper[7465]: I0320 08:36:52.624424 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-kube-api-access-fr8gw" (OuterVolumeSpecName: "kube-api-access-fr8gw") pod "e8105468-9ff4-4b94-aa60-90ad9ebbbba1" (UID: "e8105468-9ff4-4b94-aa60-90ad9ebbbba1"). InnerVolumeSpecName "kube-api-access-fr8gw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:36:52.631562 master-0 kubenswrapper[7465]: I0320 08:36:52.631497 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e8105468-9ff4-4b94-aa60-90ad9ebbbba1" (UID: "e8105468-9ff4-4b94-aa60-90ad9ebbbba1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:36:52.640000 master-0 kubenswrapper[7465]: I0320 08:36:52.639856 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "e8105468-9ff4-4b94-aa60-90ad9ebbbba1" (UID: "e8105468-9ff4-4b94-aa60-90ad9ebbbba1"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:36:52.725524 master-0 kubenswrapper[7465]: I0320 08:36:52.724494 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr8gw\" (UniqueName: \"kubernetes.io/projected/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-kube-api-access-fr8gw\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:52.725524 master-0 kubenswrapper[7465]: I0320 08:36:52.724552 7465 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:52.725524 master-0 kubenswrapper[7465]: I0320 08:36:52.724566 7465 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:52.725524 master-0 kubenswrapper[7465]: I0320 08:36:52.724578 7465 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:52.725524 master-0 kubenswrapper[7465]: I0320 08:36:52.724592 7465 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8105468-9ff4-4b94-aa60-90ad9ebbbba1-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 20 08:36:53.099638 master-0 kubenswrapper[7465]: I0320 08:36:53.099561 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" event={"ID":"e8105468-9ff4-4b94-aa60-90ad9ebbbba1","Type":"ContainerDied","Data":"6540b4c8b779998910bcc3e1f2484c1d0a29dc61611857e8b378e134a5d31c38"} Mar 20 08:36:53.099638 master-0 kubenswrapper[7465]: I0320 08:36:53.099652 7465 scope.go:117] "RemoveContainer" containerID="e8ffd81691ae32cf9c4d42f6843b3a0ae25e007353a2a7b2169f4c90ee9786f0" Mar 20 08:36:53.100317 master-0 kubenswrapper[7465]: I0320 08:36:53.099775 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-6fccff6869-f98nk" Mar 20 08:36:53.110994 master-0 kubenswrapper[7465]: I0320 08:36:53.110910 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-779f85678d-lrzfz" event={"ID":"46de2acc-9f5d-4ecf-befe-a480f86466f5","Type":"ContainerStarted","Data":"bcb35fb659786fc08cf75d55d32a2988d612f73857d18fd89b1d7870b73afb52"} Mar 20 08:36:53.236284 master-0 kubenswrapper[7465]: I0320 08:36:53.235699 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2"] Mar 20 08:36:53.236682 master-0 kubenswrapper[7465]: I0320 08:36:53.236591 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:36:53.240929 master-0 kubenswrapper[7465]: E0320 08:36:53.237601 7465 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:53.240929 master-0 kubenswrapper[7465]: E0320 08:36:53.237682 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca podName:0b4262e0-2454-43e1-a9f8-57981354b35b nodeName:}" failed. No retries permitted until 2026-03-20 08:37:01.237656673 +0000 UTC m=+46.880972163 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca") pod "route-controller-manager-74cf48bcc6-kwngm" (UID: "0b4262e0-2454-43e1-a9f8-57981354b35b") : configmap "client-ca" not found Mar 20 08:36:53.240929 master-0 kubenswrapper[7465]: E0320 08:36:53.238352 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8105468-9ff4-4b94-aa60-90ad9ebbbba1" containerName="fix-audit-permissions" Mar 20 08:36:53.240929 master-0 kubenswrapper[7465]: I0320 08:36:53.238386 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8105468-9ff4-4b94-aa60-90ad9ebbbba1" containerName="fix-audit-permissions" Mar 20 08:36:53.240929 master-0 kubenswrapper[7465]: I0320 08:36:53.238480 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8105468-9ff4-4b94-aa60-90ad9ebbbba1" containerName="fix-audit-permissions" Mar 20 08:36:53.240929 master-0 kubenswrapper[7465]: I0320 08:36:53.239046 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.244989 master-0 kubenswrapper[7465]: I0320 08:36:53.243623 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 08:36:53.244989 master-0 kubenswrapper[7465]: I0320 08:36:53.243958 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 08:36:53.244989 master-0 kubenswrapper[7465]: I0320 08:36:53.244602 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 08:36:53.256294 master-0 kubenswrapper[7465]: I0320 08:36:53.253241 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-6fccff6869-f98nk"] Mar 20 08:36:53.256294 master-0 kubenswrapper[7465]: I0320 08:36:53.253833 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 08:36:53.256294 master-0 kubenswrapper[7465]: I0320 08:36:53.254075 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 08:36:53.256294 master-0 kubenswrapper[7465]: I0320 08:36:53.254229 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 08:36:53.256294 master-0 kubenswrapper[7465]: I0320 08:36:53.254335 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 08:36:53.256294 master-0 kubenswrapper[7465]: I0320 08:36:53.254766 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 08:36:53.256294 master-0 kubenswrapper[7465]: I0320 08:36:53.254892 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2"] Mar 20 
08:36:53.263319 master-0 kubenswrapper[7465]: I0320 08:36:53.261506 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-oauth-apiserver/apiserver-6fccff6869-f98nk"] Mar 20 08:36:53.349205 master-0 kubenswrapper[7465]: I0320 08:36:53.342046 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-etcd-serving-ca\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.349205 master-0 kubenswrapper[7465]: I0320 08:36:53.342122 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-etcd-client\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.349205 master-0 kubenswrapper[7465]: I0320 08:36:53.342148 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-serving-cert\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.349205 master-0 kubenswrapper[7465]: I0320 08:36:53.342175 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr8br\" (UniqueName: \"kubernetes.io/projected/3de37144-a9ab-45fb-a23f-2287a5198edf-kube-api-access-zr8br\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.349205 master-0 kubenswrapper[7465]: I0320 08:36:53.343497 7465 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-encryption-config\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.349205 master-0 kubenswrapper[7465]: I0320 08:36:53.343602 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-audit-policies\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.349205 master-0 kubenswrapper[7465]: I0320 08:36:53.343682 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3de37144-a9ab-45fb-a23f-2287a5198edf-audit-dir\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.349205 master-0 kubenswrapper[7465]: I0320 08:36:53.343764 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-trusted-ca-bundle\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.445333 master-0 kubenswrapper[7465]: I0320 08:36:53.445171 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-audit-policies\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " 
pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.445333 master-0 kubenswrapper[7465]: I0320 08:36:53.445248 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3de37144-a9ab-45fb-a23f-2287a5198edf-audit-dir\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.445333 master-0 kubenswrapper[7465]: I0320 08:36:53.445289 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-trusted-ca-bundle\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.445333 master-0 kubenswrapper[7465]: I0320 08:36:53.445321 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-etcd-serving-ca\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.445646 master-0 kubenswrapper[7465]: I0320 08:36:53.445350 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-etcd-client\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.445646 master-0 kubenswrapper[7465]: I0320 08:36:53.445371 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-serving-cert\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: 
\"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.445646 master-0 kubenswrapper[7465]: I0320 08:36:53.445396 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr8br\" (UniqueName: \"kubernetes.io/projected/3de37144-a9ab-45fb-a23f-2287a5198edf-kube-api-access-zr8br\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.445646 master-0 kubenswrapper[7465]: I0320 08:36:53.445415 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-encryption-config\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.446602 master-0 kubenswrapper[7465]: I0320 08:36:53.446574 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-etcd-serving-ca\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.447119 master-0 kubenswrapper[7465]: I0320 08:36:53.447095 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-audit-policies\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.447175 master-0 kubenswrapper[7465]: I0320 08:36:53.447106 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3de37144-a9ab-45fb-a23f-2287a5198edf-audit-dir\") 
pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.448444 master-0 kubenswrapper[7465]: I0320 08:36:53.448406 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-trusted-ca-bundle\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.449318 master-0 kubenswrapper[7465]: I0320 08:36:53.449176 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-encryption-config\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.455495 master-0 kubenswrapper[7465]: I0320 08:36:53.455416 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-serving-cert\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.473130 master-0 kubenswrapper[7465]: I0320 08:36:53.473075 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr8br\" (UniqueName: \"kubernetes.io/projected/3de37144-a9ab-45fb-a23f-2287a5198edf-kube-api-access-zr8br\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.477082 master-0 kubenswrapper[7465]: I0320 08:36:53.476556 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-etcd-client\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:53.602219 master-0 kubenswrapper[7465]: I0320 08:36:53.595519 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:36:54.377275 master-0 kubenswrapper[7465]: I0320 08:36:54.376274 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-9t8x6"] Mar 20 08:36:54.377689 master-0 kubenswrapper[7465]: I0320 08:36:54.377397 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:36:54.379578 master-0 kubenswrapper[7465]: I0320 08:36:54.379548 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 08:36:54.464105 master-0 kubenswrapper[7465]: I0320 08:36:54.464051 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5782718-9118-4682-a287-7998cd0304b3-proxy-tls\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:36:54.464105 master-0 kubenswrapper[7465]: I0320 08:36:54.464118 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f5782718-9118-4682-a287-7998cd0304b3-rootfs\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:36:54.464458 master-0 kubenswrapper[7465]: I0320 08:36:54.464199 7465 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbvtp\" (UniqueName: \"kubernetes.io/projected/f5782718-9118-4682-a287-7998cd0304b3-kube-api-access-bbvtp\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:36:54.464458 master-0 kubenswrapper[7465]: I0320 08:36:54.464223 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5782718-9118-4682-a287-7998cd0304b3-mcd-auth-proxy-config\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:36:54.562171 master-0 kubenswrapper[7465]: I0320 08:36:54.562109 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8105468-9ff4-4b94-aa60-90ad9ebbbba1" path="/var/lib/kubelet/pods/e8105468-9ff4-4b94-aa60-90ad9ebbbba1/volumes" Mar 20 08:36:54.566725 master-0 kubenswrapper[7465]: I0320 08:36:54.565005 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f5782718-9118-4682-a287-7998cd0304b3-rootfs\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:36:54.566846 master-0 kubenswrapper[7465]: I0320 08:36:54.566804 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbvtp\" (UniqueName: \"kubernetes.io/projected/f5782718-9118-4682-a287-7998cd0304b3-kube-api-access-bbvtp\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:36:54.567011 master-0 kubenswrapper[7465]: I0320 08:36:54.566988 7465 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5782718-9118-4682-a287-7998cd0304b3-mcd-auth-proxy-config\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:36:54.567170 master-0 kubenswrapper[7465]: I0320 08:36:54.567105 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5782718-9118-4682-a287-7998cd0304b3-proxy-tls\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:36:54.567560 master-0 kubenswrapper[7465]: I0320 08:36:54.565068 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f5782718-9118-4682-a287-7998cd0304b3-rootfs\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:36:54.569106 master-0 kubenswrapper[7465]: I0320 08:36:54.569065 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5782718-9118-4682-a287-7998cd0304b3-mcd-auth-proxy-config\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:36:54.572809 master-0 kubenswrapper[7465]: I0320 08:36:54.572744 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5782718-9118-4682-a287-7998cd0304b3-proxy-tls\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" 
Mar 20 08:36:54.600280 master-0 kubenswrapper[7465]: I0320 08:36:54.597978 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbvtp\" (UniqueName: \"kubernetes.io/projected/f5782718-9118-4682-a287-7998cd0304b3-kube-api-access-bbvtp\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:36:54.717180 master-0 kubenswrapper[7465]: I0320 08:36:54.717133 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:36:56.840722 master-0 kubenswrapper[7465]: I0320 08:36:56.840576 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 20 08:36:56.841986 master-0 kubenswrapper[7465]: I0320 08:36:56.841959 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 20 08:36:56.844407 master-0 kubenswrapper[7465]: I0320 08:36:56.844293 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 20 08:36:56.849715 master-0 kubenswrapper[7465]: I0320 08:36:56.849664 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 20 08:36:56.916386 master-0 kubenswrapper[7465]: I0320 08:36:56.916309 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4490a747-da2d-4f1a-8986-bc2c1c58424b-var-lock\") pod \"installer-1-master-0\" (UID: \"4490a747-da2d-4f1a-8986-bc2c1c58424b\") " pod="openshift-etcd/installer-1-master-0" Mar 20 08:36:56.916621 master-0 kubenswrapper[7465]: I0320 08:36:56.916413 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4490a747-da2d-4f1a-8986-bc2c1c58424b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4490a747-da2d-4f1a-8986-bc2c1c58424b\") " pod="openshift-etcd/installer-1-master-0" Mar 20 08:36:56.916621 master-0 kubenswrapper[7465]: I0320 08:36:56.916501 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4490a747-da2d-4f1a-8986-bc2c1c58424b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4490a747-da2d-4f1a-8986-bc2c1c58424b\") " pod="openshift-etcd/installer-1-master-0" Mar 20 08:36:57.019332 master-0 kubenswrapper[7465]: I0320 08:36:57.019265 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4490a747-da2d-4f1a-8986-bc2c1c58424b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4490a747-da2d-4f1a-8986-bc2c1c58424b\") " pod="openshift-etcd/installer-1-master-0" Mar 20 08:36:57.019612 master-0 kubenswrapper[7465]: I0320 08:36:57.019348 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4490a747-da2d-4f1a-8986-bc2c1c58424b-var-lock\") pod \"installer-1-master-0\" (UID: \"4490a747-da2d-4f1a-8986-bc2c1c58424b\") " pod="openshift-etcd/installer-1-master-0" Mar 20 08:36:57.019612 master-0 kubenswrapper[7465]: I0320 08:36:57.019397 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4490a747-da2d-4f1a-8986-bc2c1c58424b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4490a747-da2d-4f1a-8986-bc2c1c58424b\") " pod="openshift-etcd/installer-1-master-0" Mar 20 08:36:57.020041 master-0 kubenswrapper[7465]: I0320 08:36:57.020002 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/4490a747-da2d-4f1a-8986-bc2c1c58424b-var-lock\") pod \"installer-1-master-0\" (UID: \"4490a747-da2d-4f1a-8986-bc2c1c58424b\") " pod="openshift-etcd/installer-1-master-0" Mar 20 08:36:57.020105 master-0 kubenswrapper[7465]: I0320 08:36:57.020060 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4490a747-da2d-4f1a-8986-bc2c1c58424b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4490a747-da2d-4f1a-8986-bc2c1c58424b\") " pod="openshift-etcd/installer-1-master-0" Mar 20 08:36:57.048278 master-0 kubenswrapper[7465]: I0320 08:36:57.047585 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"] Mar 20 08:36:57.050422 master-0 kubenswrapper[7465]: I0320 08:36:57.049652 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" podUID="dab97c35-fe60-4134-8715-a7c6dd085fb3" containerName="cluster-version-operator" containerID="cri-o://120e4b0ea2b4cfd16661b5116aa5a9d09dd784f31b0c1dd992cc3cf6ddc463ef" gracePeriod=130 Mar 20 08:36:57.083240 master-0 kubenswrapper[7465]: I0320 08:36:57.083170 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4490a747-da2d-4f1a-8986-bc2c1c58424b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4490a747-da2d-4f1a-8986-bc2c1c58424b\") " pod="openshift-etcd/installer-1-master-0" Mar 20 08:36:57.166056 master-0 kubenswrapper[7465]: I0320 08:36:57.165368 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 20 08:36:57.878981 master-0 kubenswrapper[7465]: I0320 08:36:57.878920 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-v5h69" Mar 20 08:36:58.157760 master-0 kubenswrapper[7465]: I0320 08:36:58.157587 7465 generic.go:334] "Generic (PLEG): container finished" podID="dab97c35-fe60-4134-8715-a7c6dd085fb3" containerID="120e4b0ea2b4cfd16661b5116aa5a9d09dd784f31b0c1dd992cc3cf6ddc463ef" exitCode=0 Mar 20 08:36:58.157760 master-0 kubenswrapper[7465]: I0320 08:36:58.157664 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" event={"ID":"dab97c35-fe60-4134-8715-a7c6dd085fb3","Type":"ContainerDied","Data":"120e4b0ea2b4cfd16661b5116aa5a9d09dd784f31b0c1dd992cc3cf6ddc463ef"} Mar 20 08:36:58.652360 master-0 kubenswrapper[7465]: I0320 08:36:58.651507 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca\") pod \"controller-manager-5dcd9ffc84-qh6pq\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:36:58.652360 master-0 kubenswrapper[7465]: E0320 08:36:58.651783 7465 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 20 08:36:58.652360 master-0 kubenswrapper[7465]: E0320 08:36:58.651908 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca podName:da399c95-96f3-4ea1-bd47-1d6fba7aae8f nodeName:}" failed. No retries permitted until 2026-03-20 08:37:14.651881996 +0000 UTC m=+60.295197486 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca") pod "controller-manager-5dcd9ffc84-qh6pq" (UID: "da399c95-96f3-4ea1-bd47-1d6fba7aae8f") : configmap "client-ca" not found Mar 20 08:36:59.843826 master-0 kubenswrapper[7465]: I0320 08:36:59.843773 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 20 08:36:59.846675 master-0 kubenswrapper[7465]: I0320 08:36:59.846419 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 20 08:36:59.851923 master-0 kubenswrapper[7465]: I0320 08:36:59.851882 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 08:36:59.861413 master-0 kubenswrapper[7465]: I0320 08:36:59.861251 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 20 08:36:59.980778 master-0 kubenswrapper[7465]: I0320 08:36:59.980558 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d17f953e-3ca4-4bd5-ad89-678447774687-var-lock\") pod \"installer-1-master-0\" (UID: \"d17f953e-3ca4-4bd5-ad89-678447774687\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 20 08:36:59.980778 master-0 kubenswrapper[7465]: I0320 08:36:59.980638 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d17f953e-3ca4-4bd5-ad89-678447774687-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d17f953e-3ca4-4bd5-ad89-678447774687\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 20 08:36:59.980778 master-0 kubenswrapper[7465]: I0320 08:36:59.980672 7465 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d17f953e-3ca4-4bd5-ad89-678447774687-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d17f953e-3ca4-4bd5-ad89-678447774687\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 20 08:37:00.082128 master-0 kubenswrapper[7465]: I0320 08:37:00.081987 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d17f953e-3ca4-4bd5-ad89-678447774687-var-lock\") pod \"installer-1-master-0\" (UID: \"d17f953e-3ca4-4bd5-ad89-678447774687\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 20 08:37:00.082128 master-0 kubenswrapper[7465]: I0320 08:37:00.082058 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d17f953e-3ca4-4bd5-ad89-678447774687-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d17f953e-3ca4-4bd5-ad89-678447774687\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 20 08:37:00.082128 master-0 kubenswrapper[7465]: I0320 08:37:00.082154 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d17f953e-3ca4-4bd5-ad89-678447774687-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d17f953e-3ca4-4bd5-ad89-678447774687\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 20 08:37:00.082128 master-0 kubenswrapper[7465]: I0320 08:37:00.082208 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d17f953e-3ca4-4bd5-ad89-678447774687-var-lock\") pod \"installer-1-master-0\" (UID: \"d17f953e-3ca4-4bd5-ad89-678447774687\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 20 08:37:00.083128 master-0 
kubenswrapper[7465]: I0320 08:37:00.082262 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d17f953e-3ca4-4bd5-ad89-678447774687-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d17f953e-3ca4-4bd5-ad89-678447774687\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 20 08:37:00.100804 master-0 kubenswrapper[7465]: I0320 08:37:00.100683 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d17f953e-3ca4-4bd5-ad89-678447774687-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d17f953e-3ca4-4bd5-ad89-678447774687\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 20 08:37:00.225804 master-0 kubenswrapper[7465]: I0320 08:37:00.225723 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 20 08:37:00.619589 master-0 kubenswrapper[7465]: I0320 08:37:00.618496 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq"] Mar 20 08:37:00.619589 master-0 kubenswrapper[7465]: E0320 08:37:00.619349 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" podUID="da399c95-96f3-4ea1-bd47-1d6fba7aae8f" Mar 20 08:37:00.640215 master-0 kubenswrapper[7465]: I0320 08:37:00.638861 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm"] Mar 20 08:37:00.640215 master-0 kubenswrapper[7465]: E0320 08:37:00.639444 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" podUID="0b4262e0-2454-43e1-a9f8-57981354b35b" Mar 20 08:37:00.862315 master-0 kubenswrapper[7465]: I0320 08:37:00.857030 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 20 08:37:00.862315 master-0 kubenswrapper[7465]: I0320 08:37:00.857846 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 20 08:37:00.862315 master-0 kubenswrapper[7465]: I0320 08:37:00.860930 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 20 08:37:00.880779 master-0 kubenswrapper[7465]: I0320 08:37:00.880662 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 20 08:37:01.002884 master-0 kubenswrapper[7465]: I0320 08:37:01.002765 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8682b669-c173-4b96-80f6-029292f5c25b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"8682b669-c173-4b96-80f6-029292f5c25b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 20 08:37:01.003284 master-0 kubenswrapper[7465]: I0320 08:37:01.003126 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8682b669-c173-4b96-80f6-029292f5c25b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"8682b669-c173-4b96-80f6-029292f5c25b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 20 08:37:01.003284 master-0 kubenswrapper[7465]: I0320 08:37:01.003242 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8682b669-c173-4b96-80f6-029292f5c25b-var-lock\") pod 
\"installer-1-master-0\" (UID: \"8682b669-c173-4b96-80f6-029292f5c25b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 20 08:37:01.104981 master-0 kubenswrapper[7465]: I0320 08:37:01.104906 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8682b669-c173-4b96-80f6-029292f5c25b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"8682b669-c173-4b96-80f6-029292f5c25b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 20 08:37:01.104981 master-0 kubenswrapper[7465]: I0320 08:37:01.105013 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8682b669-c173-4b96-80f6-029292f5c25b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"8682b669-c173-4b96-80f6-029292f5c25b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 20 08:37:01.104981 master-0 kubenswrapper[7465]: I0320 08:37:01.105036 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8682b669-c173-4b96-80f6-029292f5c25b-var-lock\") pod \"installer-1-master-0\" (UID: \"8682b669-c173-4b96-80f6-029292f5c25b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 20 08:37:01.105450 master-0 kubenswrapper[7465]: I0320 08:37:01.105158 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8682b669-c173-4b96-80f6-029292f5c25b-var-lock\") pod \"installer-1-master-0\" (UID: \"8682b669-c173-4b96-80f6-029292f5c25b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 20 08:37:01.105778 master-0 kubenswrapper[7465]: I0320 08:37:01.105748 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8682b669-c173-4b96-80f6-029292f5c25b-kubelet-dir\") pod \"installer-1-master-0\" (UID: 
\"8682b669-c173-4b96-80f6-029292f5c25b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 20 08:37:01.128668 master-0 kubenswrapper[7465]: I0320 08:37:01.128525 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8682b669-c173-4b96-80f6-029292f5c25b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"8682b669-c173-4b96-80f6-029292f5c25b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 20 08:37:01.177278 master-0 kubenswrapper[7465]: I0320 08:37:01.177172 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:37:01.177848 master-0 kubenswrapper[7465]: I0320 08:37:01.177829 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:37:01.179541 master-0 kubenswrapper[7465]: I0320 08:37:01.179433 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 20 08:37:01.185655 master-0 kubenswrapper[7465]: I0320 08:37:01.185626 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:37:01.191072 master-0 kubenswrapper[7465]: I0320 08:37:01.191034 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:37:01.307689 master-0 kubenswrapper[7465]: I0320 08:37:01.307591 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-serving-cert\") pod \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " Mar 20 08:37:01.307689 master-0 kubenswrapper[7465]: I0320 08:37:01.307677 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-config\") pod \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " Mar 20 08:37:01.308017 master-0 kubenswrapper[7465]: I0320 08:37:01.307763 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-proxy-ca-bundles\") pod \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " Mar 20 08:37:01.308017 master-0 kubenswrapper[7465]: I0320 08:37:01.307806 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lk55\" (UniqueName: \"kubernetes.io/projected/0b4262e0-2454-43e1-a9f8-57981354b35b-kube-api-access-7lk55\") pod \"0b4262e0-2454-43e1-a9f8-57981354b35b\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " Mar 20 08:37:01.308017 master-0 kubenswrapper[7465]: I0320 08:37:01.307836 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b4262e0-2454-43e1-a9f8-57981354b35b-serving-cert\") pod \"0b4262e0-2454-43e1-a9f8-57981354b35b\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " Mar 20 08:37:01.308017 master-0 kubenswrapper[7465]: I0320 08:37:01.307885 7465 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-config\") pod \"0b4262e0-2454-43e1-a9f8-57981354b35b\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " Mar 20 08:37:01.308017 master-0 kubenswrapper[7465]: I0320 08:37:01.307957 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn9xv\" (UniqueName: \"kubernetes.io/projected/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-kube-api-access-cn9xv\") pod \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\" (UID: \"da399c95-96f3-4ea1-bd47-1d6fba7aae8f\") " Mar 20 08:37:01.308243 master-0 kubenswrapper[7465]: I0320 08:37:01.308170 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:37:01.308501 master-0 kubenswrapper[7465]: I0320 08:37:01.308397 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "da399c95-96f3-4ea1-bd47-1d6fba7aae8f" (UID: "da399c95-96f3-4ea1-bd47-1d6fba7aae8f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:01.309093 master-0 kubenswrapper[7465]: I0320 08:37:01.308725 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-config" (OuterVolumeSpecName: "config") pod "da399c95-96f3-4ea1-bd47-1d6fba7aae8f" (UID: "da399c95-96f3-4ea1-bd47-1d6fba7aae8f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:01.309776 master-0 kubenswrapper[7465]: I0320 08:37:01.309728 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca\") pod \"route-controller-manager-74cf48bcc6-kwngm\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:37:01.309828 master-0 kubenswrapper[7465]: I0320 08:37:01.309758 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-config" (OuterVolumeSpecName: "config") pod "0b4262e0-2454-43e1-a9f8-57981354b35b" (UID: "0b4262e0-2454-43e1-a9f8-57981354b35b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:01.312130 master-0 kubenswrapper[7465]: I0320 08:37:01.312087 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b4262e0-2454-43e1-a9f8-57981354b35b-kube-api-access-7lk55" (OuterVolumeSpecName: "kube-api-access-7lk55") pod "0b4262e0-2454-43e1-a9f8-57981354b35b" (UID: "0b4262e0-2454-43e1-a9f8-57981354b35b"). InnerVolumeSpecName "kube-api-access-7lk55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:01.312296 master-0 kubenswrapper[7465]: I0320 08:37:01.312271 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "da399c95-96f3-4ea1-bd47-1d6fba7aae8f" (UID: "da399c95-96f3-4ea1-bd47-1d6fba7aae8f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:37:01.313605 master-0 kubenswrapper[7465]: I0320 08:37:01.313537 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-kube-api-access-cn9xv" (OuterVolumeSpecName: "kube-api-access-cn9xv") pod "da399c95-96f3-4ea1-bd47-1d6fba7aae8f" (UID: "da399c95-96f3-4ea1-bd47-1d6fba7aae8f"). InnerVolumeSpecName "kube-api-access-cn9xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:01.325254 master-0 kubenswrapper[7465]: I0320 08:37:01.325220 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b4262e0-2454-43e1-a9f8-57981354b35b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b4262e0-2454-43e1-a9f8-57981354b35b" (UID: "0b4262e0-2454-43e1-a9f8-57981354b35b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:37:01.411084 master-0 kubenswrapper[7465]: I0320 08:37:01.410954 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca\") pod \"0b4262e0-2454-43e1-a9f8-57981354b35b\" (UID: \"0b4262e0-2454-43e1-a9f8-57981354b35b\") " Mar 20 08:37:01.411443 master-0 kubenswrapper[7465]: I0320 08:37:01.411255 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lk55\" (UniqueName: \"kubernetes.io/projected/0b4262e0-2454-43e1-a9f8-57981354b35b-kube-api-access-7lk55\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:01.411443 master-0 kubenswrapper[7465]: I0320 08:37:01.411273 7465 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b4262e0-2454-43e1-a9f8-57981354b35b-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:01.411443 master-0 kubenswrapper[7465]: I0320 08:37:01.411286 7465 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:01.411443 master-0 kubenswrapper[7465]: I0320 08:37:01.411296 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn9xv\" (UniqueName: \"kubernetes.io/projected/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-kube-api-access-cn9xv\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:01.411443 master-0 kubenswrapper[7465]: I0320 08:37:01.411306 7465 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:01.411443 master-0 kubenswrapper[7465]: I0320 08:37:01.411317 7465 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:01.411443 master-0 kubenswrapper[7465]: I0320 08:37:01.411330 7465 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:01.412113 master-0 kubenswrapper[7465]: I0320 08:37:01.411732 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca" (OuterVolumeSpecName: "client-ca") pod "0b4262e0-2454-43e1-a9f8-57981354b35b" (UID: "0b4262e0-2454-43e1-a9f8-57981354b35b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:01.513948 master-0 kubenswrapper[7465]: I0320 08:37:01.513848 7465 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b4262e0-2454-43e1-a9f8-57981354b35b-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:01.887847 master-0 kubenswrapper[7465]: I0320 08:37:01.887773 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:37:02.022141 master-0 kubenswrapper[7465]: I0320 08:37:02.021985 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-ssl-certs\") pod \"dab97c35-fe60-4134-8715-a7c6dd085fb3\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " Mar 20 08:37:02.022141 master-0 kubenswrapper[7465]: I0320 08:37:02.022048 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab97c35-fe60-4134-8715-a7c6dd085fb3-service-ca\") pod \"dab97c35-fe60-4134-8715-a7c6dd085fb3\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " Mar 20 08:37:02.022141 master-0 kubenswrapper[7465]: I0320 08:37:02.022105 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") pod \"dab97c35-fe60-4134-8715-a7c6dd085fb3\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " Mar 20 08:37:02.022289 master-0 kubenswrapper[7465]: I0320 08:37:02.022155 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dab97c35-fe60-4134-8715-a7c6dd085fb3-kube-api-access\") pod \"dab97c35-fe60-4134-8715-a7c6dd085fb3\" (UID: 
\"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " Mar 20 08:37:02.022289 master-0 kubenswrapper[7465]: I0320 08:37:02.022163 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "dab97c35-fe60-4134-8715-a7c6dd085fb3" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3"). InnerVolumeSpecName "etc-ssl-certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:02.022289 master-0 kubenswrapper[7465]: I0320 08:37:02.022253 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-cvo-updatepayloads\") pod \"dab97c35-fe60-4134-8715-a7c6dd085fb3\" (UID: \"dab97c35-fe60-4134-8715-a7c6dd085fb3\") " Mar 20 08:37:02.022678 master-0 kubenswrapper[7465]: I0320 08:37:02.022636 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "dab97c35-fe60-4134-8715-a7c6dd085fb3" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3"). InnerVolumeSpecName "etc-cvo-updatepayloads". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:02.023244 master-0 kubenswrapper[7465]: I0320 08:37:02.023162 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab97c35-fe60-4134-8715-a7c6dd085fb3-service-ca" (OuterVolumeSpecName: "service-ca") pod "dab97c35-fe60-4134-8715-a7c6dd085fb3" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:02.023365 master-0 kubenswrapper[7465]: I0320 08:37:02.023336 7465 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-ssl-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:02.028473 master-0 kubenswrapper[7465]: I0320 08:37:02.027561 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab97c35-fe60-4134-8715-a7c6dd085fb3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dab97c35-fe60-4134-8715-a7c6dd085fb3" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:02.028473 master-0 kubenswrapper[7465]: I0320 08:37:02.028402 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dab97c35-fe60-4134-8715-a7c6dd085fb3" (UID: "dab97c35-fe60-4134-8715-a7c6dd085fb3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:37:02.044759 master-0 kubenswrapper[7465]: W0320 08:37:02.044688 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5782718_9118_4682_a287_7998cd0304b3.slice/crio-48eeec4a0ddb4a6c0b04997cccaac081db5581507160a9804ec767ba66cbdb34 WatchSource:0}: Error finding container 48eeec4a0ddb4a6c0b04997cccaac081db5581507160a9804ec767ba66cbdb34: Status 404 returned error can't find the container with id 48eeec4a0ddb4a6c0b04997cccaac081db5581507160a9804ec767ba66cbdb34 Mar 20 08:37:02.125041 master-0 kubenswrapper[7465]: I0320 08:37:02.124994 7465 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dab97c35-fe60-4134-8715-a7c6dd085fb3-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:02.125041 master-0 kubenswrapper[7465]: I0320 08:37:02.125040 7465 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dab97c35-fe60-4134-8715-a7c6dd085fb3-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:02.125155 master-0 kubenswrapper[7465]: I0320 08:37:02.125050 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dab97c35-fe60-4134-8715-a7c6dd085fb3-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:02.125155 master-0 kubenswrapper[7465]: I0320 08:37:02.125066 7465 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dab97c35-fe60-4134-8715-a7c6dd085fb3-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:02.186250 master-0 kubenswrapper[7465]: I0320 08:37:02.186152 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" 
event={"ID":"f5782718-9118-4682-a287-7998cd0304b3","Type":"ContainerStarted","Data":"48eeec4a0ddb4a6c0b04997cccaac081db5581507160a9804ec767ba66cbdb34"} Mar 20 08:37:02.189638 master-0 kubenswrapper[7465]: I0320 08:37:02.189609 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq" Mar 20 08:37:02.190144 master-0 kubenswrapper[7465]: I0320 08:37:02.190071 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" Mar 20 08:37:02.190677 master-0 kubenswrapper[7465]: I0320 08:37:02.190546 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg" event={"ID":"dab97c35-fe60-4134-8715-a7c6dd085fb3","Type":"ContainerDied","Data":"37362c35c821b752ac43451f735aec18a90fc1833fb68a65ce307045393dec48"} Mar 20 08:37:02.190677 master-0 kubenswrapper[7465]: I0320 08:37:02.190638 7465 scope.go:117] "RemoveContainer" containerID="120e4b0ea2b4cfd16661b5116aa5a9d09dd784f31b0c1dd992cc3cf6ddc463ef" Mar 20 08:37:02.191251 master-0 kubenswrapper[7465]: I0320 08:37:02.190901 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm" Mar 20 08:37:02.644675 master-0 kubenswrapper[7465]: I0320 08:37:02.644615 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 20 08:37:02.657511 master-0 kubenswrapper[7465]: I0320 08:37:02.649988 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 20 08:37:02.665944 master-0 kubenswrapper[7465]: I0320 08:37:02.665161 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2"] Mar 20 08:37:02.732300 master-0 kubenswrapper[7465]: I0320 08:37:02.724236 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq"] Mar 20 08:37:02.736438 master-0 kubenswrapper[7465]: I0320 08:37:02.733111 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5dcd9ffc84-qh6pq"] Mar 20 08:37:02.811560 master-0 kubenswrapper[7465]: I0320 08:37:02.811476 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"] Mar 20 08:37:02.833461 master-0 kubenswrapper[7465]: I0320 08:37:02.830178 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-9gfkg"] Mar 20 08:37:02.856305 master-0 kubenswrapper[7465]: I0320 08:37:02.843642 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 20 08:37:02.856305 master-0 kubenswrapper[7465]: I0320 08:37:02.846017 7465 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/da399c95-96f3-4ea1-bd47-1d6fba7aae8f-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:02.884468 master-0 kubenswrapper[7465]: I0320 
08:37:02.879150 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm"] Mar 20 08:37:02.911971 master-0 kubenswrapper[7465]: I0320 08:37:02.906950 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74cf48bcc6-kwngm"] Mar 20 08:37:02.946523 master-0 kubenswrapper[7465]: I0320 08:37:02.946042 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h"] Mar 20 08:37:02.947577 master-0 kubenswrapper[7465]: E0320 08:37:02.947285 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab97c35-fe60-4134-8715-a7c6dd085fb3" containerName="cluster-version-operator" Mar 20 08:37:02.947577 master-0 kubenswrapper[7465]: I0320 08:37:02.947315 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab97c35-fe60-4134-8715-a7c6dd085fb3" containerName="cluster-version-operator" Mar 20 08:37:02.947577 master-0 kubenswrapper[7465]: I0320 08:37:02.947431 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab97c35-fe60-4134-8715-a7c6dd085fb3" containerName="cluster-version-operator" Mar 20 08:37:02.948356 master-0 kubenswrapper[7465]: I0320 08:37:02.948224 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:02.960940 master-0 kubenswrapper[7465]: W0320 08:37:02.958830 7465 reflector.go:561] object-"openshift-cluster-version"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'master-0' and this object Mar 20 08:37:02.960940 master-0 kubenswrapper[7465]: E0320 08:37:02.958932 7465 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 20 08:37:02.960940 master-0 kubenswrapper[7465]: W0320 08:37:02.959081 7465 reflector.go:561] object-"openshift-cluster-version"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'master-0' and this object Mar 20 08:37:02.960940 master-0 kubenswrapper[7465]: E0320 08:37:02.959101 7465 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 20 08:37:02.960940 master-0 
kubenswrapper[7465]: W0320 08:37:02.959143 7465 reflector.go:561] object-"openshift-cluster-version"/"cluster-version-operator-serving-cert": failed to list *v1.Secret: secrets "cluster-version-operator-serving-cert" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-version": no relationship found between node 'master-0' and this object Mar 20 08:37:02.960940 master-0 kubenswrapper[7465]: E0320 08:37:02.959159 7465 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-version\"/\"cluster-version-operator-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cluster-version-operator-serving-cert\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-version\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 20 08:37:03.051236 master-0 kubenswrapper[7465]: I0320 08:37:03.050423 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1375da42-ecaf-4d86-b554-25fd1c3d00bd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:03.051236 master-0 kubenswrapper[7465]: I0320 08:37:03.050467 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1375da42-ecaf-4d86-b554-25fd1c3d00bd-serving-cert\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:03.051236 master-0 kubenswrapper[7465]: I0320 08:37:03.050515 7465 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1375da42-ecaf-4d86-b554-25fd1c3d00bd-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:03.051236 master-0 kubenswrapper[7465]: I0320 08:37:03.050538 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1375da42-ecaf-4d86-b554-25fd1c3d00bd-kube-api-access\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:03.051236 master-0 kubenswrapper[7465]: I0320 08:37:03.050558 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1375da42-ecaf-4d86-b554-25fd1c3d00bd-service-ca\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:03.161087 master-0 kubenswrapper[7465]: I0320 08:37:03.161041 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1375da42-ecaf-4d86-b554-25fd1c3d00bd-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:03.161235 master-0 kubenswrapper[7465]: I0320 08:37:03.161098 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1375da42-ecaf-4d86-b554-25fd1c3d00bd-kube-api-access\") pod 
\"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:03.161235 master-0 kubenswrapper[7465]: I0320 08:37:03.161119 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1375da42-ecaf-4d86-b554-25fd1c3d00bd-service-ca\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:03.161235 master-0 kubenswrapper[7465]: I0320 08:37:03.161165 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1375da42-ecaf-4d86-b554-25fd1c3d00bd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:03.161370 master-0 kubenswrapper[7465]: I0320 08:37:03.161309 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1375da42-ecaf-4d86-b554-25fd1c3d00bd-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:03.161458 master-0 kubenswrapper[7465]: I0320 08:37:03.161437 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1375da42-ecaf-4d86-b554-25fd1c3d00bd-serving-cert\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:03.161696 master-0 kubenswrapper[7465]: I0320 08:37:03.161675 
7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1375da42-ecaf-4d86-b554-25fd1c3d00bd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:03.213150 master-0 kubenswrapper[7465]: I0320 08:37:03.213075 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"8682b669-c173-4b96-80f6-029292f5c25b","Type":"ContainerStarted","Data":"a7a94acdb1ced20f1398af7489166fd2b70ae13922fdab81533d23a1b96c7db0"} Mar 20 08:37:03.225250 master-0 kubenswrapper[7465]: I0320 08:37:03.222698 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"d17f953e-3ca4-4bd5-ad89-678447774687","Type":"ContainerStarted","Data":"4e6b04ca634ce1b5886b7da997d2a8569817c1b864a82899979971c78a3d51e1"} Mar 20 08:37:03.233934 master-0 kubenswrapper[7465]: I0320 08:37:03.233686 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-srdjm" event={"ID":"813f91c2-2b37-4681-968d-4217e286e22f","Type":"ContainerStarted","Data":"0cdda3b89325b61f030fe62f7e1e40dae9fd6495c82df19791b581b4f2a2b2bd"} Mar 20 08:37:03.238901 master-0 kubenswrapper[7465]: I0320 08:37:03.238647 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-779f85678d-lrzfz" event={"ID":"46de2acc-9f5d-4ecf-befe-a480f86466f5","Type":"ContainerStarted","Data":"6cd834e174ccc442576b15f26c10970d3f2f599cb7a1f56492db3a3174d18af6"} Mar 20 08:37:03.247023 master-0 kubenswrapper[7465]: I0320 08:37:03.246817 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" 
event={"ID":"45e8b72b-564c-4bb1-b911-baff2d6c87ad","Type":"ContainerStarted","Data":"6af5a2f2427206da008543d3c2e9de1d09b1789d70a831c73c81aa5b4993f15e"} Mar 20 08:37:03.259421 master-0 kubenswrapper[7465]: I0320 08:37:03.259376 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" event={"ID":"f5782718-9118-4682-a287-7998cd0304b3","Type":"ContainerStarted","Data":"4100829f793cce2954b64f7463941089e6db3bf46fc83040646900405ad68496"} Mar 20 08:37:03.278463 master-0 kubenswrapper[7465]: I0320 08:37:03.278358 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-779f85678d-lrzfz" podStartSLOduration=18.330896376 podStartE2EDuration="23.278332467s" podCreationTimestamp="2026-03-20 08:36:40 +0000 UTC" firstStartedPulling="2026-03-20 08:36:46.355987217 +0000 UTC m=+31.999302707" lastFinishedPulling="2026-03-20 08:36:51.303423308 +0000 UTC m=+36.946738798" observedRunningTime="2026-03-20 08:37:03.274814674 +0000 UTC m=+48.918130184" watchObservedRunningTime="2026-03-20 08:37:03.278332467 +0000 UTC m=+48.921647957" Mar 20 08:37:03.398213 master-0 kubenswrapper[7465]: I0320 08:37:03.393625 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" event={"ID":"acb704a9-6c8d-4378-ae93-e7095b1fce85","Type":"ContainerStarted","Data":"4edf8229d20f78af8d6c9bd895781409ea67ff831e7f63eef258c5dd26b7d38d"} Mar 20 08:37:03.398213 master-0 kubenswrapper[7465]: I0320 08:37:03.394109 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:37:03.422369 master-0 kubenswrapper[7465]: I0320 08:37:03.413645 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" podStartSLOduration=9.413604925 podStartE2EDuration="9.413604925s" 
podCreationTimestamp="2026-03-20 08:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:37:03.305054077 +0000 UTC m=+48.948369567" watchObservedRunningTime="2026-03-20 08:37:03.413604925 +0000 UTC m=+49.056920425" Mar 20 08:37:03.423179 master-0 kubenswrapper[7465]: I0320 08:37:03.423109 7465 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-mvn4t container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" start-of-body= Mar 20 08:37:03.423372 master-0 kubenswrapper[7465]: I0320 08:37:03.423228 7465 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" podUID="acb704a9-6c8d-4378-ae93-e7095b1fce85" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" Mar 20 08:37:03.473989 master-0 kubenswrapper[7465]: I0320 08:37:03.473935 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" event={"ID":"f53bc282-5937-49ac-ac98-2ee37ccb268d","Type":"ContainerStarted","Data":"20027ed3d8ce2945b58e2ed2edbfa6fa2b33157326dd7e152264af390a255b26"} Mar 20 08:37:03.474103 master-0 kubenswrapper[7465]: I0320 08:37:03.473995 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" event={"ID":"f53bc282-5937-49ac-ac98-2ee37ccb268d","Type":"ContainerStarted","Data":"84b569249d8110b2d1ae9f2ad68f8d09c1249f9b0523e1c444309cb25f725be7"} Mar 20 08:37:03.556281 master-0 kubenswrapper[7465]: I0320 08:37:03.556128 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" 
event={"ID":"6a80bd6f-2263-4251-8197-5173193f8afc","Type":"ContainerStarted","Data":"199be5e4e7ebbf5d512a41b53ac83599e0d0e19351865228cfbac1cc0f5d6aa8"} Mar 20 08:37:03.562585 master-0 kubenswrapper[7465]: I0320 08:37:03.562305 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" event={"ID":"3de37144-a9ab-45fb-a23f-2287a5198edf","Type":"ContainerStarted","Data":"02b8b46e9f6cf48ded279c24ec1e51a94bbe25b122e72584be4a8549a6a9d74b"} Mar 20 08:37:03.570754 master-0 kubenswrapper[7465]: I0320 08:37:03.570547 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"4490a747-da2d-4f1a-8986-bc2c1c58424b","Type":"ContainerStarted","Data":"0aa1305a973a71f928c142131df579b42fa3e776fd7926a4aa71bddb2c85fcba"} Mar 20 08:37:04.136007 master-0 kubenswrapper[7465]: I0320 08:37:04.135773 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 08:37:04.143744 master-0 kubenswrapper[7465]: I0320 08:37:04.143698 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1375da42-ecaf-4d86-b554-25fd1c3d00bd-service-ca\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:04.144617 master-0 kubenswrapper[7465]: I0320 08:37:04.144579 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 08:37:04.155763 master-0 kubenswrapper[7465]: I0320 08:37:04.155705 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1375da42-ecaf-4d86-b554-25fd1c3d00bd-serving-cert\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: 
\"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:04.188368 master-0 kubenswrapper[7465]: E0320 08:37:04.187687 7465 projected.go:288] Couldn't get configMap openshift-cluster-version/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:37:04.188368 master-0 kubenswrapper[7465]: E0320 08:37:04.187741 7465 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:37:04.188368 master-0 kubenswrapper[7465]: E0320 08:37:04.187832 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1375da42-ecaf-4d86-b554-25fd1c3d00bd-kube-api-access podName:1375da42-ecaf-4d86-b554-25fd1c3d00bd nodeName:}" failed. No retries permitted until 2026-03-20 08:37:04.687806548 +0000 UTC m=+50.331122038 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/1375da42-ecaf-4d86-b554-25fd1c3d00bd-kube-api-access") pod "cluster-version-operator-7d58488df-qmm8h" (UID: "1375da42-ecaf-4d86-b554-25fd1c3d00bd") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:37:04.506624 master-0 kubenswrapper[7465]: I0320 08:37:04.506454 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 08:37:04.547425 master-0 kubenswrapper[7465]: I0320 08:37:04.547169 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b4262e0-2454-43e1-a9f8-57981354b35b" path="/var/lib/kubelet/pods/0b4262e0-2454-43e1-a9f8-57981354b35b/volumes" Mar 20 08:37:04.547982 master-0 kubenswrapper[7465]: I0320 08:37:04.547936 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da399c95-96f3-4ea1-bd47-1d6fba7aae8f" path="/var/lib/kubelet/pods/da399c95-96f3-4ea1-bd47-1d6fba7aae8f/volumes" Mar 20 08:37:04.548746 master-0 kubenswrapper[7465]: I0320 08:37:04.548700 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab97c35-fe60-4134-8715-a7c6dd085fb3" path="/var/lib/kubelet/pods/dab97c35-fe60-4134-8715-a7c6dd085fb3/volumes" Mar 20 08:37:04.581352 master-0 kubenswrapper[7465]: I0320 08:37:04.581007 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"8682b669-c173-4b96-80f6-029292f5c25b","Type":"ContainerStarted","Data":"7dcd8a99fc665575842ace97eb1ad322b466b3c9cd44a9d54ad2389140cfb2bc"} Mar 20 08:37:04.584676 master-0 kubenswrapper[7465]: I0320 08:37:04.584623 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" event={"ID":"7b489385-2c96-4a97-8b31-362162de020e","Type":"ContainerStarted","Data":"efd6836c5e507ec16e6e082bc5946a6c45ff929a136363cbf3994fcefbdc7906"} Mar 20 
08:37:04.584871 master-0 kubenswrapper[7465]: I0320 08:37:04.584853 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:37:04.588482 master-0 kubenswrapper[7465]: I0320 08:37:04.587993 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" event={"ID":"f5782718-9118-4682-a287-7998cd0304b3","Type":"ContainerStarted","Data":"b0cb9210b4b2bf3cfe59c44d3722ebab536e55adf6ae14c57d5a60f0e9fe993b"} Mar 20 08:37:04.591894 master-0 kubenswrapper[7465]: I0320 08:37:04.591785 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:37:04.592493 master-0 kubenswrapper[7465]: I0320 08:37:04.592454 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" event={"ID":"6a80bd6f-2263-4251-8197-5173193f8afc","Type":"ContainerStarted","Data":"d02eea2683ac6efae0f70fd8f717e97bc121a80e06da8071bf5f8c5bd619e9f0"} Mar 20 08:37:04.599458 master-0 kubenswrapper[7465]: I0320 08:37:04.598630 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"4490a747-da2d-4f1a-8986-bc2c1c58424b","Type":"ContainerStarted","Data":"0521d9515acccdbef13de273c2fd3fc8c0c08193b40755e745ddfeeb3789e32d"} Mar 20 08:37:04.601255 master-0 kubenswrapper[7465]: I0320 08:37:04.601170 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" event={"ID":"df428d5a-c722-4536-8e7f-cdd85c560481","Type":"ContainerStarted","Data":"514dbe166dec5b1f878e0f4a5bf082ca6bf2afdda7b02979cf775c8d27e07456"} Mar 20 08:37:04.601900 master-0 kubenswrapper[7465]: I0320 08:37:04.601871 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:37:04.603474 master-0 kubenswrapper[7465]: I0320 08:37:04.603423 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"d17f953e-3ca4-4bd5-ad89-678447774687","Type":"ContainerStarted","Data":"e11ff3b52271d029d42cce254ad48e40793e3cdb6d379d24178958fc78484abf"} Mar 20 08:37:04.606306 master-0 kubenswrapper[7465]: I0320 08:37:04.606246 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-srdjm" event={"ID":"813f91c2-2b37-4681-968d-4217e286e22f","Type":"ContainerStarted","Data":"8b504057c998514e9a6f75544fd4b2e6f3e06b14334afca0cc280a0d4b21513a"} Mar 20 08:37:04.609147 master-0 kubenswrapper[7465]: I0320 08:37:04.609094 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:37:04.609691 master-0 kubenswrapper[7465]: I0320 08:37:04.609642 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" event={"ID":"bbc0b783-28d5-4554-b49d-c66082546f44","Type":"ContainerStarted","Data":"7ee99faecdaa8ce9ade5aaa3b49dd8416a312e96db798b1de9fced997f6fd077"} Mar 20 08:37:04.609987 master-0 kubenswrapper[7465]: I0320 08:37:04.609744 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:37:04.612062 master-0 kubenswrapper[7465]: I0320 08:37:04.612019 7465 generic.go:334] "Generic (PLEG): container finished" podID="3de37144-a9ab-45fb-a23f-2287a5198edf" containerID="078946765e0bcafd3c39a471f72aabe9c5152a4c66ba4a584be214e5cb42544f" exitCode=0 Mar 20 08:37:04.613458 master-0 kubenswrapper[7465]: I0320 08:37:04.613427 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" event={"ID":"3de37144-a9ab-45fb-a23f-2287a5198edf","Type":"ContainerDied","Data":"078946765e0bcafd3c39a471f72aabe9c5152a4c66ba4a584be214e5cb42544f"} Mar 20 08:37:04.619524 master-0 kubenswrapper[7465]: I0320 08:37:04.619467 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:37:04.704696 master-0 kubenswrapper[7465]: I0320 08:37:04.704353 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1375da42-ecaf-4d86-b554-25fd1c3d00bd-kube-api-access\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:04.714118 master-0 kubenswrapper[7465]: I0320 08:37:04.712160 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1375da42-ecaf-4d86-b554-25fd1c3d00bd-kube-api-access\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:04.714118 master-0 kubenswrapper[7465]: I0320 08:37:04.713476 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=4.713450568 podStartE2EDuration="4.713450568s" podCreationTimestamp="2026-03-20 08:37:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:37:04.707511965 +0000 UTC m=+50.350827465" watchObservedRunningTime="2026-03-20 08:37:04.713450568 +0000 UTC m=+50.356766058" Mar 20 08:37:04.720242 master-0 kubenswrapper[7465]: I0320 08:37:04.719330 7465 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-srjqw"] Mar 20 08:37:04.723094 master-0 kubenswrapper[7465]: I0320 08:37:04.723035 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:04.726094 master-0 kubenswrapper[7465]: I0320 08:37:04.725910 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-srjqw"] Mar 20 08:37:04.839689 master-0 kubenswrapper[7465]: I0320 08:37:04.839625 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:37:04.882354 master-0 kubenswrapper[7465]: I0320 08:37:04.878011 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=5.87798937 podStartE2EDuration="5.87798937s" podCreationTimestamp="2026-03-20 08:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:37:04.836907571 +0000 UTC m=+50.480223061" watchObservedRunningTime="2026-03-20 08:37:04.87798937 +0000 UTC m=+50.521304860" Mar 20 08:37:04.909138 master-0 kubenswrapper[7465]: I0320 08:37:04.909080 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956a8ee3-2618-4f04-8d1c-97c3a4595c94-catalog-content\") pod \"redhat-marketplace-srjqw\" (UID: \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\") " pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:04.909257 master-0 kubenswrapper[7465]: I0320 08:37:04.909143 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgmtk\" (UniqueName: \"kubernetes.io/projected/956a8ee3-2618-4f04-8d1c-97c3a4595c94-kube-api-access-xgmtk\") 
pod \"redhat-marketplace-srjqw\" (UID: \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\") " pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:04.909257 master-0 kubenswrapper[7465]: I0320 08:37:04.909203 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956a8ee3-2618-4f04-8d1c-97c3a4595c94-utilities\") pod \"redhat-marketplace-srjqw\" (UID: \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\") " pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:05.013014 master-0 kubenswrapper[7465]: I0320 08:37:05.010898 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956a8ee3-2618-4f04-8d1c-97c3a4595c94-catalog-content\") pod \"redhat-marketplace-srjqw\" (UID: \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\") " pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:05.013014 master-0 kubenswrapper[7465]: I0320 08:37:05.010951 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgmtk\" (UniqueName: \"kubernetes.io/projected/956a8ee3-2618-4f04-8d1c-97c3a4595c94-kube-api-access-xgmtk\") pod \"redhat-marketplace-srjqw\" (UID: \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\") " pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:05.013014 master-0 kubenswrapper[7465]: I0320 08:37:05.010979 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956a8ee3-2618-4f04-8d1c-97c3a4595c94-utilities\") pod \"redhat-marketplace-srjqw\" (UID: \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\") " pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:05.013014 master-0 kubenswrapper[7465]: I0320 08:37:05.011696 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/956a8ee3-2618-4f04-8d1c-97c3a4595c94-utilities\") pod \"redhat-marketplace-srjqw\" (UID: \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\") " pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:05.013014 master-0 kubenswrapper[7465]: I0320 08:37:05.012025 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=9.011998841 podStartE2EDuration="9.011998841s" podCreationTimestamp="2026-03-20 08:36:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:37:05.011070314 +0000 UTC m=+50.654385804" watchObservedRunningTime="2026-03-20 08:37:05.011998841 +0000 UTC m=+50.655314321" Mar 20 08:37:05.015819 master-0 kubenswrapper[7465]: I0320 08:37:05.015761 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956a8ee3-2618-4f04-8d1c-97c3a4595c94-catalog-content\") pod \"redhat-marketplace-srjqw\" (UID: \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\") " pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:05.056063 master-0 kubenswrapper[7465]: I0320 08:37:05.056016 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgmtk\" (UniqueName: \"kubernetes.io/projected/956a8ee3-2618-4f04-8d1c-97c3a4595c94-kube-api-access-xgmtk\") pod \"redhat-marketplace-srjqw\" (UID: \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\") " pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:05.294242 master-0 kubenswrapper[7465]: I0320 08:37:05.294197 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mgcb9"] Mar 20 08:37:05.295296 master-0 kubenswrapper[7465]: I0320 08:37:05.295265 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mgcb9" Mar 20 08:37:05.315448 master-0 kubenswrapper[7465]: I0320 08:37:05.315395 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mgcb9"] Mar 20 08:37:05.355308 master-0 kubenswrapper[7465]: I0320 08:37:05.353659 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:05.414918 master-0 kubenswrapper[7465]: I0320 08:37:05.414821 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"] Mar 20 08:37:05.415616 master-0 kubenswrapper[7465]: I0320 08:37:05.415593 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6486d766f9-5b77h"] Mar 20 08:37:05.416029 master-0 kubenswrapper[7465]: I0320 08:37:05.415998 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h" Mar 20 08:37:05.419568 master-0 kubenswrapper[7465]: I0320 08:37:05.419157 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.420909 master-0 kubenswrapper[7465]: I0320 08:37:05.420848 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 08:37:05.426193 master-0 kubenswrapper[7465]: I0320 08:37:05.425535 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06159643-e9b8-417f-ba79-ee1e1ca5c951-utilities\") pod \"redhat-operators-mgcb9\" (UID: \"06159643-e9b8-417f-ba79-ee1e1ca5c951\") " pod="openshift-marketplace/redhat-operators-mgcb9"
Mar 20 08:37:05.426193 master-0 kubenswrapper[7465]: I0320 08:37:05.425690 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06159643-e9b8-417f-ba79-ee1e1ca5c951-catalog-content\") pod \"redhat-operators-mgcb9\" (UID: \"06159643-e9b8-417f-ba79-ee1e1ca5c951\") " pod="openshift-marketplace/redhat-operators-mgcb9"
Mar 20 08:37:05.426193 master-0 kubenswrapper[7465]: I0320 08:37:05.425724 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s9bz\" (UniqueName: \"kubernetes.io/projected/06159643-e9b8-417f-ba79-ee1e1ca5c951-kube-api-access-8s9bz\") pod \"redhat-operators-mgcb9\" (UID: \"06159643-e9b8-417f-ba79-ee1e1ca5c951\") " pod="openshift-marketplace/redhat-operators-mgcb9"
Mar 20 08:37:05.426749 master-0 kubenswrapper[7465]: I0320 08:37:05.426719 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 08:37:05.427008 master-0 kubenswrapper[7465]: I0320 08:37:05.426989 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 08:37:05.427275 master-0 kubenswrapper[7465]: I0320 08:37:05.427197 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 08:37:05.427338 master-0 kubenswrapper[7465]: I0320 08:37:05.427315 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 08:37:05.428006 master-0 kubenswrapper[7465]: I0320 08:37:05.427486 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 08:37:05.430390 master-0 kubenswrapper[7465]: I0320 08:37:05.430353 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"]
Mar 20 08:37:05.432657 master-0 kubenswrapper[7465]: I0320 08:37:05.432632 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 08:37:05.433729 master-0 kubenswrapper[7465]: I0320 08:37:05.433702 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 08:37:05.433913 master-0 kubenswrapper[7465]: I0320 08:37:05.433880 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 08:37:05.434056 master-0 kubenswrapper[7465]: I0320 08:37:05.434028 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 08:37:05.438911 master-0 kubenswrapper[7465]: I0320 08:37:05.438832 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 08:37:05.449388 master-0 kubenswrapper[7465]: I0320 08:37:05.447370 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6486d766f9-5b77h"]
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.526769 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e180bf9a-03f7-405b-90c3-b2e46008213e-serving-cert\") pod \"route-controller-manager-7ffc895647-6j97v\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.527351 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk7cs\" (UniqueName: \"kubernetes.io/projected/e180bf9a-03f7-405b-90c3-b2e46008213e-kube-api-access-zk7cs\") pod \"route-controller-manager-7ffc895647-6j97v\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.527383 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e180bf9a-03f7-405b-90c3-b2e46008213e-client-ca\") pod \"route-controller-manager-7ffc895647-6j97v\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.527418 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06159643-e9b8-417f-ba79-ee1e1ca5c951-catalog-content\") pod \"redhat-operators-mgcb9\" (UID: \"06159643-e9b8-417f-ba79-ee1e1ca5c951\") " pod="openshift-marketplace/redhat-operators-mgcb9"
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.527445 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-config\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.527473 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s9bz\" (UniqueName: \"kubernetes.io/projected/06159643-e9b8-417f-ba79-ee1e1ca5c951-kube-api-access-8s9bz\") pod \"redhat-operators-mgcb9\" (UID: \"06159643-e9b8-417f-ba79-ee1e1ca5c951\") " pod="openshift-marketplace/redhat-operators-mgcb9"
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.527501 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e180bf9a-03f7-405b-90c3-b2e46008213e-config\") pod \"route-controller-manager-7ffc895647-6j97v\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.527531 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-proxy-ca-bundles\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.527550 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxv95\" (UniqueName: \"kubernetes.io/projected/590dd533-c8db-42e3-9485-1c9df719773f-kube-api-access-wxv95\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.527584 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06159643-e9b8-417f-ba79-ee1e1ca5c951-utilities\") pod \"redhat-operators-mgcb9\" (UID: \"06159643-e9b8-417f-ba79-ee1e1ca5c951\") " pod="openshift-marketplace/redhat-operators-mgcb9"
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.527621 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-client-ca\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.527640 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590dd533-c8db-42e3-9485-1c9df719773f-serving-cert\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.528235 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06159643-e9b8-417f-ba79-ee1e1ca5c951-catalog-content\") pod \"redhat-operators-mgcb9\" (UID: \"06159643-e9b8-417f-ba79-ee1e1ca5c951\") " pod="openshift-marketplace/redhat-operators-mgcb9"
Mar 20 08:37:05.532211 master-0 kubenswrapper[7465]: I0320 08:37:05.529012 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06159643-e9b8-417f-ba79-ee1e1ca5c951-utilities\") pod \"redhat-operators-mgcb9\" (UID: \"06159643-e9b8-417f-ba79-ee1e1ca5c951\") " pod="openshift-marketplace/redhat-operators-mgcb9"
Mar 20 08:37:05.549396 master-0 kubenswrapper[7465]: I0320 08:37:05.548596 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s9bz\" (UniqueName: \"kubernetes.io/projected/06159643-e9b8-417f-ba79-ee1e1ca5c951-kube-api-access-8s9bz\") pod \"redhat-operators-mgcb9\" (UID: \"06159643-e9b8-417f-ba79-ee1e1ca5c951\") " pod="openshift-marketplace/redhat-operators-mgcb9"
Mar 20 08:37:05.610012 master-0 kubenswrapper[7465]: I0320 08:37:05.609935 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mgcb9"
Mar 20 08:37:05.633956 master-0 kubenswrapper[7465]: I0320 08:37:05.633786 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-client-ca\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.633956 master-0 kubenswrapper[7465]: I0320 08:37:05.633956 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590dd533-c8db-42e3-9485-1c9df719773f-serving-cert\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.634357 master-0 kubenswrapper[7465]: I0320 08:37:05.634001 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e180bf9a-03f7-405b-90c3-b2e46008213e-serving-cert\") pod \"route-controller-manager-7ffc895647-6j97v\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.634357 master-0 kubenswrapper[7465]: I0320 08:37:05.634029 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk7cs\" (UniqueName: \"kubernetes.io/projected/e180bf9a-03f7-405b-90c3-b2e46008213e-kube-api-access-zk7cs\") pod \"route-controller-manager-7ffc895647-6j97v\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.634357 master-0 kubenswrapper[7465]: I0320 08:37:05.634065 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e180bf9a-03f7-405b-90c3-b2e46008213e-client-ca\") pod \"route-controller-manager-7ffc895647-6j97v\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.634357 master-0 kubenswrapper[7465]: I0320 08:37:05.634092 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-config\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.634357 master-0 kubenswrapper[7465]: I0320 08:37:05.634122 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e180bf9a-03f7-405b-90c3-b2e46008213e-config\") pod \"route-controller-manager-7ffc895647-6j97v\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.634357 master-0 kubenswrapper[7465]: I0320 08:37:05.634150 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-proxy-ca-bundles\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.634357 master-0 kubenswrapper[7465]: I0320 08:37:05.634172 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxv95\" (UniqueName: \"kubernetes.io/projected/590dd533-c8db-42e3-9485-1c9df719773f-kube-api-access-wxv95\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.637343 master-0 kubenswrapper[7465]: I0320 08:37:05.635446 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e180bf9a-03f7-405b-90c3-b2e46008213e-client-ca\") pod \"route-controller-manager-7ffc895647-6j97v\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.637343 master-0 kubenswrapper[7465]: I0320 08:37:05.635606 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-client-ca\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.637343 master-0 kubenswrapper[7465]: I0320 08:37:05.636908 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-config\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.637700 master-0 kubenswrapper[7465]: I0320 08:37:05.637659 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e180bf9a-03f7-405b-90c3-b2e46008213e-config\") pod \"route-controller-manager-7ffc895647-6j97v\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.638333 master-0 kubenswrapper[7465]: I0320 08:37:05.638290 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-proxy-ca-bundles\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.654163 master-0 kubenswrapper[7465]: I0320 08:37:05.643071 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590dd533-c8db-42e3-9485-1c9df719773f-serving-cert\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.654163 master-0 kubenswrapper[7465]: I0320 08:37:05.643986 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" event={"ID":"1375da42-ecaf-4d86-b554-25fd1c3d00bd","Type":"ContainerStarted","Data":"ef48bc7a298f21dc7e1c4f0e8ec7b05b2de65f0d7e2d6a14897ed741dcf440bd"}
Mar 20 08:37:05.654163 master-0 kubenswrapper[7465]: I0320 08:37:05.647612 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" event={"ID":"1375da42-ecaf-4d86-b554-25fd1c3d00bd","Type":"ContainerStarted","Data":"d3032285e1cfcfd919da168e10b18ee5ee2720e85e2457d64bfd97de17bf8050"}
Mar 20 08:37:05.657999 master-0 kubenswrapper[7465]: I0320 08:37:05.657949 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"]
Mar 20 08:37:05.659372 master-0 kubenswrapper[7465]: I0320 08:37:05.659348 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"
Mar 20 08:37:05.660722 master-0 kubenswrapper[7465]: I0320 08:37:05.660666 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"]
Mar 20 08:37:05.660814 master-0 kubenswrapper[7465]: I0320 08:37:05.660781 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e180bf9a-03f7-405b-90c3-b2e46008213e-serving-cert\") pod \"route-controller-manager-7ffc895647-6j97v\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.661453 master-0 kubenswrapper[7465]: I0320 08:37:05.661336 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 08:37:05.667395 master-0 kubenswrapper[7465]: I0320 08:37:05.667352 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxv95\" (UniqueName: \"kubernetes.io/projected/590dd533-c8db-42e3-9485-1c9df719773f-kube-api-access-wxv95\") pod \"controller-manager-6486d766f9-5b77h\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.668834 master-0 kubenswrapper[7465]: I0320 08:37:05.668770 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" event={"ID":"3de37144-a9ab-45fb-a23f-2287a5198edf","Type":"ContainerStarted","Data":"540db07e954f0bcdcd80c73acd405d75eb9f98f42b7c9e3fe89e5f93f2f7e1a8"}
Mar 20 08:37:05.669024 master-0 kubenswrapper[7465]: I0320 08:37:05.668986 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk7cs\" (UniqueName: \"kubernetes.io/projected/e180bf9a-03f7-405b-90c3-b2e46008213e-kube-api-access-zk7cs\") pod \"route-controller-manager-7ffc895647-6j97v\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.692494 master-0 kubenswrapper[7465]: I0320 08:37:05.691945 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" podStartSLOduration=3.691923483 podStartE2EDuration="3.691923483s" podCreationTimestamp="2026-03-20 08:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:37:05.690666946 +0000 UTC m=+51.333982446" watchObservedRunningTime="2026-03-20 08:37:05.691923483 +0000 UTC m=+51.335238973"
Mar 20 08:37:05.708082 master-0 kubenswrapper[7465]: I0320 08:37:05.707971 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:37:05.708082 master-0 kubenswrapper[7465]: I0320 08:37:05.708062 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:37:05.729958 master-0 kubenswrapper[7465]: I0320 08:37:05.729889 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:37:05.796214 master-0 kubenswrapper[7465]: I0320 08:37:05.795754 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" podStartSLOduration=15.795727652 podStartE2EDuration="15.795727652s" podCreationTimestamp="2026-03-20 08:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:37:05.764735278 +0000 UTC m=+51.408050758" watchObservedRunningTime="2026-03-20 08:37:05.795727652 +0000 UTC m=+51.439043142"
Mar 20 08:37:05.800336 master-0 kubenswrapper[7465]: I0320 08:37:05.799314 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h"
Mar 20 08:37:05.828222 master-0 kubenswrapper[7465]: I0320 08:37:05.817603 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"
Mar 20 08:37:05.889421 master-0 kubenswrapper[7465]: I0320 08:37:05.889359 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"
Mar 20 08:37:05.889822 master-0 kubenswrapper[7465]: I0320 08:37:05.889801 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66kz7\" (UniqueName: \"kubernetes.io/projected/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-kube-api-access-66kz7\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"
Mar 20 08:37:05.890292 master-0 kubenswrapper[7465]: I0320 08:37:05.890205 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"
Mar 20 08:37:05.952820 master-0 kubenswrapper[7465]: I0320 08:37:05.952739 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-srjqw"]
Mar 20 08:37:06.009733 master-0 kubenswrapper[7465]: I0320 08:37:06.008315 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"
Mar 20 08:37:06.009733 master-0 kubenswrapper[7465]: I0320 08:37:06.008378 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66kz7\" (UniqueName: \"kubernetes.io/projected/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-kube-api-access-66kz7\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"
Mar 20 08:37:06.009733 master-0 kubenswrapper[7465]: I0320 08:37:06.008450 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"
Mar 20 08:37:06.020917 master-0 kubenswrapper[7465]: I0320 08:37:06.010082 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"
Mar 20 08:37:06.021928 master-0 kubenswrapper[7465]: I0320 08:37:06.021806 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"
Mar 20 08:37:06.033160 master-0 kubenswrapper[7465]: I0320 08:37:06.033052 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66kz7\" (UniqueName: \"kubernetes.io/projected/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-kube-api-access-66kz7\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"
Mar 20 08:37:06.227686 master-0 kubenswrapper[7465]: I0320 08:37:06.226924 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"]
Mar 20 08:37:06.253864 master-0 kubenswrapper[7465]: I0320 08:37:06.253695 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mgcb9"]
Mar 20 08:37:06.266943 master-0 kubenswrapper[7465]: I0320 08:37:06.266873 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6486d766f9-5b77h"]
Mar 20 08:37:06.284052 master-0 kubenswrapper[7465]: W0320 08:37:06.282352 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod590dd533_c8db_42e3_9485_1c9df719773f.slice/crio-56a2947905e36af4c7e4eee7385aba059d957c24fbe5028cd5296acd03b88f48 WatchSource:0}: Error finding container 56a2947905e36af4c7e4eee7385aba059d957c24fbe5028cd5296acd03b88f48: Status 404 returned error can't find the container with id 56a2947905e36af4c7e4eee7385aba059d957c24fbe5028cd5296acd03b88f48
Mar 20 08:37:06.285797 master-0 kubenswrapper[7465]: I0320 08:37:06.285728 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"
Mar 20 08:37:06.686565 master-0 kubenswrapper[7465]: I0320 08:37:06.686500 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mgcb9" event={"ID":"06159643-e9b8-417f-ba79-ee1e1ca5c951","Type":"ContainerStarted","Data":"b48c4aacb301fc26474516047fb9f07667987577a2ed5332421858d27dce7d77"}
Mar 20 08:37:06.694650 master-0 kubenswrapper[7465]: I0320 08:37:06.694595 7465 generic.go:334] "Generic (PLEG): container finished" podID="956a8ee3-2618-4f04-8d1c-97c3a4595c94" containerID="33768c74647cf1a7ac9859d965b50b5b37abd6757807573438c40398b17fc2b5" exitCode=0
Mar 20 08:37:06.694812 master-0 kubenswrapper[7465]: I0320 08:37:06.694693 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srjqw" event={"ID":"956a8ee3-2618-4f04-8d1c-97c3a4595c94","Type":"ContainerDied","Data":"33768c74647cf1a7ac9859d965b50b5b37abd6757807573438c40398b17fc2b5"}
Mar 20 08:37:06.694812 master-0 kubenswrapper[7465]: I0320 08:37:06.694731 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srjqw" event={"ID":"956a8ee3-2618-4f04-8d1c-97c3a4595c94","Type":"ContainerStarted","Data":"13ee22fe72630c657645fbce4809a9a3e59153a0a482338b9c4adf8c528528a1"}
Mar 20 08:37:06.699138 master-0 kubenswrapper[7465]: I0320 08:37:06.699052 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v" event={"ID":"e180bf9a-03f7-405b-90c3-b2e46008213e","Type":"ContainerStarted","Data":"02b2465660b354ff71c8f1178956736d7360fb5bea9412ce65fe2d7c69c27724"}
Mar 20 08:37:06.700761 master-0 kubenswrapper[7465]: I0320 08:37:06.700723 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h" event={"ID":"590dd533-c8db-42e3-9485-1c9df719773f","Type":"ContainerStarted","Data":"56a2947905e36af4c7e4eee7385aba059d957c24fbe5028cd5296acd03b88f48"}
Mar 20 08:37:06.710870 master-0 kubenswrapper[7465]: I0320 08:37:06.710804 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:37:07.068378 master-0 kubenswrapper[7465]: I0320 08:37:07.068306 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-86g9n"]
Mar 20 08:37:07.069561 master-0 kubenswrapper[7465]: I0320 08:37:07.069513 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-86g9n"
Mar 20 08:37:07.079123 master-0 kubenswrapper[7465]: I0320 08:37:07.079054 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"]
Mar 20 08:37:07.079455 master-0 kubenswrapper[7465]: W0320 08:37:07.079404 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde6078d7_2aad_46fe_b17a_b6b38e4eaa41.slice/crio-8a073909baddafc91ef0c1a8a7d1dde84b7bce6d841c77590b71f4a6754c9117 WatchSource:0}: Error finding container 8a073909baddafc91ef0c1a8a7d1dde84b7bce6d841c77590b71f4a6754c9117: Status 404 returned error can't find the container with id 8a073909baddafc91ef0c1a8a7d1dde84b7bce6d841c77590b71f4a6754c9117
Mar 20 08:37:07.139598 master-0 kubenswrapper[7465]: I0320 08:37:07.139527 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-86g9n"]
Mar 20 08:37:07.145915 master-0 kubenswrapper[7465]: I0320 08:37:07.145826 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxf7x\" (UniqueName: \"kubernetes.io/projected/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-kube-api-access-gxf7x\") pod \"certified-operators-86g9n\" (UID: \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\") " pod="openshift-marketplace/certified-operators-86g9n"
Mar 20 08:37:07.145915 master-0 kubenswrapper[7465]: I0320 08:37:07.145899 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-catalog-content\") pod \"certified-operators-86g9n\" (UID: \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\") " pod="openshift-marketplace/certified-operators-86g9n"
Mar 20 08:37:07.146275 master-0 kubenswrapper[7465]: I0320 08:37:07.145956 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-utilities\") pod \"certified-operators-86g9n\" (UID: \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\") " pod="openshift-marketplace/certified-operators-86g9n"
Mar 20 08:37:07.249209 master-0 kubenswrapper[7465]: I0320 08:37:07.248504 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxf7x\" (UniqueName: \"kubernetes.io/projected/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-kube-api-access-gxf7x\") pod \"certified-operators-86g9n\" (UID: \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\") " pod="openshift-marketplace/certified-operators-86g9n"
Mar 20 08:37:07.249209 master-0 kubenswrapper[7465]: I0320 08:37:07.248590 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-catalog-content\") pod \"certified-operators-86g9n\" (UID: \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\") " pod="openshift-marketplace/certified-operators-86g9n"
Mar 20 08:37:07.249209 master-0 kubenswrapper[7465]: I0320 08:37:07.248726 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-utilities\") pod \"certified-operators-86g9n\" (UID: \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\") " pod="openshift-marketplace/certified-operators-86g9n"
Mar 20 08:37:07.278529 master-0 kubenswrapper[7465]: I0320 08:37:07.249698 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-utilities\") pod \"certified-operators-86g9n\" (UID: \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\") " pod="openshift-marketplace/certified-operators-86g9n"
Mar 20 08:37:07.278529 master-0 kubenswrapper[7465]: I0320 08:37:07.250826 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-catalog-content\") pod \"certified-operators-86g9n\" (UID: \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\") " pod="openshift-marketplace/certified-operators-86g9n"
Mar 20 08:37:07.278529 master-0 kubenswrapper[7465]: I0320 08:37:07.278459 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxf7x\" (UniqueName: \"kubernetes.io/projected/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-kube-api-access-gxf7x\") pod \"certified-operators-86g9n\" (UID: \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\") " pod="openshift-marketplace/certified-operators-86g9n"
Mar 20 08:37:07.399490 master-0 kubenswrapper[7465]: I0320 08:37:07.394546 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-86g9n"
Mar 20 08:37:07.445247 master-0 kubenswrapper[7465]: I0320 08:37:07.441421 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"]
Mar 20 08:37:07.445247 master-0 kubenswrapper[7465]: I0320 08:37:07.442119 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"
Mar 20 08:37:07.445569 master-0 kubenswrapper[7465]: I0320 08:37:07.445510 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 20 08:37:07.475217 master-0 kubenswrapper[7465]: I0320 08:37:07.470117 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"]
Mar 20 08:37:07.559518 master-0 kubenswrapper[7465]: I0320 08:37:07.559430 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c2rq\" (UniqueName: \"kubernetes.io/projected/b543f82e-683d-47c1-af73-4dcede4cf4df-kube-api-access-4c2rq\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"
Mar 20 08:37:07.559518 master-0 kubenswrapper[7465]: I0320 08:37:07.559501 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b543f82e-683d-47c1-af73-4dcede4cf4df-tmpfs\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"
Mar 20 08:37:07.559518 master-0 kubenswrapper[7465]: I0320 08:37:07.559537 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-webhook-cert\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"
Mar 20 08:37:07.559892 master-0 kubenswrapper[7465]: I0320 08:37:07.559568 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-apiservice-cert\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"
Mar 20 08:37:07.634758 master-0 kubenswrapper[7465]: I0320 08:37:07.634001 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7dcf5569b5-xmvwz"]
Mar 20 08:37:07.641752 master-0 kubenswrapper[7465]: I0320 08:37:07.635578 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz"
Mar 20 08:37:07.651164 master-0 kubenswrapper[7465]: I0320 08:37:07.646368 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j"]
Mar 20 08:37:07.651164 master-0 kubenswrapper[7465]: I0320 08:37:07.648953 7465 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j" Mar 20 08:37:07.653113 master-0 kubenswrapper[7465]: I0320 08:37:07.652356 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 08:37:07.654662 master-0 kubenswrapper[7465]: I0320 08:37:07.653616 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 08:37:07.654662 master-0 kubenswrapper[7465]: I0320 08:37:07.653672 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 08:37:07.654662 master-0 kubenswrapper[7465]: I0320 08:37:07.653620 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 08:37:07.657688 master-0 kubenswrapper[7465]: I0320 08:37:07.657642 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 08:37:07.663042 master-0 kubenswrapper[7465]: I0320 08:37:07.662545 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-metrics-certs\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.663042 master-0 kubenswrapper[7465]: I0320 08:37:07.662622 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-stats-auth\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.663042 master-0 kubenswrapper[7465]: I0320 08:37:07.662698 7465 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4c2rq\" (UniqueName: \"kubernetes.io/projected/b543f82e-683d-47c1-af73-4dcede4cf4df-kube-api-access-4c2rq\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:37:07.663042 master-0 kubenswrapper[7465]: I0320 08:37:07.662777 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b543f82e-683d-47c1-af73-4dcede4cf4df-tmpfs\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:37:07.663042 master-0 kubenswrapper[7465]: I0320 08:37:07.662829 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-webhook-cert\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:37:07.663042 master-0 kubenswrapper[7465]: I0320 08:37:07.662854 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqtvp\" (UniqueName: \"kubernetes.io/projected/91b2899e-8d24-41a0-bec8-d11c67b8f955-kube-api-access-hqtvp\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.663042 master-0 kubenswrapper[7465]: I0320 08:37:07.662912 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-apiservice-cert\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " 
pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:37:07.663042 master-0 kubenswrapper[7465]: I0320 08:37:07.663048 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-default-certificate\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.663471 master-0 kubenswrapper[7465]: I0320 08:37:07.663093 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91b2899e-8d24-41a0-bec8-d11c67b8f955-service-ca-bundle\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.676281 master-0 kubenswrapper[7465]: I0320 08:37:07.664738 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 08:37:07.676281 master-0 kubenswrapper[7465]: I0320 08:37:07.665147 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 20 08:37:07.676281 master-0 kubenswrapper[7465]: I0320 08:37:07.665998 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b543f82e-683d-47c1-af73-4dcede4cf4df-tmpfs\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:37:07.676281 master-0 kubenswrapper[7465]: I0320 08:37:07.671326 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6"] Mar 20 08:37:07.676281 master-0 
kubenswrapper[7465]: I0320 08:37:07.672345 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6" Mar 20 08:37:07.676281 master-0 kubenswrapper[7465]: I0320 08:37:07.672480 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-apiservice-cert\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:37:07.687639 master-0 kubenswrapper[7465]: I0320 08:37:07.686317 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6"] Mar 20 08:37:07.697812 master-0 kubenswrapper[7465]: I0320 08:37:07.696846 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-webhook-cert\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:37:07.710427 master-0 kubenswrapper[7465]: I0320 08:37:07.709126 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c2rq\" (UniqueName: \"kubernetes.io/projected/b543f82e-683d-47c1-af73-4dcede4cf4df-kube-api-access-4c2rq\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:37:07.720445 master-0 kubenswrapper[7465]: I0320 08:37:07.719796 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j"] Mar 20 08:37:07.739307 master-0 kubenswrapper[7465]: I0320 08:37:07.739163 7465 generic.go:334] "Generic (PLEG): 
container finished" podID="06159643-e9b8-417f-ba79-ee1e1ca5c951" containerID="e009b9f493edc688ad18ed0c51c3e2a8f79f6c2dff58a7c93226076a653427d7" exitCode=0 Mar 20 08:37:07.740005 master-0 kubenswrapper[7465]: I0320 08:37:07.739768 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mgcb9" event={"ID":"06159643-e9b8-417f-ba79-ee1e1ca5c951","Type":"ContainerDied","Data":"e009b9f493edc688ad18ed0c51c3e2a8f79f6c2dff58a7c93226076a653427d7"} Mar 20 08:37:07.750532 master-0 kubenswrapper[7465]: I0320 08:37:07.750317 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" event={"ID":"de6078d7-2aad-46fe-b17a-b6b38e4eaa41","Type":"ContainerStarted","Data":"614acf21995c6ef4e652413ccece98d1915da356d7813f8b0dcd90d12e6d4a8d"} Mar 20 08:37:07.750532 master-0 kubenswrapper[7465]: I0320 08:37:07.750372 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" event={"ID":"de6078d7-2aad-46fe-b17a-b6b38e4eaa41","Type":"ContainerStarted","Data":"d47bba92b6fb8946edb6fa2f6a021436ea604b27f3b2a8581b9108a215eab3e8"} Mar 20 08:37:07.750532 master-0 kubenswrapper[7465]: I0320 08:37:07.750385 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" event={"ID":"de6078d7-2aad-46fe-b17a-b6b38e4eaa41","Type":"ContainerStarted","Data":"8a073909baddafc91ef0c1a8a7d1dde84b7bce6d841c77590b71f4a6754c9117"} Mar 20 08:37:07.759678 master-0 kubenswrapper[7465]: I0320 08:37:07.759633 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zspn5"] Mar 20 08:37:07.765346 master-0 kubenswrapper[7465]: I0320 08:37:07.765299 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:07.767678 master-0 kubenswrapper[7465]: I0320 08:37:07.767275 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-default-certificate\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.767678 master-0 kubenswrapper[7465]: I0320 08:37:07.767445 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91b2899e-8d24-41a0-bec8-d11c67b8f955-service-ca-bundle\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.767678 master-0 kubenswrapper[7465]: I0320 08:37:07.767572 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-metrics-certs\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.767678 master-0 kubenswrapper[7465]: I0320 08:37:07.767605 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-stats-auth\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.767678 master-0 kubenswrapper[7465]: I0320 08:37:07.767663 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/47eadda0-35a6-4b5c-a96c-24854be15098-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-tkc2j\" (UID: \"47eadda0-35a6-4b5c-a96c-24854be15098\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j" Mar 20 08:37:07.767843 master-0 kubenswrapper[7465]: I0320 08:37:07.767819 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wl7f\" (UniqueName: \"kubernetes.io/projected/e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97-kube-api-access-6wl7f\") pod \"network-check-source-b4bf74f6-fhvg6\" (UID: \"e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6" Mar 20 08:37:07.768162 master-0 kubenswrapper[7465]: I0320 08:37:07.768119 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zspn5"] Mar 20 08:37:07.769693 master-0 kubenswrapper[7465]: I0320 08:37:07.767933 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqtvp\" (UniqueName: \"kubernetes.io/projected/91b2899e-8d24-41a0-bec8-d11c67b8f955-kube-api-access-hqtvp\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.771071 master-0 kubenswrapper[7465]: I0320 08:37:07.770712 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91b2899e-8d24-41a0-bec8-d11c67b8f955-service-ca-bundle\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.776475 master-0 kubenswrapper[7465]: I0320 08:37:07.776429 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-stats-auth\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.779161 master-0 kubenswrapper[7465]: I0320 08:37:07.777381 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-default-certificate\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.779161 master-0 kubenswrapper[7465]: I0320 08:37:07.779143 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-metrics-certs\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.803210 master-0 kubenswrapper[7465]: I0320 08:37:07.799740 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:37:07.803210 master-0 kubenswrapper[7465]: I0320 08:37:07.800310 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqtvp\" (UniqueName: \"kubernetes.io/projected/91b2899e-8d24-41a0-bec8-d11c67b8f955-kube-api-access-hqtvp\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:07.824340 master-0 kubenswrapper[7465]: I0320 08:37:07.823743 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" podStartSLOduration=2.823709414 podStartE2EDuration="2.823709414s" podCreationTimestamp="2026-03-20 08:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:37:07.818341367 +0000 UTC m=+53.461656877" watchObservedRunningTime="2026-03-20 08:37:07.823709414 +0000 UTC m=+53.467024904" Mar 20 08:37:07.875923 master-0 kubenswrapper[7465]: I0320 08:37:07.875853 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c649d964-ba32-44e4-a3cb-06a285972d97-catalog-content\") pod \"community-operators-zspn5\" (UID: \"c649d964-ba32-44e4-a3cb-06a285972d97\") " pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:07.877602 master-0 kubenswrapper[7465]: I0320 08:37:07.877258 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c649d964-ba32-44e4-a3cb-06a285972d97-utilities\") pod \"community-operators-zspn5\" (UID: \"c649d964-ba32-44e4-a3cb-06a285972d97\") " pod="openshift-marketplace/community-operators-zspn5" Mar 20 
08:37:07.884835 master-0 kubenswrapper[7465]: I0320 08:37:07.884735 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/47eadda0-35a6-4b5c-a96c-24854be15098-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-tkc2j\" (UID: \"47eadda0-35a6-4b5c-a96c-24854be15098\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j" Mar 20 08:37:07.893274 master-0 kubenswrapper[7465]: I0320 08:37:07.889088 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wl7f\" (UniqueName: \"kubernetes.io/projected/e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97-kube-api-access-6wl7f\") pod \"network-check-source-b4bf74f6-fhvg6\" (UID: \"e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6" Mar 20 08:37:07.893274 master-0 kubenswrapper[7465]: I0320 08:37:07.889213 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-697rh\" (UniqueName: \"kubernetes.io/projected/c649d964-ba32-44e4-a3cb-06a285972d97-kube-api-access-697rh\") pod \"community-operators-zspn5\" (UID: \"c649d964-ba32-44e4-a3cb-06a285972d97\") " pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:07.893274 master-0 kubenswrapper[7465]: I0320 08:37:07.892498 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/47eadda0-35a6-4b5c-a96c-24854be15098-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-tkc2j\" (UID: \"47eadda0-35a6-4b5c-a96c-24854be15098\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j" Mar 20 08:37:07.915854 master-0 kubenswrapper[7465]: I0320 08:37:07.915803 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wl7f\" (UniqueName: 
\"kubernetes.io/projected/e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97-kube-api-access-6wl7f\") pod \"network-check-source-b4bf74f6-fhvg6\" (UID: \"e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6" Mar 20 08:37:07.974727 master-0 kubenswrapper[7465]: I0320 08:37:07.974130 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:08.010773 master-0 kubenswrapper[7465]: I0320 08:37:08.009121 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j" Mar 20 08:37:08.012367 master-0 kubenswrapper[7465]: I0320 08:37:08.011581 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c649d964-ba32-44e4-a3cb-06a285972d97-utilities\") pod \"community-operators-zspn5\" (UID: \"c649d964-ba32-44e4-a3cb-06a285972d97\") " pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:08.012367 master-0 kubenswrapper[7465]: I0320 08:37:08.011684 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-697rh\" (UniqueName: \"kubernetes.io/projected/c649d964-ba32-44e4-a3cb-06a285972d97-kube-api-access-697rh\") pod \"community-operators-zspn5\" (UID: \"c649d964-ba32-44e4-a3cb-06a285972d97\") " pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:08.012367 master-0 kubenswrapper[7465]: I0320 08:37:08.011760 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c649d964-ba32-44e4-a3cb-06a285972d97-catalog-content\") pod \"community-operators-zspn5\" (UID: \"c649d964-ba32-44e4-a3cb-06a285972d97\") " pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:08.012367 master-0 kubenswrapper[7465]: I0320 08:37:08.012308 
7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c649d964-ba32-44e4-a3cb-06a285972d97-catalog-content\") pod \"community-operators-zspn5\" (UID: \"c649d964-ba32-44e4-a3cb-06a285972d97\") " pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:08.015107 master-0 kubenswrapper[7465]: I0320 08:37:08.012666 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c649d964-ba32-44e4-a3cb-06a285972d97-utilities\") pod \"community-operators-zspn5\" (UID: \"c649d964-ba32-44e4-a3cb-06a285972d97\") " pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:08.029244 master-0 kubenswrapper[7465]: I0320 08:37:08.028681 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6" Mar 20 08:37:08.224410 master-0 kubenswrapper[7465]: I0320 08:37:08.222367 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-697rh\" (UniqueName: \"kubernetes.io/projected/c649d964-ba32-44e4-a3cb-06a285972d97-kube-api-access-697rh\") pod \"community-operators-zspn5\" (UID: \"c649d964-ba32-44e4-a3cb-06a285972d97\") " pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:08.426702 master-0 kubenswrapper[7465]: I0320 08:37:08.425058 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 20 08:37:08.426702 master-0 kubenswrapper[7465]: I0320 08:37:08.426054 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-86g9n"] Mar 20 08:37:08.426702 master-0 kubenswrapper[7465]: I0320 08:37:08.426151 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:37:08.428945 master-0 kubenswrapper[7465]: I0320 08:37:08.428766 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:08.439748 master-0 kubenswrapper[7465]: I0320 08:37:08.437838 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 08:37:08.525039 master-0 kubenswrapper[7465]: I0320 08:37:08.524302 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1d21f11-7386-4a04-a82e-5a03f3602a3b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:37:08.525039 master-0 kubenswrapper[7465]: I0320 08:37:08.524600 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1d21f11-7386-4a04-a82e-5a03f3602a3b-var-lock\") pod \"installer-1-master-0\" (UID: \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:37:08.525039 master-0 kubenswrapper[7465]: I0320 08:37:08.524794 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1d21f11-7386-4a04-a82e-5a03f3602a3b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:37:08.597098 master-0 kubenswrapper[7465]: I0320 08:37:08.597041 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:37:08.597098 master-0 kubenswrapper[7465]: I0320 08:37:08.597110 7465 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:37:08.632732 master-0 kubenswrapper[7465]: I0320 08:37:08.631314 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1d21f11-7386-4a04-a82e-5a03f3602a3b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:37:08.632732 master-0 kubenswrapper[7465]: I0320 08:37:08.631403 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1d21f11-7386-4a04-a82e-5a03f3602a3b-var-lock\") pod \"installer-1-master-0\" (UID: \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:37:08.632732 master-0 kubenswrapper[7465]: I0320 08:37:08.631464 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1d21f11-7386-4a04-a82e-5a03f3602a3b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:37:08.632732 master-0 kubenswrapper[7465]: I0320 08:37:08.631563 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1d21f11-7386-4a04-a82e-5a03f3602a3b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:37:08.632732 master-0 kubenswrapper[7465]: I0320 08:37:08.632082 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1d21f11-7386-4a04-a82e-5a03f3602a3b-var-lock\") pod \"installer-1-master-0\" (UID: 
\"e1d21f11-7386-4a04-a82e-5a03f3602a3b\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:37:08.758504 master-0 kubenswrapper[7465]: I0320 08:37:08.758362 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" event={"ID":"91b2899e-8d24-41a0-bec8-d11c67b8f955","Type":"ContainerStarted","Data":"630f3ef68fb2ab037a83499120027474c94dfe12bf91c1a5c52579bd6c878cbf"} Mar 20 08:37:08.760978 master-0 kubenswrapper[7465]: I0320 08:37:08.760947 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86g9n" event={"ID":"55cefc56-c008-45c1-a6ac-b1d3c8778c7b","Type":"ContainerStarted","Data":"07e40aa377bfe7a7fc8825ffe8c45483249a93c88f052dadd76fb2c790f314d3"} Mar 20 08:37:08.851472 master-0 kubenswrapper[7465]: I0320 08:37:08.847879 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:37:08.851472 master-0 kubenswrapper[7465]: I0320 08:37:08.849698 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 20 08:37:08.861614 master-0 kubenswrapper[7465]: I0320 08:37:08.858740 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"] Mar 20 08:37:08.861614 master-0 kubenswrapper[7465]: I0320 08:37:08.859248 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:37:09.077413 master-0 kubenswrapper[7465]: I0320 08:37:09.075548 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j"] Mar 20 08:37:09.079412 master-0 kubenswrapper[7465]: I0320 08:37:09.079350 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6"] Mar 20 
08:37:09.081002 master-0 kubenswrapper[7465]: I0320 08:37:09.080942 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1d21f11-7386-4a04-a82e-5a03f3602a3b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:37:09.095004 master-0 kubenswrapper[7465]: W0320 08:37:09.091354 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47eadda0_35a6_4b5c_a96c_24854be15098.slice/crio-5a9e26d5feffb5d032a36611c5da4c454dba6454147ff1cf841070d4b4feeb67 WatchSource:0}: Error finding container 5a9e26d5feffb5d032a36611c5da4c454dba6454147ff1cf841070d4b4feeb67: Status 404 returned error can't find the container with id 5a9e26d5feffb5d032a36611c5da4c454dba6454147ff1cf841070d4b4feeb67 Mar 20 08:37:09.176995 master-0 kubenswrapper[7465]: I0320 08:37:09.176923 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:37:09.334227 master-0 kubenswrapper[7465]: I0320 08:37:09.333016 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zspn5"] Mar 20 08:37:09.355719 master-0 kubenswrapper[7465]: W0320 08:37:09.354847 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc649d964_ba32_44e4_a3cb_06a285972d97.slice/crio-72ef06cb8ef1d212f0ed5b75f9026683a6bf969cd551915b43925cfb6e211dc1 WatchSource:0}: Error finding container 72ef06cb8ef1d212f0ed5b75f9026683a6bf969cd551915b43925cfb6e211dc1: Status 404 returned error can't find the container with id 72ef06cb8ef1d212f0ed5b75f9026683a6bf969cd551915b43925cfb6e211dc1 Mar 20 08:37:09.828233 master-0 kubenswrapper[7465]: I0320 08:37:09.817677 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zspn5" event={"ID":"c649d964-ba32-44e4-a3cb-06a285972d97","Type":"ContainerStarted","Data":"72ef06cb8ef1d212f0ed5b75f9026683a6bf969cd551915b43925cfb6e211dc1"} Mar 20 08:37:09.849227 master-0 kubenswrapper[7465]: I0320 08:37:09.846269 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6" event={"ID":"e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97","Type":"ContainerStarted","Data":"17d9a381fe77c2a99690d4e954254b88e9da3b66911db388af1c343ca887780e"} Mar 20 08:37:09.849227 master-0 kubenswrapper[7465]: I0320 08:37:09.846325 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6" event={"ID":"e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97","Type":"ContainerStarted","Data":"a56a69cfc23cf8add77dfc1a237e33143ff59495f1a2048a86a1759c1954faee"} Mar 20 08:37:09.863325 master-0 kubenswrapper[7465]: I0320 08:37:09.856584 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j" event={"ID":"47eadda0-35a6-4b5c-a96c-24854be15098","Type":"ContainerStarted","Data":"5a9e26d5feffb5d032a36611c5da4c454dba6454147ff1cf841070d4b4feeb67"} Mar 20 08:37:09.878236 master-0 kubenswrapper[7465]: I0320 08:37:09.872527 7465 generic.go:334] "Generic (PLEG): container finished" podID="55cefc56-c008-45c1-a6ac-b1d3c8778c7b" containerID="eb9e9fd88203cc82f4f42778ae8752420346c9a6317474f729b19d61f2a0b11e" exitCode=0 Mar 20 08:37:09.878236 master-0 kubenswrapper[7465]: I0320 08:37:09.872670 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86g9n" event={"ID":"55cefc56-c008-45c1-a6ac-b1d3c8778c7b","Type":"ContainerDied","Data":"eb9e9fd88203cc82f4f42778ae8752420346c9a6317474f729b19d61f2a0b11e"} Mar 20 08:37:09.901211 master-0 kubenswrapper[7465]: I0320 08:37:09.897532 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" event={"ID":"b543f82e-683d-47c1-af73-4dcede4cf4df","Type":"ContainerStarted","Data":"f1451cb7c441d0c0436b2b43cfeec19d47072600b8f030c7cbe1b7c9914bab91"} Mar 20 08:37:09.901211 master-0 kubenswrapper[7465]: I0320 08:37:09.897588 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" event={"ID":"b543f82e-683d-47c1-af73-4dcede4cf4df","Type":"ContainerStarted","Data":"285790bb4eeaea0e1399502a5e31c8d8bf1bd484bccae96128ad9795ef9ca21a"} Mar 20 08:37:09.901211 master-0 kubenswrapper[7465]: I0320 08:37:09.898315 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:37:10.209152 master-0 kubenswrapper[7465]: I0320 08:37:10.209077 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 20 08:37:10.897956 master-0 
kubenswrapper[7465]: I0320 08:37:10.897884 7465 patch_prober.go:28] interesting pod/packageserver-6c85f64bb9-fmpsg container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.55:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:37:10.898546 master-0 kubenswrapper[7465]: I0320 08:37:10.897976 7465 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" podUID="b543f82e-683d-47c1-af73-4dcede4cf4df" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.55:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:37:10.911070 master-0 kubenswrapper[7465]: I0320 08:37:10.911025 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zspn5" event={"ID":"c649d964-ba32-44e4-a3cb-06a285972d97","Type":"ContainerStarted","Data":"413513ca3ab43ee903b1af9b07a1e3a6753fe8f1a044a9c14cd7c2703547a117"} Mar 20 08:37:10.990695 master-0 kubenswrapper[7465]: I0320 08:37:10.990526 7465 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 08:37:11.126214 master-0 kubenswrapper[7465]: I0320 08:37:11.124744 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" podStartSLOduration=4.124722276 podStartE2EDuration="4.124722276s" podCreationTimestamp="2026-03-20 08:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:37:11.119893615 +0000 UTC m=+56.763209115" watchObservedRunningTime="2026-03-20 08:37:11.124722276 +0000 UTC m=+56.768037766" 
Mar 20 08:37:11.126472 master-0 kubenswrapper[7465]: I0320 08:37:11.126402 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 20 08:37:11.126840 master-0 kubenswrapper[7465]: I0320 08:37:11.126731 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="8682b669-c173-4b96-80f6-029292f5c25b" containerName="installer" containerID="cri-o://7dcd8a99fc665575842ace97eb1ad322b466b3c9cd44a9d54ad2389140cfb2bc" gracePeriod=30 Mar 20 08:37:11.632946 master-0 kubenswrapper[7465]: I0320 08:37:11.632531 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6" podStartSLOduration=111.632493675 podStartE2EDuration="1m51.632493675s" podCreationTimestamp="2026-03-20 08:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:37:11.619962209 +0000 UTC m=+57.263277709" watchObservedRunningTime="2026-03-20 08:37:11.632493675 +0000 UTC m=+57.275809165" Mar 20 08:37:11.726257 master-0 kubenswrapper[7465]: I0320 08:37:11.723030 7465 patch_prober.go:28] interesting pod/packageserver-6c85f64bb9-fmpsg container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 08:37:11.726257 master-0 kubenswrapper[7465]: [+]log ok Mar 20 08:37:11.726257 master-0 kubenswrapper[7465]: [-]poststarthook/generic-apiserver-start-informers failed: reason withheld Mar 20 08:37:11.726257 master-0 kubenswrapper[7465]: [-]poststarthook/max-in-flight-filter failed: reason withheld Mar 20 08:37:11.726257 master-0 kubenswrapper[7465]: [-]poststarthook/storage-object-count-tracker-hook failed: reason withheld Mar 20 08:37:11.726257 master-0 kubenswrapper[7465]: healthz check failed 
Mar 20 08:37:11.726257 master-0 kubenswrapper[7465]: I0320 08:37:11.723137 7465 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" podUID="b543f82e-683d-47c1-af73-4dcede4cf4df" containerName="packageserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:11.927567 master-0 kubenswrapper[7465]: I0320 08:37:11.922097 7465 generic.go:334] "Generic (PLEG): container finished" podID="c649d964-ba32-44e4-a3cb-06a285972d97" containerID="413513ca3ab43ee903b1af9b07a1e3a6753fe8f1a044a9c14cd7c2703547a117" exitCode=0 Mar 20 08:37:11.927567 master-0 kubenswrapper[7465]: I0320 08:37:11.923153 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zspn5" event={"ID":"c649d964-ba32-44e4-a3cb-06a285972d97","Type":"ContainerDied","Data":"413513ca3ab43ee903b1af9b07a1e3a6753fe8f1a044a9c14cd7c2703547a117"} Mar 20 08:37:11.942411 master-0 kubenswrapper[7465]: I0320 08:37:11.940069 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"e1d21f11-7386-4a04-a82e-5a03f3602a3b","Type":"ContainerStarted","Data":"fc9fcf2245b5e00e0473ecdf9c16e18d2e148c7fa6e4f86bf8df81bc8b274006"} Mar 20 08:37:11.954579 master-0 kubenswrapper[7465]: I0320 08:37:11.951078 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:37:12.539577 master-0 kubenswrapper[7465]: I0320 08:37:12.538274 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 20 08:37:12.540982 master-0 kubenswrapper[7465]: I0320 08:37:12.540890 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:37:12.559209 master-0 kubenswrapper[7465]: I0320 08:37:12.556362 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 20 08:37:12.629508 master-0 kubenswrapper[7465]: I0320 08:37:12.629417 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fac672fa-7660-449e-a0d1-244dc6282d76-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"fac672fa-7660-449e-a0d1-244dc6282d76\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:37:12.630277 master-0 kubenswrapper[7465]: I0320 08:37:12.629560 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fac672fa-7660-449e-a0d1-244dc6282d76-var-lock\") pod \"installer-2-master-0\" (UID: \"fac672fa-7660-449e-a0d1-244dc6282d76\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:37:12.630277 master-0 kubenswrapper[7465]: I0320 08:37:12.629587 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fac672fa-7660-449e-a0d1-244dc6282d76-kube-api-access\") pod \"installer-2-master-0\" (UID: \"fac672fa-7660-449e-a0d1-244dc6282d76\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:37:12.730958 master-0 kubenswrapper[7465]: I0320 08:37:12.730799 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fac672fa-7660-449e-a0d1-244dc6282d76-kube-api-access\") pod \"installer-2-master-0\" (UID: \"fac672fa-7660-449e-a0d1-244dc6282d76\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:37:12.730958 master-0 kubenswrapper[7465]: I0320 08:37:12.730864 7465 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fac672fa-7660-449e-a0d1-244dc6282d76-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"fac672fa-7660-449e-a0d1-244dc6282d76\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:37:12.730958 master-0 kubenswrapper[7465]: I0320 08:37:12.730929 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fac672fa-7660-449e-a0d1-244dc6282d76-var-lock\") pod \"installer-2-master-0\" (UID: \"fac672fa-7660-449e-a0d1-244dc6282d76\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:37:12.731402 master-0 kubenswrapper[7465]: I0320 08:37:12.731012 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fac672fa-7660-449e-a0d1-244dc6282d76-var-lock\") pod \"installer-2-master-0\" (UID: \"fac672fa-7660-449e-a0d1-244dc6282d76\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:37:12.731402 master-0 kubenswrapper[7465]: I0320 08:37:12.731400 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fac672fa-7660-449e-a0d1-244dc6282d76-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"fac672fa-7660-449e-a0d1-244dc6282d76\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:37:12.780285 master-0 kubenswrapper[7465]: I0320 08:37:12.780114 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fac672fa-7660-449e-a0d1-244dc6282d76-kube-api-access\") pod \"installer-2-master-0\" (UID: \"fac672fa-7660-449e-a0d1-244dc6282d76\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:37:12.882640 master-0 kubenswrapper[7465]: I0320 08:37:12.882562 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:37:14.110036 master-0 kubenswrapper[7465]: I0320 08:37:14.106160 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 20 08:37:14.110036 master-0 kubenswrapper[7465]: I0320 08:37:14.106620 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="d17f953e-3ca4-4bd5-ad89-678447774687" containerName="installer" containerID="cri-o://e11ff3b52271d029d42cce254ad48e40793e3cdb6d379d24178958fc78484abf" gracePeriod=30 Mar 20 08:37:14.605957 master-0 kubenswrapper[7465]: I0320 08:37:14.605896 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-gj4pm"] Mar 20 08:37:14.609085 master-0 kubenswrapper[7465]: I0320 08:37:14.606623 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-gj4pm" Mar 20 08:37:14.610095 master-0 kubenswrapper[7465]: I0320 08:37:14.610041 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 08:37:14.610351 master-0 kubenswrapper[7465]: I0320 08:37:14.610328 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 08:37:14.676721 master-0 kubenswrapper[7465]: I0320 08:37:14.676651 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-certs\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm" Mar 20 08:37:14.676995 master-0 kubenswrapper[7465]: I0320 08:37:14.676788 7465 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27j9q\" (UniqueName: \"kubernetes.io/projected/4ddac301-a604-4f07-8849-5928befd336e-kube-api-access-27j9q\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm" Mar 20 08:37:14.676995 master-0 kubenswrapper[7465]: I0320 08:37:14.676839 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-node-bootstrap-token\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm" Mar 20 08:37:14.781444 master-0 kubenswrapper[7465]: I0320 08:37:14.781377 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-node-bootstrap-token\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm" Mar 20 08:37:14.781728 master-0 kubenswrapper[7465]: I0320 08:37:14.781459 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-certs\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm" Mar 20 08:37:14.781921 master-0 kubenswrapper[7465]: I0320 08:37:14.781799 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27j9q\" (UniqueName: \"kubernetes.io/projected/4ddac301-a604-4f07-8849-5928befd336e-kube-api-access-27j9q\") pod \"machine-config-server-gj4pm\" (UID: 
\"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm" Mar 20 08:37:14.786085 master-0 kubenswrapper[7465]: I0320 08:37:14.786048 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-node-bootstrap-token\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm" Mar 20 08:37:14.804521 master-0 kubenswrapper[7465]: I0320 08:37:14.802005 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27j9q\" (UniqueName: \"kubernetes.io/projected/4ddac301-a604-4f07-8849-5928befd336e-kube-api-access-27j9q\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm" Mar 20 08:37:14.805722 master-0 kubenswrapper[7465]: I0320 08:37:14.805578 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-certs\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm" Mar 20 08:37:14.939077 master-0 kubenswrapper[7465]: I0320 08:37:14.938429 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-gj4pm" Mar 20 08:37:17.987774 master-0 kubenswrapper[7465]: I0320 08:37:17.987699 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 20 08:37:17.988819 master-0 kubenswrapper[7465]: I0320 08:37:17.988785 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:37:18.050883 master-0 kubenswrapper[7465]: I0320 08:37:18.050754 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-kube-api-access\") pod \"installer-2-master-0\" (UID: \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:37:18.050883 master-0 kubenswrapper[7465]: I0320 08:37:18.050837 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:37:18.050883 master-0 kubenswrapper[7465]: I0320 08:37:18.050860 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-var-lock\") pod \"installer-2-master-0\" (UID: \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:37:18.152952 master-0 kubenswrapper[7465]: I0320 08:37:18.152869 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-kube-api-access\") pod \"installer-2-master-0\" (UID: \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:37:18.152952 master-0 kubenswrapper[7465]: I0320 08:37:18.152946 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:37:18.153336 master-0 kubenswrapper[7465]: I0320 08:37:18.152981 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-var-lock\") pod \"installer-2-master-0\" (UID: \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:37:18.153336 master-0 kubenswrapper[7465]: I0320 08:37:18.153101 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-var-lock\") pod \"installer-2-master-0\" (UID: \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:37:18.153336 master-0 kubenswrapper[7465]: I0320 08:37:18.153258 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:37:19.182900 master-0 kubenswrapper[7465]: I0320 08:37:19.182835 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 20 08:37:19.997007 master-0 kubenswrapper[7465]: I0320 08:37:19.996929 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-gj4pm" event={"ID":"4ddac301-a604-4f07-8849-5928befd336e","Type":"ContainerStarted","Data":"e9d5f349b622bea576ae3dd04cdf2c2da1c82af6b9e42a0b5011a9e0e2cc47e6"} Mar 20 08:37:20.401964 master-0 kubenswrapper[7465]: I0320 
08:37:20.401879 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-kube-api-access\") pod \"installer-2-master-0\" (UID: \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:37:20.408145 master-0 kubenswrapper[7465]: I0320 08:37:20.407983 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:37:20.441978 master-0 kubenswrapper[7465]: I0320 08:37:20.441892 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 20 08:37:20.609267 master-0 kubenswrapper[7465]: I0320 08:37:20.608390 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv"] Mar 20 08:37:20.609518 master-0 kubenswrapper[7465]: I0320 08:37:20.609485 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv"] Mar 20 08:37:20.612241 master-0 kubenswrapper[7465]: I0320 08:37:20.609618 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" Mar 20 08:37:20.616746 master-0 kubenswrapper[7465]: I0320 08:37:20.616682 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 08:37:20.751533 master-0 kubenswrapper[7465]: I0320 08:37:20.751259 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htv9s\" (UniqueName: \"kubernetes.io/projected/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-kube-api-access-htv9s\") pod \"control-plane-machine-set-operator-6f97756bc8-7t5qv\" (UID: \"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" Mar 20 08:37:20.751533 master-0 kubenswrapper[7465]: I0320 08:37:20.751363 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-7t5qv\" (UID: \"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" Mar 20 08:37:20.831324 master-0 kubenswrapper[7465]: I0320 08:37:20.830279 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6486d766f9-5b77h"] Mar 20 08:37:20.858255 master-0 kubenswrapper[7465]: I0320 08:37:20.856346 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-7t5qv\" (UID: \"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" Mar 20 08:37:20.858255 master-0 kubenswrapper[7465]: I0320 08:37:20.856511 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htv9s\" (UniqueName: \"kubernetes.io/projected/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-kube-api-access-htv9s\") pod \"control-plane-machine-set-operator-6f97756bc8-7t5qv\" (UID: \"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" Mar 20 08:37:20.865944 master-0 kubenswrapper[7465]: I0320 08:37:20.865895 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-7t5qv\" (UID: \"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" Mar 20 08:37:20.874623 master-0 kubenswrapper[7465]: I0320 08:37:20.874574 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"] Mar 20 08:37:20.908655 master-0 kubenswrapper[7465]: I0320 08:37:20.908573 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htv9s\" (UniqueName: \"kubernetes.io/projected/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-kube-api-access-htv9s\") pod \"control-plane-machine-set-operator-6f97756bc8-7t5qv\" (UID: \"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" Mar 20 08:37:20.960608 master-0 kubenswrapper[7465]: I0320 08:37:20.960468 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" Mar 20 08:37:21.007313 master-0 kubenswrapper[7465]: I0320 08:37:21.007236 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-gj4pm" event={"ID":"4ddac301-a604-4f07-8849-5928befd336e","Type":"ContainerStarted","Data":"1ceab06d6d63f112c8a02eecb5b4790818231a1b5a5eaba30ed3842eed1cfc03"} Mar 20 08:37:21.009914 master-0 kubenswrapper[7465]: I0320 08:37:21.009533 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"fac672fa-7660-449e-a0d1-244dc6282d76","Type":"ContainerStarted","Data":"49b715d08715612464f503c3f66bf7c99b13a5e872383e023e27eea30084adb2"} Mar 20 08:37:21.012364 master-0 kubenswrapper[7465]: I0320 08:37:21.012294 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"e1d21f11-7386-4a04-a82e-5a03f3602a3b","Type":"ContainerStarted","Data":"bf471cdb978763d680a893df02a2a47dbe930e97fc0ccb05e480229f6feda593"} Mar 20 08:37:21.013961 master-0 kubenswrapper[7465]: I0320 08:37:21.013780 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v" event={"ID":"e180bf9a-03f7-405b-90c3-b2e46008213e","Type":"ContainerStarted","Data":"18fb1fff91a4cce821d193a03b4576ce661c211b770f644f35eae3cf19c8da77"} Mar 20 08:37:21.014365 master-0 kubenswrapper[7465]: I0320 08:37:21.014340 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v" Mar 20 08:37:21.023752 master-0 kubenswrapper[7465]: I0320 08:37:21.023705 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v" Mar 20 08:37:21.061498 master-0 kubenswrapper[7465]: 
I0320 08:37:21.061405 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-gj4pm" podStartSLOduration=7.061376425 podStartE2EDuration="7.061376425s" podCreationTimestamp="2026-03-20 08:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:37:21.057952235 +0000 UTC m=+66.701267745" watchObservedRunningTime="2026-03-20 08:37:21.061376425 +0000 UTC m=+66.704691915" Mar 20 08:37:21.095878 master-0 kubenswrapper[7465]: I0320 08:37:21.091287 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=13.091265187 podStartE2EDuration="13.091265187s" podCreationTimestamp="2026-03-20 08:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:37:21.090631338 +0000 UTC m=+66.733946828" watchObservedRunningTime="2026-03-20 08:37:21.091265187 +0000 UTC m=+66.734580677" Mar 20 08:37:21.133328 master-0 kubenswrapper[7465]: I0320 08:37:21.133224 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v" podStartSLOduration=8.005504029 podStartE2EDuration="21.13317874s" podCreationTimestamp="2026-03-20 08:37:00 +0000 UTC" firstStartedPulling="2026-03-20 08:37:06.280595782 +0000 UTC m=+51.923911272" lastFinishedPulling="2026-03-20 08:37:19.408270483 +0000 UTC m=+65.051585983" observedRunningTime="2026-03-20 08:37:21.132399547 +0000 UTC m=+66.775715037" watchObservedRunningTime="2026-03-20 08:37:21.13317874 +0000 UTC m=+66.776494230" Mar 20 08:37:22.022025 master-0 kubenswrapper[7465]: I0320 08:37:22.021105 7465 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v" podUID="e180bf9a-03f7-405b-90c3-b2e46008213e" containerName="route-controller-manager" containerID="cri-o://18fb1fff91a4cce821d193a03b4576ce661c211b770f644f35eae3cf19c8da77" gracePeriod=30 Mar 20 08:37:22.423761 master-0 kubenswrapper[7465]: I0320 08:37:22.423124 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758"] Mar 20 08:37:22.424431 master-0 kubenswrapper[7465]: I0320 08:37:22.424362 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:22.428889 master-0 kubenswrapper[7465]: I0320 08:37:22.428745 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 08:37:22.429160 master-0 kubenswrapper[7465]: I0320 08:37:22.429117 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 08:37:22.429335 master-0 kubenswrapper[7465]: I0320 08:37:22.429298 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 08:37:22.429802 master-0 kubenswrapper[7465]: I0320 08:37:22.429763 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 08:37:22.429983 master-0 kubenswrapper[7465]: I0320 08:37:22.429954 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 08:37:22.583687 master-0 kubenswrapper[7465]: I0320 08:37:22.583626 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/4117cb69-45cf-4966-82b6-a31340c7db11-machine-approver-tls\") pod \"machine-approver-6cb57bb5db-mt758\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:22.583973 master-0 kubenswrapper[7465]: I0320 08:37:22.583806 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bwxq\" (UniqueName: \"kubernetes.io/projected/4117cb69-45cf-4966-82b6-a31340c7db11-kube-api-access-9bwxq\") pod \"machine-approver-6cb57bb5db-mt758\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:22.583973 master-0 kubenswrapper[7465]: I0320 08:37:22.583864 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4117cb69-45cf-4966-82b6-a31340c7db11-auth-proxy-config\") pod \"machine-approver-6cb57bb5db-mt758\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:22.583973 master-0 kubenswrapper[7465]: I0320 08:37:22.583894 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4117cb69-45cf-4966-82b6-a31340c7db11-config\") pod \"machine-approver-6cb57bb5db-mt758\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:22.685696 master-0 kubenswrapper[7465]: I0320 08:37:22.685547 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4117cb69-45cf-4966-82b6-a31340c7db11-auth-proxy-config\") pod \"machine-approver-6cb57bb5db-mt758\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") " 
pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:22.685696 master-0 kubenswrapper[7465]: I0320 08:37:22.685624 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4117cb69-45cf-4966-82b6-a31340c7db11-config\") pod \"machine-approver-6cb57bb5db-mt758\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:22.685696 master-0 kubenswrapper[7465]: I0320 08:37:22.685669 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4117cb69-45cf-4966-82b6-a31340c7db11-machine-approver-tls\") pod \"machine-approver-6cb57bb5db-mt758\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:22.685696 master-0 kubenswrapper[7465]: I0320 08:37:22.685703 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bwxq\" (UniqueName: \"kubernetes.io/projected/4117cb69-45cf-4966-82b6-a31340c7db11-kube-api-access-9bwxq\") pod \"machine-approver-6cb57bb5db-mt758\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:22.687073 master-0 kubenswrapper[7465]: I0320 08:37:22.687043 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4117cb69-45cf-4966-82b6-a31340c7db11-auth-proxy-config\") pod \"machine-approver-6cb57bb5db-mt758\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:22.687748 master-0 kubenswrapper[7465]: I0320 08:37:22.687536 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4117cb69-45cf-4966-82b6-a31340c7db11-config\") pod \"machine-approver-6cb57bb5db-mt758\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:22.696679 master-0 kubenswrapper[7465]: I0320 08:37:22.696626 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4117cb69-45cf-4966-82b6-a31340c7db11-machine-approver-tls\") pod \"machine-approver-6cb57bb5db-mt758\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:22.706481 master-0 kubenswrapper[7465]: I0320 08:37:22.706399 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bwxq\" (UniqueName: \"kubernetes.io/projected/4117cb69-45cf-4966-82b6-a31340c7db11-kube-api-access-9bwxq\") pod \"machine-approver-6cb57bb5db-mt758\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:22.747815 master-0 kubenswrapper[7465]: I0320 08:37:22.747542 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:37:25.819162 master-0 kubenswrapper[7465]: I0320 08:37:25.818900 7465 patch_prober.go:28] interesting pod/route-controller-manager-7ffc895647-6j97v container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.51:8443/healthz\": dial tcp 10.128.0.51:8443: connect: connection refused" start-of-body= Mar 20 08:37:25.819945 master-0 kubenswrapper[7465]: I0320 08:37:25.819238 7465 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v" podUID="e180bf9a-03f7-405b-90c3-b2e46008213e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.51:8443/healthz\": dial tcp 10.128.0.51:8443: connect: connection refused" Mar 20 08:37:25.855313 master-0 kubenswrapper[7465]: I0320 08:37:25.852758 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh"] Mar 20 08:37:25.861166 master-0 kubenswrapper[7465]: I0320 08:37:25.857863 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:37:25.866224 master-0 kubenswrapper[7465]: I0320 08:37:25.864231 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 20 08:37:25.866224 master-0 kubenswrapper[7465]: I0320 08:37:25.864360 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 20 08:37:25.867630 master-0 kubenswrapper[7465]: I0320 08:37:25.866632 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 20 08:37:25.867630 master-0 kubenswrapper[7465]: I0320 08:37:25.867357 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 20 08:37:25.897903 master-0 kubenswrapper[7465]: I0320 08:37:25.887469 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh"] Mar 20 08:37:25.938872 master-0 kubenswrapper[7465]: I0320 08:37:25.938796 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dtbl\" (UniqueName: \"kubernetes.io/projected/469183dd-dc54-467d-82a1-611132ae8ec4-kube-api-access-8dtbl\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:37:25.939166 master-0 kubenswrapper[7465]: I0320 08:37:25.938894 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/469183dd-dc54-467d-82a1-611132ae8ec4-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: 
\"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:37:25.939166 master-0 kubenswrapper[7465]: I0320 08:37:25.938954 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/469183dd-dc54-467d-82a1-611132ae8ec4-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:37:25.951884 master-0 kubenswrapper[7465]: I0320 08:37:25.951813 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx"] Mar 20 08:37:25.953253 master-0 kubenswrapper[7465]: I0320 08:37:25.953236 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" Mar 20 08:37:25.961277 master-0 kubenswrapper[7465]: I0320 08:37:25.957737 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 08:37:25.961277 master-0 kubenswrapper[7465]: I0320 08:37:25.957951 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 08:37:25.961277 master-0 kubenswrapper[7465]: I0320 08:37:25.958091 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 08:37:25.972428 master-0 kubenswrapper[7465]: I0320 08:37:25.972367 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx"] Mar 20 08:37:26.040976 master-0 kubenswrapper[7465]: I0320 08:37:26.040904 7465 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-mwfgx\" (UID: \"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" Mar 20 08:37:26.041254 master-0 kubenswrapper[7465]: I0320 08:37:26.041020 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/469183dd-dc54-467d-82a1-611132ae8ec4-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:37:26.041335 master-0 kubenswrapper[7465]: I0320 08:37:26.041268 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dtbl\" (UniqueName: \"kubernetes.io/projected/469183dd-dc54-467d-82a1-611132ae8ec4-kube-api-access-8dtbl\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:37:26.041404 master-0 kubenswrapper[7465]: I0320 08:37:26.041381 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkccn\" (UniqueName: \"kubernetes.io/projected/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-kube-api-access-gkccn\") pod \"cluster-samples-operator-85f7577d78-mwfgx\" (UID: \"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" Mar 20 08:37:26.041565 master-0 kubenswrapper[7465]: I0320 08:37:26.041540 7465 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/469183dd-dc54-467d-82a1-611132ae8ec4-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:37:26.048493 master-0 kubenswrapper[7465]: I0320 08:37:26.048438 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/469183dd-dc54-467d-82a1-611132ae8ec4-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:37:26.048917 master-0 kubenswrapper[7465]: I0320 08:37:26.048883 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/469183dd-dc54-467d-82a1-611132ae8ec4-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:37:26.061859 master-0 kubenswrapper[7465]: I0320 08:37:26.061808 7465 generic.go:334] "Generic (PLEG): container finished" podID="e180bf9a-03f7-405b-90c3-b2e46008213e" containerID="18fb1fff91a4cce821d193a03b4576ce661c211b770f644f35eae3cf19c8da77" exitCode=0 Mar 20 08:37:26.062225 master-0 kubenswrapper[7465]: I0320 08:37:26.061902 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v" event={"ID":"e180bf9a-03f7-405b-90c3-b2e46008213e","Type":"ContainerDied","Data":"18fb1fff91a4cce821d193a03b4576ce661c211b770f644f35eae3cf19c8da77"} Mar 20 08:37:26.062690 master-0 kubenswrapper[7465]: I0320 08:37:26.062610 7465 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8dtbl\" (UniqueName: \"kubernetes.io/projected/469183dd-dc54-467d-82a1-611132ae8ec4-kube-api-access-8dtbl\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:37:26.143366 master-0 kubenswrapper[7465]: I0320 08:37:26.143033 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkccn\" (UniqueName: \"kubernetes.io/projected/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-kube-api-access-gkccn\") pod \"cluster-samples-operator-85f7577d78-mwfgx\" (UID: \"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" Mar 20 08:37:26.143366 master-0 kubenswrapper[7465]: I0320 08:37:26.143220 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-mwfgx\" (UID: \"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" Mar 20 08:37:26.159861 master-0 kubenswrapper[7465]: I0320 08:37:26.159802 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-mwfgx\" (UID: \"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" Mar 20 08:37:26.160446 master-0 kubenswrapper[7465]: I0320 08:37:26.160416 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkccn\" (UniqueName: 
\"kubernetes.io/projected/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-kube-api-access-gkccn\") pod \"cluster-samples-operator-85f7577d78-mwfgx\" (UID: \"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" Mar 20 08:37:26.186853 master-0 kubenswrapper[7465]: I0320 08:37:26.186691 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:37:26.274864 master-0 kubenswrapper[7465]: I0320 08:37:26.274803 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" Mar 20 08:37:27.617656 master-0 kubenswrapper[7465]: I0320 08:37:27.617533 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7"] Mar 20 08:37:27.620649 master-0 kubenswrapper[7465]: I0320 08:37:27.618676 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" Mar 20 08:37:27.620952 master-0 kubenswrapper[7465]: I0320 08:37:27.620916 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 20 08:37:27.621622 master-0 kubenswrapper[7465]: I0320 08:37:27.621599 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 20 08:37:27.636360 master-0 kubenswrapper[7465]: I0320 08:37:27.635368 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7"] Mar 20 08:37:27.781677 master-0 kubenswrapper[7465]: I0320 08:37:27.781304 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsht7\" (UniqueName: \"kubernetes.io/projected/21bebade-17fa-444e-92a9-eea53d6cd673-kube-api-access-zsht7\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" Mar 20 08:37:27.781677 master-0 kubenswrapper[7465]: I0320 08:37:27.781399 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21bebade-17fa-444e-92a9-eea53d6cd673-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" Mar 20 08:37:27.781677 master-0 kubenswrapper[7465]: I0320 08:37:27.781443 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21bebade-17fa-444e-92a9-eea53d6cd673-cert\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: 
\"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" Mar 20 08:37:27.883913 master-0 kubenswrapper[7465]: I0320 08:37:27.882737 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsht7\" (UniqueName: \"kubernetes.io/projected/21bebade-17fa-444e-92a9-eea53d6cd673-kube-api-access-zsht7\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" Mar 20 08:37:27.883913 master-0 kubenswrapper[7465]: I0320 08:37:27.882843 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21bebade-17fa-444e-92a9-eea53d6cd673-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" Mar 20 08:37:27.883913 master-0 kubenswrapper[7465]: I0320 08:37:27.882888 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21bebade-17fa-444e-92a9-eea53d6cd673-cert\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" Mar 20 08:37:27.885023 master-0 kubenswrapper[7465]: I0320 08:37:27.884983 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21bebade-17fa-444e-92a9-eea53d6cd673-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" Mar 20 08:37:27.899919 master-0 kubenswrapper[7465]: I0320 08:37:27.899860 7465 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21bebade-17fa-444e-92a9-eea53d6cd673-cert\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" Mar 20 08:37:27.907439 master-0 kubenswrapper[7465]: I0320 08:37:27.907367 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsht7\" (UniqueName: \"kubernetes.io/projected/21bebade-17fa-444e-92a9-eea53d6cd673-kube-api-access-zsht7\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" Mar 20 08:37:27.941124 master-0 kubenswrapper[7465]: I0320 08:37:27.941050 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" Mar 20 08:37:28.070535 master-0 kubenswrapper[7465]: I0320 08:37:28.070484 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-mvfn5"] Mar 20 08:37:28.071209 master-0 kubenswrapper[7465]: I0320 08:37:28.071169 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.074249 master-0 kubenswrapper[7465]: I0320 08:37:28.074209 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 20 08:37:28.074503 master-0 kubenswrapper[7465]: I0320 08:37:28.074480 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 20 08:37:28.075308 master-0 kubenswrapper[7465]: I0320 08:37:28.075276 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 20 08:37:28.075522 master-0 kubenswrapper[7465]: I0320 08:37:28.075493 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 20 08:37:28.088154 master-0 kubenswrapper[7465]: I0320 08:37:28.088096 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-mvfn5"] Mar 20 08:37:28.092651 master-0 kubenswrapper[7465]: I0320 08:37:28.092629 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 20 08:37:28.186944 master-0 kubenswrapper[7465]: I0320 08:37:28.186784 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f2217de0-7805-4f5f-8ea5-93b81b7e0236-snapshots\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.186944 master-0 kubenswrapper[7465]: I0320 08:37:28.186874 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2217de0-7805-4f5f-8ea5-93b81b7e0236-serving-cert\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: 
\"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.186944 master-0 kubenswrapper[7465]: I0320 08:37:28.186900 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.186944 master-0 kubenswrapper[7465]: I0320 08:37:28.186925 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.187276 master-0 kubenswrapper[7465]: I0320 08:37:28.186956 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82x7p\" (UniqueName: \"kubernetes.io/projected/f2217de0-7805-4f5f-8ea5-93b81b7e0236-kube-api-access-82x7p\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.288426 master-0 kubenswrapper[7465]: I0320 08:37:28.288320 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f2217de0-7805-4f5f-8ea5-93b81b7e0236-snapshots\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.288711 master-0 kubenswrapper[7465]: I0320 08:37:28.288445 7465 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2217de0-7805-4f5f-8ea5-93b81b7e0236-serving-cert\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.288711 master-0 kubenswrapper[7465]: I0320 08:37:28.288493 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.288711 master-0 kubenswrapper[7465]: I0320 08:37:28.288538 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.288711 master-0 kubenswrapper[7465]: I0320 08:37:28.288593 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82x7p\" (UniqueName: \"kubernetes.io/projected/f2217de0-7805-4f5f-8ea5-93b81b7e0236-kube-api-access-82x7p\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.291613 master-0 kubenswrapper[7465]: I0320 08:37:28.289311 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f2217de0-7805-4f5f-8ea5-93b81b7e0236-snapshots\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " 
pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.291613 master-0 kubenswrapper[7465]: I0320 08:37:28.290558 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.291613 master-0 kubenswrapper[7465]: I0320 08:37:28.290909 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.297948 master-0 kubenswrapper[7465]: I0320 08:37:28.297900 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2217de0-7805-4f5f-8ea5-93b81b7e0236-serving-cert\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.649219 master-0 kubenswrapper[7465]: I0320 08:37:28.648730 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82x7p\" (UniqueName: \"kubernetes.io/projected/f2217de0-7805-4f5f-8ea5-93b81b7e0236-kube-api-access-82x7p\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:28.698157 master-0 kubenswrapper[7465]: I0320 08:37:28.697944 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:37:29.925782 master-0 kubenswrapper[7465]: I0320 08:37:29.924833 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29"] Mar 20 08:37:29.926997 master-0 kubenswrapper[7465]: I0320 08:37:29.926931 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:29.929219 master-0 kubenswrapper[7465]: I0320 08:37:29.929152 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 20 08:37:29.931846 master-0 kubenswrapper[7465]: I0320 08:37:29.929864 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 20 08:37:29.931846 master-0 kubenswrapper[7465]: I0320 08:37:29.930168 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 20 08:37:29.931846 master-0 kubenswrapper[7465]: I0320 08:37:29.930310 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 20 08:37:29.969037 master-0 kubenswrapper[7465]: I0320 08:37:29.968943 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 08:37:30.042244 master-0 kubenswrapper[7465]: I0320 08:37:30.041849 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75e80b57-a0f3-4f6a-a022-457944c8f59b-auth-proxy-config\") pod 
\"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.042244 master-0 kubenswrapper[7465]: I0320 08:37:30.041914 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvf9w\" (UniqueName: \"kubernetes.io/projected/75e80b57-a0f3-4f6a-a022-457944c8f59b-kube-api-access-dvf9w\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.042244 master-0 kubenswrapper[7465]: I0320 08:37:30.041943 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/75e80b57-a0f3-4f6a-a022-457944c8f59b-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.042244 master-0 kubenswrapper[7465]: I0320 08:37:30.041978 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/75e80b57-a0f3-4f6a-a022-457944c8f59b-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.042244 master-0 kubenswrapper[7465]: I0320 08:37:30.042010 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/75e80b57-a0f3-4f6a-a022-457944c8f59b-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.079358 master-0 kubenswrapper[7465]: I0320 08:37:30.079297 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn"] Mar 20 08:37:30.089109 master-0 kubenswrapper[7465]: I0320 08:37:30.088978 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:30.092844 master-0 kubenswrapper[7465]: I0320 08:37:30.092773 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 08:37:30.094678 master-0 kubenswrapper[7465]: I0320 08:37:30.093225 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 08:37:30.094678 master-0 kubenswrapper[7465]: I0320 08:37:30.093399 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 08:37:30.102739 master-0 kubenswrapper[7465]: I0320 08:37:30.100526 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn"] Mar 20 08:37:30.144059 master-0 kubenswrapper[7465]: I0320 08:37:30.143742 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plc2q\" (UniqueName: \"kubernetes.io/projected/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-kube-api-access-plc2q\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:30.144197 
master-0 kubenswrapper[7465]: I0320 08:37:30.144154 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-images\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:30.144266 master-0 kubenswrapper[7465]: I0320 08:37:30.144216 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75e80b57-a0f3-4f6a-a022-457944c8f59b-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.144266 master-0 kubenswrapper[7465]: I0320 08:37:30.144248 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvf9w\" (UniqueName: \"kubernetes.io/projected/75e80b57-a0f3-4f6a-a022-457944c8f59b-kube-api-access-dvf9w\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.144367 master-0 kubenswrapper[7465]: I0320 08:37:30.144321 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/75e80b57-a0f3-4f6a-a022-457944c8f59b-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.144367 master-0 kubenswrapper[7465]: I0320 08:37:30.144348 7465 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-config\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:30.144955 master-0 kubenswrapper[7465]: I0320 08:37:30.144826 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/75e80b57-a0f3-4f6a-a022-457944c8f59b-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.145029 master-0 kubenswrapper[7465]: I0320 08:37:30.144972 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:30.145124 master-0 kubenswrapper[7465]: I0320 08:37:30.145088 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/75e80b57-a0f3-4f6a-a022-457944c8f59b-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.145280 master-0 kubenswrapper[7465]: I0320 08:37:30.145225 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/75e80b57-a0f3-4f6a-a022-457944c8f59b-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.145374 master-0 kubenswrapper[7465]: I0320 08:37:30.145329 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/75e80b57-a0f3-4f6a-a022-457944c8f59b-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.145970 master-0 kubenswrapper[7465]: I0320 08:37:30.145895 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75e80b57-a0f3-4f6a-a022-457944c8f59b-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.148645 master-0 kubenswrapper[7465]: I0320 08:37:30.148597 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/75e80b57-a0f3-4f6a-a022-457944c8f59b-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.166855 master-0 kubenswrapper[7465]: I0320 08:37:30.166775 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dvf9w\" (UniqueName: \"kubernetes.io/projected/75e80b57-a0f3-4f6a-a022-457944c8f59b-kube-api-access-dvf9w\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-xhq29\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.246555 master-0 kubenswrapper[7465]: I0320 08:37:30.246488 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-images\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:30.246761 master-0 kubenswrapper[7465]: I0320 08:37:30.246565 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-config\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:30.246761 master-0 kubenswrapper[7465]: I0320 08:37:30.246601 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:30.246761 master-0 kubenswrapper[7465]: I0320 08:37:30.246652 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plc2q\" (UniqueName: \"kubernetes.io/projected/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-kube-api-access-plc2q\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: 
\"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:30.249693 master-0 kubenswrapper[7465]: I0320 08:37:30.249652 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-config\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:30.251373 master-0 kubenswrapper[7465]: I0320 08:37:30.251337 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:30.252170 master-0 kubenswrapper[7465]: I0320 08:37:30.251980 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-images\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:30.270968 master-0 kubenswrapper[7465]: I0320 08:37:30.270879 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plc2q\" (UniqueName: \"kubernetes.io/projected/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-kube-api-access-plc2q\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:30.308403 master-0 kubenswrapper[7465]: I0320 08:37:30.308301 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:37:30.448010 master-0 kubenswrapper[7465]: I0320 08:37:30.447848 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:37:34.534958 master-0 kubenswrapper[7465]: I0320 08:37:34.534870 7465 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 20 08:37:34.535488 master-0 kubenswrapper[7465]: I0320 08:37:34.535382 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" containerID="cri-o://6a321a6b1056a105feaf6a80afaf577bd08c544abd875f2fd35678bf25e3bfcd" gracePeriod=30 Mar 20 08:37:34.536819 master-0 kubenswrapper[7465]: I0320 08:37:34.535436 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" containerID="cri-o://ac50f53bc5c085a17ffb0495207fc3be5a9d394ca5e56961969c0592f7f25b0d" gracePeriod=30 Mar 20 08:37:34.580824 master-0 kubenswrapper[7465]: I0320 08:37:34.580648 7465 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 20 08:37:34.580923 master-0 kubenswrapper[7465]: E0320 08:37:34.580894 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" Mar 20 08:37:34.580923 master-0 kubenswrapper[7465]: I0320 08:37:34.580910 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" Mar 20 08:37:34.580988 master-0 kubenswrapper[7465]: E0320 08:37:34.580940 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d664a6d0d2a24360dee10612610f1b59" 
containerName="etcd" Mar 20 08:37:34.580988 master-0 kubenswrapper[7465]: I0320 08:37:34.580948 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" Mar 20 08:37:34.581097 master-0 kubenswrapper[7465]: I0320 08:37:34.581076 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" Mar 20 08:37:34.581097 master-0 kubenswrapper[7465]: I0320 08:37:34.581094 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" Mar 20 08:37:34.583069 master-0 kubenswrapper[7465]: I0320 08:37:34.583037 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.620740 master-0 kubenswrapper[7465]: I0320 08:37:34.620644 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.620951 master-0 kubenswrapper[7465]: I0320 08:37:34.620777 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.620951 master-0 kubenswrapper[7465]: I0320 08:37:34.620847 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.621273 master-0 kubenswrapper[7465]: I0320 
08:37:34.621013 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.621527 master-0 kubenswrapper[7465]: I0320 08:37:34.621499 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.621579 master-0 kubenswrapper[7465]: I0320 08:37:34.621543 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.723418 master-0 kubenswrapper[7465]: I0320 08:37:34.723263 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.723418 master-0 kubenswrapper[7465]: I0320 08:37:34.723341 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.723658 master-0 kubenswrapper[7465]: I0320 08:37:34.723480 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.723658 master-0 kubenswrapper[7465]: I0320 08:37:34.723540 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.723658 master-0 kubenswrapper[7465]: I0320 08:37:34.723650 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.723791 master-0 kubenswrapper[7465]: I0320 08:37:34.723683 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.723791 master-0 kubenswrapper[7465]: I0320 08:37:34.723722 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.723791 master-0 kubenswrapper[7465]: I0320 08:37:34.723755 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 
08:37:34.723875 master-0 kubenswrapper[7465]: I0320 08:37:34.723814 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.723952 master-0 kubenswrapper[7465]: I0320 08:37:34.723926 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.724163 master-0 kubenswrapper[7465]: I0320 08:37:34.724140 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:34.724294 master-0 kubenswrapper[7465]: I0320 08:37:34.724206 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:37:36.535261 master-0 kubenswrapper[7465]: I0320 08:37:36.534341 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v" Mar 20 08:37:36.550158 master-0 kubenswrapper[7465]: I0320 08:37:36.550093 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e180bf9a-03f7-405b-90c3-b2e46008213e-config\") pod \"e180bf9a-03f7-405b-90c3-b2e46008213e\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " Mar 20 08:37:36.550577 master-0 kubenswrapper[7465]: I0320 08:37:36.550303 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk7cs\" (UniqueName: \"kubernetes.io/projected/e180bf9a-03f7-405b-90c3-b2e46008213e-kube-api-access-zk7cs\") pod \"e180bf9a-03f7-405b-90c3-b2e46008213e\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " Mar 20 08:37:36.550577 master-0 kubenswrapper[7465]: I0320 08:37:36.550511 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e180bf9a-03f7-405b-90c3-b2e46008213e-serving-cert\") pod \"e180bf9a-03f7-405b-90c3-b2e46008213e\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " Mar 20 08:37:36.550577 master-0 kubenswrapper[7465]: I0320 08:37:36.550548 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e180bf9a-03f7-405b-90c3-b2e46008213e-client-ca\") pod \"e180bf9a-03f7-405b-90c3-b2e46008213e\" (UID: \"e180bf9a-03f7-405b-90c3-b2e46008213e\") " Mar 20 08:37:36.551156 master-0 kubenswrapper[7465]: I0320 08:37:36.551114 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e180bf9a-03f7-405b-90c3-b2e46008213e-client-ca" (OuterVolumeSpecName: "client-ca") pod "e180bf9a-03f7-405b-90c3-b2e46008213e" (UID: "e180bf9a-03f7-405b-90c3-b2e46008213e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:36.551276 master-0 kubenswrapper[7465]: I0320 08:37:36.551161 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e180bf9a-03f7-405b-90c3-b2e46008213e-config" (OuterVolumeSpecName: "config") pod "e180bf9a-03f7-405b-90c3-b2e46008213e" (UID: "e180bf9a-03f7-405b-90c3-b2e46008213e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:36.551520 master-0 kubenswrapper[7465]: I0320 08:37:36.551485 7465 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e180bf9a-03f7-405b-90c3-b2e46008213e-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:36.551571 master-0 kubenswrapper[7465]: I0320 08:37:36.551554 7465 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e180bf9a-03f7-405b-90c3-b2e46008213e-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:36.553853 master-0 kubenswrapper[7465]: I0320 08:37:36.553790 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e180bf9a-03f7-405b-90c3-b2e46008213e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e180bf9a-03f7-405b-90c3-b2e46008213e" (UID: "e180bf9a-03f7-405b-90c3-b2e46008213e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:37:36.554993 master-0 kubenswrapper[7465]: I0320 08:37:36.554940 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e180bf9a-03f7-405b-90c3-b2e46008213e-kube-api-access-zk7cs" (OuterVolumeSpecName: "kube-api-access-zk7cs") pod "e180bf9a-03f7-405b-90c3-b2e46008213e" (UID: "e180bf9a-03f7-405b-90c3-b2e46008213e"). InnerVolumeSpecName "kube-api-access-zk7cs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:36.595742 master-0 kubenswrapper[7465]: I0320 08:37:36.595698 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_8682b669-c173-4b96-80f6-029292f5c25b/installer/0.log" Mar 20 08:37:36.595847 master-0 kubenswrapper[7465]: I0320 08:37:36.595763 7465 generic.go:334] "Generic (PLEG): container finished" podID="8682b669-c173-4b96-80f6-029292f5c25b" containerID="7dcd8a99fc665575842ace97eb1ad322b466b3c9cd44a9d54ad2389140cfb2bc" exitCode=1 Mar 20 08:37:36.595847 master-0 kubenswrapper[7465]: I0320 08:37:36.595836 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"8682b669-c173-4b96-80f6-029292f5c25b","Type":"ContainerDied","Data":"7dcd8a99fc665575842ace97eb1ad322b466b3c9cd44a9d54ad2389140cfb2bc"} Mar 20 08:37:36.597507 master-0 kubenswrapper[7465]: I0320 08:37:36.597476 7465 generic.go:334] "Generic (PLEG): container finished" podID="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" containerID="b2c7cbe5708ed7a3530e1dc35eccab2ac0970444664ce50722925f65c5f61474" exitCode=0 Mar 20 08:37:36.597571 master-0 kubenswrapper[7465]: I0320 08:37:36.597524 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" event={"ID":"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6","Type":"ContainerDied","Data":"b2c7cbe5708ed7a3530e1dc35eccab2ac0970444664ce50722925f65c5f61474"} Mar 20 08:37:36.598207 master-0 kubenswrapper[7465]: I0320 08:37:36.598154 7465 scope.go:117] "RemoveContainer" containerID="b2c7cbe5708ed7a3530e1dc35eccab2ac0970444664ce50722925f65c5f61474" Mar 20 08:37:36.604619 master-0 kubenswrapper[7465]: I0320 08:37:36.604560 7465 generic.go:334] "Generic (PLEG): container finished" podID="c1854ea4-c8e2-4289-84b6-1f18b2ac684f" containerID="f44e15f6f5dbc67eb3bd400bcba69ac1adc2aa6d2546e0cbc2778889121d3bcf" exitCode=0 Mar 20 
08:37:36.604783 master-0 kubenswrapper[7465]: I0320 08:37:36.604723 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" event={"ID":"c1854ea4-c8e2-4289-84b6-1f18b2ac684f","Type":"ContainerDied","Data":"f44e15f6f5dbc67eb3bd400bcba69ac1adc2aa6d2546e0cbc2778889121d3bcf"} Mar 20 08:37:36.608058 master-0 kubenswrapper[7465]: I0320 08:37:36.605658 7465 scope.go:117] "RemoveContainer" containerID="f44e15f6f5dbc67eb3bd400bcba69ac1adc2aa6d2546e0cbc2778889121d3bcf" Mar 20 08:37:36.609981 master-0 kubenswrapper[7465]: I0320 08:37:36.609943 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_d17f953e-3ca4-4bd5-ad89-678447774687/installer/0.log" Mar 20 08:37:36.610079 master-0 kubenswrapper[7465]: I0320 08:37:36.610032 7465 generic.go:334] "Generic (PLEG): container finished" podID="d17f953e-3ca4-4bd5-ad89-678447774687" containerID="e11ff3b52271d029d42cce254ad48e40793e3cdb6d379d24178958fc78484abf" exitCode=1 Mar 20 08:37:36.610155 master-0 kubenswrapper[7465]: I0320 08:37:36.610116 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"d17f953e-3ca4-4bd5-ad89-678447774687","Type":"ContainerDied","Data":"e11ff3b52271d029d42cce254ad48e40793e3cdb6d379d24178958fc78484abf"} Mar 20 08:37:36.611569 master-0 kubenswrapper[7465]: I0320 08:37:36.611539 7465 generic.go:334] "Generic (PLEG): container finished" podID="75e3e2cc-aa56-41f3-8859-1c086f419d05" containerID="adad78c0e9b98fa52036392a08a46d2f94455f9ec32c4bacb460a8042eb8ee5e" exitCode=0 Mar 20 08:37:36.611631 master-0 kubenswrapper[7465]: I0320 08:37:36.611611 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" 
event={"ID":"75e3e2cc-aa56-41f3-8859-1c086f419d05","Type":"ContainerDied","Data":"adad78c0e9b98fa52036392a08a46d2f94455f9ec32c4bacb460a8042eb8ee5e"} Mar 20 08:37:36.613127 master-0 kubenswrapper[7465]: I0320 08:37:36.613072 7465 scope.go:117] "RemoveContainer" containerID="adad78c0e9b98fa52036392a08a46d2f94455f9ec32c4bacb460a8042eb8ee5e" Mar 20 08:37:36.613368 master-0 kubenswrapper[7465]: I0320 08:37:36.613330 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v" event={"ID":"e180bf9a-03f7-405b-90c3-b2e46008213e","Type":"ContainerDied","Data":"02b2465660b354ff71c8f1178956736d7360fb5bea9412ce65fe2d7c69c27724"} Mar 20 08:37:36.613411 master-0 kubenswrapper[7465]: I0320 08:37:36.613376 7465 scope.go:117] "RemoveContainer" containerID="18fb1fff91a4cce821d193a03b4576ce661c211b770f644f35eae3cf19c8da77" Mar 20 08:37:36.613508 master-0 kubenswrapper[7465]: I0320 08:37:36.613477 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v" Mar 20 08:37:36.653722 master-0 kubenswrapper[7465]: I0320 08:37:36.653653 7465 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e180bf9a-03f7-405b-90c3-b2e46008213e-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:36.653722 master-0 kubenswrapper[7465]: I0320 08:37:36.653710 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk7cs\" (UniqueName: \"kubernetes.io/projected/e180bf9a-03f7-405b-90c3-b2e46008213e-kube-api-access-zk7cs\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:36.820001 master-0 kubenswrapper[7465]: I0320 08:37:36.819894 7465 patch_prober.go:28] interesting pod/route-controller-manager-7ffc895647-6j97v container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:37:36.820337 master-0 kubenswrapper[7465]: I0320 08:37:36.820006 7465 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v" podUID="e180bf9a-03f7-405b-90c3-b2e46008213e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:37:37.604415 master-0 kubenswrapper[7465]: I0320 08:37:37.604350 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:37:38.222840 master-0 kubenswrapper[7465]: I0320 08:37:38.222791 7465 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_8682b669-c173-4b96-80f6-029292f5c25b/installer/0.log" Mar 20 08:37:38.222980 master-0 kubenswrapper[7465]: I0320 08:37:38.222875 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 20 08:37:38.226961 master-0 kubenswrapper[7465]: I0320 08:37:38.226748 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_d17f953e-3ca4-4bd5-ad89-678447774687/installer/0.log" Mar 20 08:37:38.226961 master-0 kubenswrapper[7465]: I0320 08:37:38.226834 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 20 08:37:38.282175 master-0 kubenswrapper[7465]: I0320 08:37:38.279113 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d17f953e-3ca4-4bd5-ad89-678447774687-kubelet-dir\") pod \"d17f953e-3ca4-4bd5-ad89-678447774687\" (UID: \"d17f953e-3ca4-4bd5-ad89-678447774687\") " Mar 20 08:37:38.282175 master-0 kubenswrapper[7465]: I0320 08:37:38.279277 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d17f953e-3ca4-4bd5-ad89-678447774687-kube-api-access\") pod \"d17f953e-3ca4-4bd5-ad89-678447774687\" (UID: \"d17f953e-3ca4-4bd5-ad89-678447774687\") " Mar 20 08:37:38.282175 master-0 kubenswrapper[7465]: I0320 08:37:38.279387 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8682b669-c173-4b96-80f6-029292f5c25b-kubelet-dir\") pod \"8682b669-c173-4b96-80f6-029292f5c25b\" (UID: \"8682b669-c173-4b96-80f6-029292f5c25b\") " Mar 20 08:37:38.282175 master-0 kubenswrapper[7465]: I0320 08:37:38.279423 7465 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8682b669-c173-4b96-80f6-029292f5c25b-kube-api-access\") pod \"8682b669-c173-4b96-80f6-029292f5c25b\" (UID: \"8682b669-c173-4b96-80f6-029292f5c25b\") " Mar 20 08:37:38.282175 master-0 kubenswrapper[7465]: I0320 08:37:38.279453 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8682b669-c173-4b96-80f6-029292f5c25b-var-lock\") pod \"8682b669-c173-4b96-80f6-029292f5c25b\" (UID: \"8682b669-c173-4b96-80f6-029292f5c25b\") " Mar 20 08:37:38.282175 master-0 kubenswrapper[7465]: I0320 08:37:38.279504 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d17f953e-3ca4-4bd5-ad89-678447774687-var-lock\") pod \"d17f953e-3ca4-4bd5-ad89-678447774687\" (UID: \"d17f953e-3ca4-4bd5-ad89-678447774687\") " Mar 20 08:37:38.282175 master-0 kubenswrapper[7465]: I0320 08:37:38.279905 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d17f953e-3ca4-4bd5-ad89-678447774687-var-lock" (OuterVolumeSpecName: "var-lock") pod "d17f953e-3ca4-4bd5-ad89-678447774687" (UID: "d17f953e-3ca4-4bd5-ad89-678447774687"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:38.282175 master-0 kubenswrapper[7465]: I0320 08:37:38.279955 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d17f953e-3ca4-4bd5-ad89-678447774687-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d17f953e-3ca4-4bd5-ad89-678447774687" (UID: "d17f953e-3ca4-4bd5-ad89-678447774687"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:38.286851 master-0 kubenswrapper[7465]: I0320 08:37:38.283796 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8682b669-c173-4b96-80f6-029292f5c25b-var-lock" (OuterVolumeSpecName: "var-lock") pod "8682b669-c173-4b96-80f6-029292f5c25b" (UID: "8682b669-c173-4b96-80f6-029292f5c25b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:38.286851 master-0 kubenswrapper[7465]: I0320 08:37:38.283864 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8682b669-c173-4b96-80f6-029292f5c25b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8682b669-c173-4b96-80f6-029292f5c25b" (UID: "8682b669-c173-4b96-80f6-029292f5c25b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:38.301384 master-0 kubenswrapper[7465]: I0320 08:37:38.301103 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17f953e-3ca4-4bd5-ad89-678447774687-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d17f953e-3ca4-4bd5-ad89-678447774687" (UID: "d17f953e-3ca4-4bd5-ad89-678447774687"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:38.307087 master-0 kubenswrapper[7465]: I0320 08:37:38.307040 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8682b669-c173-4b96-80f6-029292f5c25b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8682b669-c173-4b96-80f6-029292f5c25b" (UID: "8682b669-c173-4b96-80f6-029292f5c25b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:38.382125 master-0 kubenswrapper[7465]: I0320 08:37:38.382069 7465 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d17f953e-3ca4-4bd5-ad89-678447774687-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:38.382125 master-0 kubenswrapper[7465]: I0320 08:37:38.382117 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d17f953e-3ca4-4bd5-ad89-678447774687-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:38.382301 master-0 kubenswrapper[7465]: I0320 08:37:38.382134 7465 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8682b669-c173-4b96-80f6-029292f5c25b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:38.382301 master-0 kubenswrapper[7465]: I0320 08:37:38.382149 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8682b669-c173-4b96-80f6-029292f5c25b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:38.382301 master-0 kubenswrapper[7465]: I0320 08:37:38.382164 7465 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8682b669-c173-4b96-80f6-029292f5c25b-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:38.382301 master-0 kubenswrapper[7465]: I0320 08:37:38.382177 7465 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d17f953e-3ca4-4bd5-ad89-678447774687-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:38.509326 master-0 kubenswrapper[7465]: I0320 08:37:38.502042 7465 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:37:38.639738 master-0 
kubenswrapper[7465]: I0320 08:37:38.639612 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_8682b669-c173-4b96-80f6-029292f5c25b/installer/0.log" Mar 20 08:37:38.639738 master-0 kubenswrapper[7465]: I0320 08:37:38.639697 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"8682b669-c173-4b96-80f6-029292f5c25b","Type":"ContainerDied","Data":"a7a94acdb1ced20f1398af7489166fd2b70ae13922fdab81533d23a1b96c7db0"} Mar 20 08:37:38.639738 master-0 kubenswrapper[7465]: I0320 08:37:38.639748 7465 scope.go:117] "RemoveContainer" containerID="7dcd8a99fc665575842ace97eb1ad322b466b3c9cd44a9d54ad2389140cfb2bc" Mar 20 08:37:38.640443 master-0 kubenswrapper[7465]: I0320 08:37:38.639881 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 20 08:37:38.646554 master-0 kubenswrapper[7465]: I0320 08:37:38.645844 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" event={"ID":"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6","Type":"ContainerStarted","Data":"8009103455f6e5b4ff5637d33a8290182816ff88ad007a0262e9e5800739bab4"} Mar 20 08:37:38.650349 master-0 kubenswrapper[7465]: I0320 08:37:38.647902 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" event={"ID":"75e80b57-a0f3-4f6a-a022-457944c8f59b","Type":"ContainerStarted","Data":"e6c1d5a99612e6d35505b3e74ccfbf34d01a1eaff1e58e1eab8e47b44ad28c82"} Mar 20 08:37:38.650349 master-0 kubenswrapper[7465]: I0320 08:37:38.649447 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" 
event={"ID":"4117cb69-45cf-4966-82b6-a31340c7db11","Type":"ContainerStarted","Data":"87748b853faf38704aba6691ffdb72f7909a87d3516c0b11cedf1d4b870a3219"} Mar 20 08:37:38.651659 master-0 kubenswrapper[7465]: I0320 08:37:38.650840 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_d17f953e-3ca4-4bd5-ad89-678447774687/installer/0.log" Mar 20 08:37:38.651659 master-0 kubenswrapper[7465]: I0320 08:37:38.650963 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"d17f953e-3ca4-4bd5-ad89-678447774687","Type":"ContainerDied","Data":"4e6b04ca634ce1b5886b7da997d2a8569817c1b864a82899979971c78a3d51e1"} Mar 20 08:37:38.651659 master-0 kubenswrapper[7465]: I0320 08:37:38.651002 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 20 08:37:38.655267 master-0 kubenswrapper[7465]: I0320 08:37:38.653175 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" event={"ID":"75e3e2cc-aa56-41f3-8859-1c086f419d05","Type":"ContainerStarted","Data":"9526eea2cea58cb9e28474105457b96211d2f64f5d2c17947ddff373db76ab0b"} Mar 20 08:37:38.656303 master-0 kubenswrapper[7465]: I0320 08:37:38.656087 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h" event={"ID":"590dd533-c8db-42e3-9485-1c9df719773f","Type":"ContainerStarted","Data":"cda82b97cf053cc40d3d7f1f9bbdda52685fddf9f31a465359a5c0f818a7678c"} Mar 20 08:37:38.656303 master-0 kubenswrapper[7465]: I0320 08:37:38.656250 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h" podUID="590dd533-c8db-42e3-9485-1c9df719773f" containerName="controller-manager" 
containerID="cri-o://cda82b97cf053cc40d3d7f1f9bbdda52685fddf9f31a465359a5c0f818a7678c" gracePeriod=30 Mar 20 08:37:38.656646 master-0 kubenswrapper[7465]: I0320 08:37:38.656620 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h" Mar 20 08:37:38.662030 master-0 kubenswrapper[7465]: I0320 08:37:38.661991 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h" Mar 20 08:37:38.768269 master-0 kubenswrapper[7465]: I0320 08:37:38.768227 7465 scope.go:117] "RemoveContainer" containerID="e11ff3b52271d029d42cce254ad48e40793e3cdb6d379d24178958fc78484abf" Mar 20 08:37:39.146695 master-0 kubenswrapper[7465]: I0320 08:37:39.146517 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h" Mar 20 08:37:39.199800 master-0 kubenswrapper[7465]: I0320 08:37:39.199715 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxv95\" (UniqueName: \"kubernetes.io/projected/590dd533-c8db-42e3-9485-1c9df719773f-kube-api-access-wxv95\") pod \"590dd533-c8db-42e3-9485-1c9df719773f\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " Mar 20 08:37:39.202154 master-0 kubenswrapper[7465]: I0320 08:37:39.202088 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-client-ca\") pod \"590dd533-c8db-42e3-9485-1c9df719773f\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " Mar 20 08:37:39.202350 master-0 kubenswrapper[7465]: I0320 08:37:39.202325 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590dd533-c8db-42e3-9485-1c9df719773f-serving-cert\") pod 
\"590dd533-c8db-42e3-9485-1c9df719773f\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " Mar 20 08:37:39.202509 master-0 kubenswrapper[7465]: I0320 08:37:39.202466 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-config\") pod \"590dd533-c8db-42e3-9485-1c9df719773f\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " Mar 20 08:37:39.202509 master-0 kubenswrapper[7465]: I0320 08:37:39.202504 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-proxy-ca-bundles\") pod \"590dd533-c8db-42e3-9485-1c9df719773f\" (UID: \"590dd533-c8db-42e3-9485-1c9df719773f\") " Mar 20 08:37:39.203250 master-0 kubenswrapper[7465]: I0320 08:37:39.203200 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-client-ca" (OuterVolumeSpecName: "client-ca") pod "590dd533-c8db-42e3-9485-1c9df719773f" (UID: "590dd533-c8db-42e3-9485-1c9df719773f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:39.203479 master-0 kubenswrapper[7465]: I0320 08:37:39.203342 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590dd533-c8db-42e3-9485-1c9df719773f-kube-api-access-wxv95" (OuterVolumeSpecName: "kube-api-access-wxv95") pod "590dd533-c8db-42e3-9485-1c9df719773f" (UID: "590dd533-c8db-42e3-9485-1c9df719773f"). InnerVolumeSpecName "kube-api-access-wxv95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:39.203615 master-0 kubenswrapper[7465]: I0320 08:37:39.203576 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-config" (OuterVolumeSpecName: "config") pod "590dd533-c8db-42e3-9485-1c9df719773f" (UID: "590dd533-c8db-42e3-9485-1c9df719773f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:39.203729 master-0 kubenswrapper[7465]: I0320 08:37:39.203699 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "590dd533-c8db-42e3-9485-1c9df719773f" (UID: "590dd533-c8db-42e3-9485-1c9df719773f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:39.205701 master-0 kubenswrapper[7465]: I0320 08:37:39.205651 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590dd533-c8db-42e3-9485-1c9df719773f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "590dd533-c8db-42e3-9485-1c9df719773f" (UID: "590dd533-c8db-42e3-9485-1c9df719773f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:37:39.305088 master-0 kubenswrapper[7465]: I0320 08:37:39.305022 7465 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:39.305088 master-0 kubenswrapper[7465]: I0320 08:37:39.305076 7465 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:39.305088 master-0 kubenswrapper[7465]: I0320 08:37:39.305093 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxv95\" (UniqueName: \"kubernetes.io/projected/590dd533-c8db-42e3-9485-1c9df719773f-kube-api-access-wxv95\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:39.305088 master-0 kubenswrapper[7465]: I0320 08:37:39.305106 7465 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590dd533-c8db-42e3-9485-1c9df719773f-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:39.305572 master-0 kubenswrapper[7465]: I0320 08:37:39.305120 7465 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590dd533-c8db-42e3-9485-1c9df719773f-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:39.667156 master-0 kubenswrapper[7465]: I0320 08:37:39.667008 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" event={"ID":"91b2899e-8d24-41a0-bec8-d11c67b8f955","Type":"ContainerStarted","Data":"239c73fe7acb21fae05105f39e9db825bd1c976521db41e975ce454317c12c28"} Mar 20 08:37:39.669917 master-0 kubenswrapper[7465]: I0320 08:37:39.669881 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" event={"ID":"c1854ea4-c8e2-4289-84b6-1f18b2ac684f","Type":"ContainerStarted","Data":"eedbb1dfd13f24b92d1505673b2418928be1e1bfdd5eb59005a694a899688fee"} Mar 20 08:37:39.675723 master-0 kubenswrapper[7465]: I0320 08:37:39.675661 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j" event={"ID":"47eadda0-35a6-4b5c-a96c-24854be15098","Type":"ContainerStarted","Data":"1dad796a4b96686dc1ca4a32fa60300c9992326d4cb1ceaa47be4941e0d0b81b"} Mar 20 08:37:39.675955 master-0 kubenswrapper[7465]: I0320 08:37:39.675920 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j" Mar 20 08:37:39.678064 master-0 kubenswrapper[7465]: I0320 08:37:39.678019 7465 generic.go:334] "Generic (PLEG): container finished" podID="55cefc56-c008-45c1-a6ac-b1d3c8778c7b" containerID="826e6ad2813ac1102d49808d3d0e9d3cfd04bbcf8b2b1c66206cb30f43e2de58" exitCode=0 Mar 20 08:37:39.678134 master-0 kubenswrapper[7465]: I0320 08:37:39.678090 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86g9n" event={"ID":"55cefc56-c008-45c1-a6ac-b1d3c8778c7b","Type":"ContainerDied","Data":"826e6ad2813ac1102d49808d3d0e9d3cfd04bbcf8b2b1c66206cb30f43e2de58"} Mar 20 08:37:39.687333 master-0 kubenswrapper[7465]: I0320 08:37:39.687282 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j" Mar 20 08:37:39.687766 master-0 kubenswrapper[7465]: I0320 08:37:39.687717 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mgcb9" event={"ID":"06159643-e9b8-417f-ba79-ee1e1ca5c951","Type":"ContainerStarted","Data":"e7321b68cd658b736e2e6b297892a77f8a40c4d3d188ee1bd535ab60a2be4cf0"} Mar 20 
08:37:39.690315 master-0 kubenswrapper[7465]: I0320 08:37:39.690240 7465 generic.go:334] "Generic (PLEG): container finished" podID="956a8ee3-2618-4f04-8d1c-97c3a4595c94" containerID="162d6251581b5398c7cdd12c089739f6900bac67c42e07fd4143959fd82c1770" exitCode=0 Mar 20 08:37:39.690410 master-0 kubenswrapper[7465]: I0320 08:37:39.690318 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srjqw" event={"ID":"956a8ee3-2618-4f04-8d1c-97c3a4595c94","Type":"ContainerDied","Data":"162d6251581b5398c7cdd12c089739f6900bac67c42e07fd4143959fd82c1770"} Mar 20 08:37:39.699297 master-0 kubenswrapper[7465]: I0320 08:37:39.692279 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"fac672fa-7660-449e-a0d1-244dc6282d76","Type":"ContainerStarted","Data":"aecbf33029725426faa2806ba773a548665753d84d9ec4f0ac83ae36cdffa3ce"} Mar 20 08:37:39.699297 master-0 kubenswrapper[7465]: I0320 08:37:39.698573 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" event={"ID":"4117cb69-45cf-4966-82b6-a31340c7db11","Type":"ContainerStarted","Data":"3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1"} Mar 20 08:37:39.702165 master-0 kubenswrapper[7465]: I0320 08:37:39.700842 7465 generic.go:334] "Generic (PLEG): container finished" podID="c649d964-ba32-44e4-a3cb-06a285972d97" containerID="8d44586dc2f9d77b931bacec095df15c2bd3b3ec3048ced5896c388e58dacb0c" exitCode=0 Mar 20 08:37:39.702165 master-0 kubenswrapper[7465]: I0320 08:37:39.700927 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zspn5" event={"ID":"c649d964-ba32-44e4-a3cb-06a285972d97","Type":"ContainerDied","Data":"8d44586dc2f9d77b931bacec095df15c2bd3b3ec3048ced5896c388e58dacb0c"} Mar 20 08:37:39.703341 master-0 kubenswrapper[7465]: I0320 08:37:39.703279 7465 generic.go:334] "Generic (PLEG): 
container finished" podID="590dd533-c8db-42e3-9485-1c9df719773f" containerID="cda82b97cf053cc40d3d7f1f9bbdda52685fddf9f31a465359a5c0f818a7678c" exitCode=0 Mar 20 08:37:39.704289 master-0 kubenswrapper[7465]: I0320 08:37:39.703787 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h" event={"ID":"590dd533-c8db-42e3-9485-1c9df719773f","Type":"ContainerDied","Data":"cda82b97cf053cc40d3d7f1f9bbdda52685fddf9f31a465359a5c0f818a7678c"} Mar 20 08:37:39.704289 master-0 kubenswrapper[7465]: I0320 08:37:39.703840 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h" event={"ID":"590dd533-c8db-42e3-9485-1c9df719773f","Type":"ContainerDied","Data":"56a2947905e36af4c7e4eee7385aba059d957c24fbe5028cd5296acd03b88f48"} Mar 20 08:37:39.704289 master-0 kubenswrapper[7465]: I0320 08:37:39.703868 7465 scope.go:117] "RemoveContainer" containerID="cda82b97cf053cc40d3d7f1f9bbdda52685fddf9f31a465359a5c0f818a7678c" Mar 20 08:37:39.712938 master-0 kubenswrapper[7465]: I0320 08:37:39.704284 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6486d766f9-5b77h" Mar 20 08:37:39.727762 master-0 kubenswrapper[7465]: I0320 08:37:39.727716 7465 scope.go:117] "RemoveContainer" containerID="cda82b97cf053cc40d3d7f1f9bbdda52685fddf9f31a465359a5c0f818a7678c" Mar 20 08:37:39.732472 master-0 kubenswrapper[7465]: E0320 08:37:39.728841 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda82b97cf053cc40d3d7f1f9bbdda52685fddf9f31a465359a5c0f818a7678c\": container with ID starting with cda82b97cf053cc40d3d7f1f9bbdda52685fddf9f31a465359a5c0f818a7678c not found: ID does not exist" containerID="cda82b97cf053cc40d3d7f1f9bbdda52685fddf9f31a465359a5c0f818a7678c" Mar 20 08:37:39.732472 master-0 kubenswrapper[7465]: I0320 08:37:39.728899 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda82b97cf053cc40d3d7f1f9bbdda52685fddf9f31a465359a5c0f818a7678c"} err="failed to get container status \"cda82b97cf053cc40d3d7f1f9bbdda52685fddf9f31a465359a5c0f818a7678c\": rpc error: code = NotFound desc = could not find container \"cda82b97cf053cc40d3d7f1f9bbdda52685fddf9f31a465359a5c0f818a7678c\": container with ID starting with cda82b97cf053cc40d3d7f1f9bbdda52685fddf9f31a465359a5c0f818a7678c not found: ID does not exist" Mar 20 08:37:39.975020 master-0 kubenswrapper[7465]: I0320 08:37:39.974912 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:39.978362 master-0 kubenswrapper[7465]: I0320 08:37:39.978299 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:39.978362 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 
08:37:39.978362 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:39.978362 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:39.978593 master-0 kubenswrapper[7465]: I0320 08:37:39.978391 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:40.714627 master-0 kubenswrapper[7465]: I0320 08:37:40.714556 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-p7pt6_68252533-bd64-4fc5-838a-cc350cbe77f0/openshift-controller-manager-operator/0.log" Mar 20 08:37:40.714627 master-0 kubenswrapper[7465]: I0320 08:37:40.714629 7465 generic.go:334] "Generic (PLEG): container finished" podID="68252533-bd64-4fc5-838a-cc350cbe77f0" containerID="e3f8332f8ebf3d1a4f060a0c3657a0fbf7dd52dab57065dbe98fe23495c66678" exitCode=1 Mar 20 08:37:40.715413 master-0 kubenswrapper[7465]: I0320 08:37:40.714709 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" event={"ID":"68252533-bd64-4fc5-838a-cc350cbe77f0","Type":"ContainerDied","Data":"e3f8332f8ebf3d1a4f060a0c3657a0fbf7dd52dab57065dbe98fe23495c66678"} Mar 20 08:37:40.715585 master-0 kubenswrapper[7465]: I0320 08:37:40.715543 7465 scope.go:117] "RemoveContainer" containerID="e3f8332f8ebf3d1a4f060a0c3657a0fbf7dd52dab57065dbe98fe23495c66678" Mar 20 08:37:40.719408 master-0 kubenswrapper[7465]: I0320 08:37:40.719381 7465 generic.go:334] "Generic (PLEG): container finished" podID="06159643-e9b8-417f-ba79-ee1e1ca5c951" containerID="e7321b68cd658b736e2e6b297892a77f8a40c4d3d188ee1bd535ab60a2be4cf0" exitCode=0 Mar 20 08:37:40.719734 master-0 kubenswrapper[7465]: I0320 08:37:40.719535 7465 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-mgcb9" event={"ID":"06159643-e9b8-417f-ba79-ee1e1ca5c951","Type":"ContainerDied","Data":"e7321b68cd658b736e2e6b297892a77f8a40c4d3d188ee1bd535ab60a2be4cf0"} Mar 20 08:37:40.977881 master-0 kubenswrapper[7465]: I0320 08:37:40.977830 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:40.977881 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:40.977881 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:40.977881 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:40.978063 master-0 kubenswrapper[7465]: I0320 08:37:40.977903 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:41.727760 master-0 kubenswrapper[7465]: I0320 08:37:41.727707 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-p7pt6_68252533-bd64-4fc5-838a-cc350cbe77f0/openshift-controller-manager-operator/0.log" Mar 20 08:37:41.728341 master-0 kubenswrapper[7465]: I0320 08:37:41.727823 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" event={"ID":"68252533-bd64-4fc5-838a-cc350cbe77f0","Type":"ContainerStarted","Data":"e90b46b2a24eed1acbde07d446b8c7de8acf8cbdfe00eeb63977c91e3cae9f34"} Mar 20 08:37:41.978173 master-0 kubenswrapper[7465]: I0320 08:37:41.978019 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:41.978173 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:41.978173 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:41.978173 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:41.978173 master-0 kubenswrapper[7465]: I0320 08:37:41.978113 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:42.978172 master-0 kubenswrapper[7465]: I0320 08:37:42.978083 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:42.978172 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:42.978172 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:42.978172 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:42.978172 master-0 kubenswrapper[7465]: I0320 08:37:42.978167 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:43.977746 master-0 kubenswrapper[7465]: I0320 08:37:43.977641 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:43.977746 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 
20 08:37:43.977746 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:43.977746 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:43.978168 master-0 kubenswrapper[7465]: I0320 08:37:43.977757 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:44.977406 master-0 kubenswrapper[7465]: I0320 08:37:44.977334 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:44.977406 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:44.977406 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:44.977406 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:45.007572 master-0 kubenswrapper[7465]: I0320 08:37:44.977433 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:45.869330 master-0 kubenswrapper[7465]: I0320 08:37:45.869179 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srjqw" event={"ID":"956a8ee3-2618-4f04-8d1c-97c3a4595c94","Type":"ContainerStarted","Data":"6b2daff08bafe9a34203a7f9995e51650ea9112e321da5c7bb3cda5cf1b124e2"} Mar 20 08:37:45.871236 master-0 kubenswrapper[7465]: I0320 08:37:45.871149 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" 
event={"ID":"4117cb69-45cf-4966-82b6-a31340c7db11","Type":"ContainerStarted","Data":"49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166"} Mar 20 08:37:45.979386 master-0 kubenswrapper[7465]: I0320 08:37:45.979267 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:45.979386 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:45.979386 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:45.979386 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:45.979386 master-0 kubenswrapper[7465]: I0320 08:37:45.979368 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:46.468630 master-0 kubenswrapper[7465]: E0320 08:37:46.468451 7465 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:37:46.883827 master-0 kubenswrapper[7465]: I0320 08:37:46.882437 7465 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="8f140ecbb839c536b9c01be26f60184770878f8b44764513599810ad9344fdb7" exitCode=1 Mar 20 08:37:46.883827 master-0 kubenswrapper[7465]: I0320 08:37:46.882474 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"8f140ecbb839c536b9c01be26f60184770878f8b44764513599810ad9344fdb7"} Mar 20 
08:37:46.883827 master-0 kubenswrapper[7465]: I0320 08:37:46.883379 7465 scope.go:117] "RemoveContainer" containerID="8f140ecbb839c536b9c01be26f60184770878f8b44764513599810ad9344fdb7" Mar 20 08:37:46.887233 master-0 kubenswrapper[7465]: I0320 08:37:46.884991 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mgcb9" event={"ID":"06159643-e9b8-417f-ba79-ee1e1ca5c951","Type":"ContainerStarted","Data":"af9431084ad652fa67903c07dacd4b5fe50459c9cb6c9858baf0da64d22c51f3"} Mar 20 08:37:46.887233 master-0 kubenswrapper[7465]: I0320 08:37:46.886266 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" event={"ID":"75e80b57-a0f3-4f6a-a022-457944c8f59b","Type":"ContainerStarted","Data":"aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496"} Mar 20 08:37:46.889052 master-0 kubenswrapper[7465]: I0320 08:37:46.888984 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zspn5" event={"ID":"c649d964-ba32-44e4-a3cb-06a285972d97","Type":"ContainerStarted","Data":"12530f8cc7a96159435ca72d952505623c195a3951b56e70dfb55c96d0ec3d09"} Mar 20 08:37:46.891169 master-0 kubenswrapper[7465]: I0320 08:37:46.891139 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86g9n" event={"ID":"55cefc56-c008-45c1-a6ac-b1d3c8778c7b","Type":"ContainerStarted","Data":"068275d1cd841a9bf5f79cb0540e343d651716d7934e461f10b2346a851f5cbb"} Mar 20 08:37:46.977575 master-0 kubenswrapper[7465]: I0320 08:37:46.977476 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:46.977575 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 
20 08:37:46.977575 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:46.977575 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:46.977575 master-0 kubenswrapper[7465]: I0320 08:37:46.977569 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:47.395744 master-0 kubenswrapper[7465]: I0320 08:37:47.395657 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-86g9n" Mar 20 08:37:47.395744 master-0 kubenswrapper[7465]: I0320 08:37:47.395727 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-86g9n" Mar 20 08:37:47.617257 master-0 kubenswrapper[7465]: E0320 08:37:47.617178 7465 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 20 08:37:47.617944 master-0 kubenswrapper[7465]: I0320 08:37:47.617926 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 20 08:37:47.900789 master-0 kubenswrapper[7465]: I0320 08:37:47.900728 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"17402333a3400904f60b3f059728c9faa13bc6bb53c95e63d5e8325a42104e2f"} Mar 20 08:37:47.901887 master-0 kubenswrapper[7465]: I0320 08:37:47.901855 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"6860ec0c6307c0854099262d2b68eee9cef0172599ec80b28a89c6d016fb4071"} Mar 20 08:37:47.904050 master-0 kubenswrapper[7465]: I0320 08:37:47.903969 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" event={"ID":"75e80b57-a0f3-4f6a-a022-457944c8f59b","Type":"ContainerStarted","Data":"c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295"} Mar 20 08:37:47.904113 master-0 kubenswrapper[7465]: I0320 08:37:47.904072 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" event={"ID":"75e80b57-a0f3-4f6a-a022-457944c8f59b","Type":"ContainerStarted","Data":"8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9"} Mar 20 08:37:47.975386 master-0 kubenswrapper[7465]: I0320 08:37:47.975322 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:37:47.977628 master-0 kubenswrapper[7465]: I0320 08:37:47.977601 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http 
failed: reason withheld Mar 20 08:37:47.977628 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:47.977628 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:47.977628 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:47.977753 master-0 kubenswrapper[7465]: I0320 08:37:47.977649 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:48.430007 master-0 kubenswrapper[7465]: I0320 08:37:48.429936 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:48.430007 master-0 kubenswrapper[7465]: I0320 08:37:48.430003 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:48.443708 master-0 kubenswrapper[7465]: I0320 08:37:48.443609 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-86g9n" podUID="55cefc56-c008-45c1-a6ac-b1d3c8778c7b" containerName="registry-server" probeResult="failure" output=< Mar 20 08:37:48.443708 master-0 kubenswrapper[7465]: timeout: failed to connect service ":50051" within 1s Mar 20 08:37:48.443708 master-0 kubenswrapper[7465]: > Mar 20 08:37:48.475141 master-0 kubenswrapper[7465]: I0320 08:37:48.475072 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:48.913585 master-0 kubenswrapper[7465]: I0320 08:37:48.913505 7465 generic.go:334] "Generic (PLEG): container finished" podID="4490a747-da2d-4f1a-8986-bc2c1c58424b" containerID="0521d9515acccdbef13de273c2fd3fc8c0c08193b40755e745ddfeeb3789e32d" exitCode=0 Mar 20 08:37:48.913879 master-0 kubenswrapper[7465]: I0320 08:37:48.913632 7465 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"4490a747-da2d-4f1a-8986-bc2c1c58424b","Type":"ContainerDied","Data":"0521d9515acccdbef13de273c2fd3fc8c0c08193b40755e745ddfeeb3789e32d"} Mar 20 08:37:48.915844 master-0 kubenswrapper[7465]: I0320 08:37:48.915803 7465 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="91045cb8c13e35ca1f0bfb21ba636da24cd41b91eea8db817a9a5a02317192b3" exitCode=0 Mar 20 08:37:48.916008 master-0 kubenswrapper[7465]: I0320 08:37:48.915942 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"91045cb8c13e35ca1f0bfb21ba636da24cd41b91eea8db817a9a5a02317192b3"} Mar 20 08:37:48.977877 master-0 kubenswrapper[7465]: I0320 08:37:48.977777 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:48.977877 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:48.977877 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:48.977877 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:48.978465 master-0 kubenswrapper[7465]: I0320 08:37:48.977889 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:49.983212 master-0 kubenswrapper[7465]: I0320 08:37:49.978940 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:49.983212 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:49.983212 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:49.983212 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:49.983212 master-0 kubenswrapper[7465]: I0320 08:37:49.979039 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:50.266831 master-0 kubenswrapper[7465]: I0320 08:37:50.266752 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 20 08:37:50.299812 master-0 kubenswrapper[7465]: I0320 08:37:50.299745 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4490a747-da2d-4f1a-8986-bc2c1c58424b-var-lock\") pod \"4490a747-da2d-4f1a-8986-bc2c1c58424b\" (UID: \"4490a747-da2d-4f1a-8986-bc2c1c58424b\") " Mar 20 08:37:50.299812 master-0 kubenswrapper[7465]: I0320 08:37:50.299808 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4490a747-da2d-4f1a-8986-bc2c1c58424b-kube-api-access\") pod \"4490a747-da2d-4f1a-8986-bc2c1c58424b\" (UID: \"4490a747-da2d-4f1a-8986-bc2c1c58424b\") " Mar 20 08:37:50.299812 master-0 kubenswrapper[7465]: I0320 08:37:50.299841 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4490a747-da2d-4f1a-8986-bc2c1c58424b-kubelet-dir\") pod \"4490a747-da2d-4f1a-8986-bc2c1c58424b\" (UID: \"4490a747-da2d-4f1a-8986-bc2c1c58424b\") " Mar 20 08:37:50.300382 master-0 kubenswrapper[7465]: I0320 08:37:50.299896 7465 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4490a747-da2d-4f1a-8986-bc2c1c58424b-var-lock" (OuterVolumeSpecName: "var-lock") pod "4490a747-da2d-4f1a-8986-bc2c1c58424b" (UID: "4490a747-da2d-4f1a-8986-bc2c1c58424b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:50.300382 master-0 kubenswrapper[7465]: I0320 08:37:50.300007 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4490a747-da2d-4f1a-8986-bc2c1c58424b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4490a747-da2d-4f1a-8986-bc2c1c58424b" (UID: "4490a747-da2d-4f1a-8986-bc2c1c58424b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:50.300382 master-0 kubenswrapper[7465]: I0320 08:37:50.300168 7465 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4490a747-da2d-4f1a-8986-bc2c1c58424b-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:50.300382 master-0 kubenswrapper[7465]: I0320 08:37:50.300204 7465 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4490a747-da2d-4f1a-8986-bc2c1c58424b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:50.305097 master-0 kubenswrapper[7465]: I0320 08:37:50.305033 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4490a747-da2d-4f1a-8986-bc2c1c58424b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4490a747-da2d-4f1a-8986-bc2c1c58424b" (UID: "4490a747-da2d-4f1a-8986-bc2c1c58424b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:50.401637 master-0 kubenswrapper[7465]: I0320 08:37:50.401549 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4490a747-da2d-4f1a-8986-bc2c1c58424b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 20 08:37:50.941422 master-0 kubenswrapper[7465]: I0320 08:37:50.941352 7465 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb" exitCode=1 Mar 20 08:37:50.941667 master-0 kubenswrapper[7465]: I0320 08:37:50.941454 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerDied","Data":"6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb"} Mar 20 08:37:50.942300 master-0 kubenswrapper[7465]: I0320 08:37:50.942273 7465 scope.go:117] "RemoveContainer" containerID="6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb" Mar 20 08:37:50.945032 master-0 kubenswrapper[7465]: I0320 08:37:50.944984 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"4490a747-da2d-4f1a-8986-bc2c1c58424b","Type":"ContainerDied","Data":"0aa1305a973a71f928c142131df579b42fa3e776fd7926a4aa71bddb2c85fcba"} Mar 20 08:37:50.945090 master-0 kubenswrapper[7465]: I0320 08:37:50.945034 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 20 08:37:50.945090 master-0 kubenswrapper[7465]: I0320 08:37:50.945045 7465 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aa1305a973a71f928c142131df579b42fa3e776fd7926a4aa71bddb2c85fcba" Mar 20 08:37:50.980358 master-0 kubenswrapper[7465]: I0320 08:37:50.980274 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:50.980358 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:50.980358 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:50.980358 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:50.980842 master-0 kubenswrapper[7465]: I0320 08:37:50.980420 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:51.605633 master-0 kubenswrapper[7465]: I0320 08:37:51.605557 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:37:51.954490 master-0 kubenswrapper[7465]: I0320 08:37:51.954318 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8"} Mar 20 08:37:51.978749 master-0 kubenswrapper[7465]: I0320 08:37:51.978685 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:51.978749 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:51.978749 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:51.978749 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:51.979027 master-0 kubenswrapper[7465]: I0320 08:37:51.978746 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:52.979091 master-0 kubenswrapper[7465]: I0320 08:37:52.978945 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:52.979091 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:52.979091 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:52.979091 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:52.980517 master-0 kubenswrapper[7465]: I0320 08:37:52.979084 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:53.727228 master-0 kubenswrapper[7465]: I0320 08:37:53.727138 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:37:53.978381 master-0 kubenswrapper[7465]: I0320 08:37:53.978173 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:53.978381 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:53.978381 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:53.978381 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:53.978381 master-0 kubenswrapper[7465]: I0320 08:37:53.978279 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:54.607391 master-0 kubenswrapper[7465]: I0320 08:37:54.607247 7465 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:37:54.999113 master-0 kubenswrapper[7465]: I0320 08:37:54.998919 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:54.999113 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:54.999113 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:54.999113 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:54.999113 master-0 kubenswrapper[7465]: I0320 08:37:54.999011 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:55.355107 master-0 
kubenswrapper[7465]: I0320 08:37:55.354993 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:55.355107 master-0 kubenswrapper[7465]: I0320 08:37:55.355104 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:55.428375 master-0 kubenswrapper[7465]: I0320 08:37:55.428264 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:55.612001 master-0 kubenswrapper[7465]: I0320 08:37:55.611753 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mgcb9" Mar 20 08:37:55.612001 master-0 kubenswrapper[7465]: I0320 08:37:55.611839 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mgcb9" Mar 20 08:37:55.669212 master-0 kubenswrapper[7465]: I0320 08:37:55.669097 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mgcb9" Mar 20 08:37:55.980028 master-0 kubenswrapper[7465]: I0320 08:37:55.979804 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:55.980028 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:55.980028 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:55.980028 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:55.980028 master-0 kubenswrapper[7465]: I0320 08:37:55.979900 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:56.047249 master-0 kubenswrapper[7465]: I0320 08:37:56.047170 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mgcb9" Mar 20 08:37:56.054730 master-0 kubenswrapper[7465]: I0320 08:37:56.054677 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:37:56.468799 master-0 kubenswrapper[7465]: E0320 08:37:56.468697 7465 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" Mar 20 08:37:56.978672 master-0 kubenswrapper[7465]: I0320 08:37:56.978590 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:56.978672 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:56.978672 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:56.978672 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:56.979804 master-0 kubenswrapper[7465]: I0320 08:37:56.978694 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:57.456540 master-0 kubenswrapper[7465]: I0320 08:37:57.456472 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-86g9n" Mar 20 08:37:57.495683 master-0 kubenswrapper[7465]: I0320 08:37:57.495606 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-86g9n" Mar 20 08:37:57.631504 master-0 kubenswrapper[7465]: E0320 08:37:57.631293 7465 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:37:47Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:37:47Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:37:47Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:37:47Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28a
d4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-de
v/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"si
zeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a746a87b784ea1caa278fd0e012554f9df520b6fff665ea0bc4c83f487fed113\\\"],\\\"sizeBytes\\\":484450894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\"],\\\"sizeBytes\\\":484187929},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f933312f49083e8746fc41ab5e46a9a757b448374f14971e256ebcb36f11dd97\\\"],\\\"sizeBytes\\\":470826739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ea5c8a93f30e0a4932da5697d22c0da7eda9a7035c0555eb006b6755e62bb2fc\\\"],\\\"sizeBytes\\\":468265024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\\\"],\\\"sizeBytes\\\":465090934},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9609c00207cc4db97f0fd6162eb429d7f81654137f020a677e30cba26a887a24\\\"],\\\"sizeBytes\\\":463705930},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:632e80bba5077068ecca05fddb95aedebad4493af6f36152c01c6ae490975b62\\\"],\\\"sizeBytes\\\":458126937},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bcb08821551e9a5b9f82aa794bcea673279cefb93cb47492e19ccac5e2cf18fe\\\"],\\\"sizeBytes\\\":456576198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:759fb1d5353dbbadd443f38631d977ca3aed9787b873be05cc9660532a252739\\\"],\\\"sizeBytes\\\":448828620},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85\\\"],\\\"sizeBytes\\\":448042136},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951ecfeba9b2da4b653034d09275f925396a79c2d8461b8a7c71c776fee67ba0\\\"],\\\"sizeBytes\\\":443272037},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:292560e2d80b460468bb19fe0ddf289767c655027b03a76ee6c40c91ffe4c483\\\"],\\\"sizeBytes\\\":438654374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e66fd50be6f83ce321a566dfb76f3725b597374077d5af13813b928f6b1267e\\\"],\\\"sizeBytes\\\":411587146}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:37:57.978407 master-0 kubenswrapper[7465]: I0320 08:37:57.978341 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Mar 20 08:37:57.978407 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:57.978407 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:57.978407 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:57.978856 master-0 kubenswrapper[7465]: I0320 08:37:57.978818 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:58.500907 master-0 kubenswrapper[7465]: I0320 08:37:58.500815 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:37:58.979651 master-0 kubenswrapper[7465]: I0320 08:37:58.979535 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:37:58.979651 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:58.979651 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:58.979651 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:58.980858 master-0 kubenswrapper[7465]: I0320 08:37:58.979660 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:37:59.978705 master-0 kubenswrapper[7465]: I0320 08:37:59.978546 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 
08:37:59.978705 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:37:59.978705 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:37:59.978705 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:37:59.978705 master-0 kubenswrapper[7465]: I0320 08:37:59.978691 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:00.979314 master-0 kubenswrapper[7465]: I0320 08:38:00.979000 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:00.979314 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:00.979314 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:00.979314 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:00.981099 master-0 kubenswrapper[7465]: I0320 08:38:00.979347 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:01.923094 master-0 kubenswrapper[7465]: E0320 08:38:01.922863 7465 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 20 08:38:01.977948 master-0 kubenswrapper[7465]: I0320 08:38:01.977605 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:01.977948 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:01.977948 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:01.977948 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:01.977948 master-0 kubenswrapper[7465]: I0320 08:38:01.977740 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:02.039443 master-0 kubenswrapper[7465]: I0320 08:38:02.039307 7465 generic.go:334] "Generic (PLEG): container finished" podID="d664a6d0d2a24360dee10612610f1b59" containerID="ac50f53bc5c085a17ffb0495207fc3be5a9d394ca5e56961969c0592f7f25b0d" exitCode=0 Mar 20 08:38:02.977541 master-0 kubenswrapper[7465]: I0320 08:38:02.977441 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:02.977541 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:02.977541 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:02.977541 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:02.977902 master-0 kubenswrapper[7465]: I0320 08:38:02.977547 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:03.046325 master-0 kubenswrapper[7465]: I0320 08:38:03.046253 7465 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" 
containerID="31b5815996c66a028a6e102943aed8dd0cbf1cb918ec3a5b728d9fb0cb098506" exitCode=0 Mar 20 08:38:03.046325 master-0 kubenswrapper[7465]: I0320 08:38:03.046307 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"31b5815996c66a028a6e102943aed8dd0cbf1cb918ec3a5b728d9fb0cb098506"} Mar 20 08:38:03.978331 master-0 kubenswrapper[7465]: I0320 08:38:03.978224 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:03.978331 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:03.978331 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:03.978331 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:03.978784 master-0 kubenswrapper[7465]: I0320 08:38:03.978337 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:04.606712 master-0 kubenswrapper[7465]: I0320 08:38:04.606504 7465 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:38:04.675532 master-0 kubenswrapper[7465]: I0320 08:38:04.675460 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_d664a6d0d2a24360dee10612610f1b59/etcdctl/0.log" Mar 20 08:38:04.675706 
master-0 kubenswrapper[7465]: I0320 08:38:04.675606 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 20 08:38:04.858295 master-0 kubenswrapper[7465]: I0320 08:38:04.858070 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"d664a6d0d2a24360dee10612610f1b59\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " Mar 20 08:38:04.858295 master-0 kubenswrapper[7465]: I0320 08:38:04.858234 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir" (OuterVolumeSpecName: "data-dir") pod "d664a6d0d2a24360dee10612610f1b59" (UID: "d664a6d0d2a24360dee10612610f1b59"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:38:04.858607 master-0 kubenswrapper[7465]: I0320 08:38:04.858329 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs" (OuterVolumeSpecName: "certs") pod "d664a6d0d2a24360dee10612610f1b59" (UID: "d664a6d0d2a24360dee10612610f1b59"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:38:04.858607 master-0 kubenswrapper[7465]: I0320 08:38:04.858262 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"d664a6d0d2a24360dee10612610f1b59\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " Mar 20 08:38:04.858884 master-0 kubenswrapper[7465]: I0320 08:38:04.858850 7465 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:38:04.858942 master-0 kubenswrapper[7465]: I0320 08:38:04.858883 7465 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 08:38:04.978350 master-0 kubenswrapper[7465]: I0320 08:38:04.978272 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:04.978350 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:04.978350 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:04.978350 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:04.978770 master-0 kubenswrapper[7465]: I0320 08:38:04.978379 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:05.064427 master-0 kubenswrapper[7465]: I0320 08:38:05.064360 7465 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_d664a6d0d2a24360dee10612610f1b59/etcdctl/0.log" Mar 20 08:38:05.064738 master-0 kubenswrapper[7465]: I0320 08:38:05.064440 7465 generic.go:334] "Generic (PLEG): container finished" podID="d664a6d0d2a24360dee10612610f1b59" containerID="6a321a6b1056a105feaf6a80afaf577bd08c544abd875f2fd35678bf25e3bfcd" exitCode=137 Mar 20 08:38:05.064738 master-0 kubenswrapper[7465]: I0320 08:38:05.064581 7465 scope.go:117] "RemoveContainer" containerID="ac50f53bc5c085a17ffb0495207fc3be5a9d394ca5e56961969c0592f7f25b0d" Mar 20 08:38:05.066436 master-0 kubenswrapper[7465]: I0320 08:38:05.066382 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 20 08:38:05.068034 master-0 kubenswrapper[7465]: I0320 08:38:05.067987 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_e1d21f11-7386-4a04-a82e-5a03f3602a3b/installer/0.log" Mar 20 08:38:05.068123 master-0 kubenswrapper[7465]: I0320 08:38:05.068091 7465 generic.go:334] "Generic (PLEG): container finished" podID="e1d21f11-7386-4a04-a82e-5a03f3602a3b" containerID="bf471cdb978763d680a893df02a2a47dbe930e97fc0ccb05e480229f6feda593" exitCode=1 Mar 20 08:38:05.068174 master-0 kubenswrapper[7465]: I0320 08:38:05.068143 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"e1d21f11-7386-4a04-a82e-5a03f3602a3b","Type":"ContainerDied","Data":"bf471cdb978763d680a893df02a2a47dbe930e97fc0ccb05e480229f6feda593"} Mar 20 08:38:05.091723 master-0 kubenswrapper[7465]: I0320 08:38:05.091667 7465 scope.go:117] "RemoveContainer" containerID="6a321a6b1056a105feaf6a80afaf577bd08c544abd875f2fd35678bf25e3bfcd" Mar 20 08:38:05.117903 master-0 kubenswrapper[7465]: I0320 08:38:05.117827 7465 scope.go:117] "RemoveContainer" containerID="ac50f53bc5c085a17ffb0495207fc3be5a9d394ca5e56961969c0592f7f25b0d" Mar 20 
08:38:05.118640 master-0 kubenswrapper[7465]: E0320 08:38:05.118566 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac50f53bc5c085a17ffb0495207fc3be5a9d394ca5e56961969c0592f7f25b0d\": container with ID starting with ac50f53bc5c085a17ffb0495207fc3be5a9d394ca5e56961969c0592f7f25b0d not found: ID does not exist" containerID="ac50f53bc5c085a17ffb0495207fc3be5a9d394ca5e56961969c0592f7f25b0d" Mar 20 08:38:05.118755 master-0 kubenswrapper[7465]: I0320 08:38:05.118646 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac50f53bc5c085a17ffb0495207fc3be5a9d394ca5e56961969c0592f7f25b0d"} err="failed to get container status \"ac50f53bc5c085a17ffb0495207fc3be5a9d394ca5e56961969c0592f7f25b0d\": rpc error: code = NotFound desc = could not find container \"ac50f53bc5c085a17ffb0495207fc3be5a9d394ca5e56961969c0592f7f25b0d\": container with ID starting with ac50f53bc5c085a17ffb0495207fc3be5a9d394ca5e56961969c0592f7f25b0d not found: ID does not exist" Mar 20 08:38:05.118755 master-0 kubenswrapper[7465]: I0320 08:38:05.118690 7465 scope.go:117] "RemoveContainer" containerID="6a321a6b1056a105feaf6a80afaf577bd08c544abd875f2fd35678bf25e3bfcd" Mar 20 08:38:05.119295 master-0 kubenswrapper[7465]: E0320 08:38:05.119226 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a321a6b1056a105feaf6a80afaf577bd08c544abd875f2fd35678bf25e3bfcd\": container with ID starting with 6a321a6b1056a105feaf6a80afaf577bd08c544abd875f2fd35678bf25e3bfcd not found: ID does not exist" containerID="6a321a6b1056a105feaf6a80afaf577bd08c544abd875f2fd35678bf25e3bfcd" Mar 20 08:38:05.119388 master-0 kubenswrapper[7465]: I0320 08:38:05.119295 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a321a6b1056a105feaf6a80afaf577bd08c544abd875f2fd35678bf25e3bfcd"} err="failed 
to get container status \"6a321a6b1056a105feaf6a80afaf577bd08c544abd875f2fd35678bf25e3bfcd\": rpc error: code = NotFound desc = could not find container \"6a321a6b1056a105feaf6a80afaf577bd08c544abd875f2fd35678bf25e3bfcd\": container with ID starting with 6a321a6b1056a105feaf6a80afaf577bd08c544abd875f2fd35678bf25e3bfcd not found: ID does not exist" Mar 20 08:38:05.978416 master-0 kubenswrapper[7465]: I0320 08:38:05.978294 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:05.978416 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:05.978416 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:05.978416 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:05.978416 master-0 kubenswrapper[7465]: I0320 08:38:05.978404 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:06.429389 master-0 kubenswrapper[7465]: I0320 08:38:06.429334 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_e1d21f11-7386-4a04-a82e-5a03f3602a3b/installer/0.log" Mar 20 08:38:06.429770 master-0 kubenswrapper[7465]: I0320 08:38:06.429427 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:38:06.469853 master-0 kubenswrapper[7465]: E0320 08:38:06.469739 7465 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:38:06.482157 master-0 kubenswrapper[7465]: I0320 08:38:06.482055 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1d21f11-7386-4a04-a82e-5a03f3602a3b-kubelet-dir\") pod \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\" (UID: \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\") " Mar 20 08:38:06.482407 master-0 kubenswrapper[7465]: I0320 08:38:06.482248 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1d21f11-7386-4a04-a82e-5a03f3602a3b-var-lock\") pod \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\" (UID: \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\") " Mar 20 08:38:06.482518 master-0 kubenswrapper[7465]: I0320 08:38:06.482459 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1d21f11-7386-4a04-a82e-5a03f3602a3b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e1d21f11-7386-4a04-a82e-5a03f3602a3b" (UID: "e1d21f11-7386-4a04-a82e-5a03f3602a3b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:38:06.482631 master-0 kubenswrapper[7465]: I0320 08:38:06.482573 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1d21f11-7386-4a04-a82e-5a03f3602a3b-var-lock" (OuterVolumeSpecName: "var-lock") pod "e1d21f11-7386-4a04-a82e-5a03f3602a3b" (UID: "e1d21f11-7386-4a04-a82e-5a03f3602a3b"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:38:06.482923 master-0 kubenswrapper[7465]: I0320 08:38:06.482869 7465 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1d21f11-7386-4a04-a82e-5a03f3602a3b-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 20 08:38:06.482923 master-0 kubenswrapper[7465]: I0320 08:38:06.482898 7465 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1d21f11-7386-4a04-a82e-5a03f3602a3b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:38:06.548645 master-0 kubenswrapper[7465]: I0320 08:38:06.548541 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d664a6d0d2a24360dee10612610f1b59" path="/var/lib/kubelet/pods/d664a6d0d2a24360dee10612610f1b59/volumes" Mar 20 08:38:06.549683 master-0 kubenswrapper[7465]: I0320 08:38:06.549632 7465 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 20 08:38:06.583723 master-0 kubenswrapper[7465]: I0320 08:38:06.583609 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1d21f11-7386-4a04-a82e-5a03f3602a3b-kube-api-access\") pod \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\" (UID: \"e1d21f11-7386-4a04-a82e-5a03f3602a3b\") " Mar 20 08:38:06.588986 master-0 kubenswrapper[7465]: I0320 08:38:06.588906 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d21f11-7386-4a04-a82e-5a03f3602a3b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e1d21f11-7386-4a04-a82e-5a03f3602a3b" (UID: "e1d21f11-7386-4a04-a82e-5a03f3602a3b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:38:06.686342 master-0 kubenswrapper[7465]: I0320 08:38:06.686072 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1d21f11-7386-4a04-a82e-5a03f3602a3b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 20 08:38:06.978631 master-0 kubenswrapper[7465]: I0320 08:38:06.978395 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:06.978631 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:06.978631 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:06.978631 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:06.978631 master-0 kubenswrapper[7465]: I0320 08:38:06.978513 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:07.086816 master-0 kubenswrapper[7465]: I0320 08:38:07.086752 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_e1d21f11-7386-4a04-a82e-5a03f3602a3b/installer/0.log" Mar 20 08:38:07.087317 master-0 kubenswrapper[7465]: I0320 08:38:07.086933 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:38:07.631950 master-0 kubenswrapper[7465]: E0320 08:38:07.631854 7465 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:38:07.979257 master-0 kubenswrapper[7465]: I0320 08:38:07.979030 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:07.979257 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:07.979257 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:07.979257 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:07.979257 master-0 kubenswrapper[7465]: I0320 08:38:07.979142 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:08.476986 master-0 kubenswrapper[7465]: I0320 08:38:08.476781 7465 patch_prober.go:28] interesting pod/authentication-operator-5885bfd7f4-62zrx container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.25:8443/healthz\": dial tcp 10.128.0.25:8443: connect: connection refused" start-of-body= Mar 20 08:38:08.476986 master-0 kubenswrapper[7465]: I0320 08:38:08.476890 7465 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" podUID="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" containerName="authentication-operator" 
probeResult="failure" output="Get \"https://10.128.0.25:8443/healthz\": dial tcp 10.128.0.25:8443: connect: connection refused" Mar 20 08:38:08.551493 master-0 kubenswrapper[7465]: E0320 08:38:08.551235 7465 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e7fd4a698ed56 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:37:34.535388502 +0000 UTC m=+80.178704012,LastTimestamp:2026-03-20 08:37:34.535388502 +0000 UTC m=+80.178704012,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:38:08.978979 master-0 kubenswrapper[7465]: I0320 08:38:08.978845 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:08.978979 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:08.978979 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:08.978979 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:08.980263 master-0 kubenswrapper[7465]: I0320 08:38:08.978992 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:09.978515 master-0 
kubenswrapper[7465]: I0320 08:38:09.978383 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:09.978515 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:09.978515 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:09.978515 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:09.979014 master-0 kubenswrapper[7465]: I0320 08:38:09.978516 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:10.113059 master-0 kubenswrapper[7465]: I0320 08:38:10.112947 7465 generic.go:334] "Generic (PLEG): container finished" podID="fa759777-de22-4440-a3d3-ad429a3b8e7b" containerID="9fc8932c5b9da1ed7961a714ed5b905683dbfe51746690c485563d67ef3b0b29" exitCode=0 Mar 20 08:38:10.978375 master-0 kubenswrapper[7465]: I0320 08:38:10.978264 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:10.978375 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:10.978375 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:10.978375 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:10.978863 master-0 kubenswrapper[7465]: I0320 08:38:10.978409 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Mar 20 08:38:11.979005 master-0 kubenswrapper[7465]: I0320 08:38:11.978868 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:11.979005 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:11.979005 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:11.979005 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:11.980104 master-0 kubenswrapper[7465]: I0320 08:38:11.978997 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:12.979254 master-0 kubenswrapper[7465]: I0320 08:38:12.979117 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:12.979254 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:12.979254 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:12.979254 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:12.980229 master-0 kubenswrapper[7465]: I0320 08:38:12.979260 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:13.979473 master-0 kubenswrapper[7465]: I0320 08:38:13.979370 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:13.979473 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:13.979473 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:13.979473 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:13.980536 master-0 kubenswrapper[7465]: I0320 08:38:13.979490 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:14.605948 master-0 kubenswrapper[7465]: I0320 08:38:14.605872 7465 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:38:14.978499 master-0 kubenswrapper[7465]: I0320 08:38:14.978427 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:14.978499 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:14.978499 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:14.978499 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:14.978871 master-0 kubenswrapper[7465]: I0320 08:38:14.978518 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Mar 20 08:38:15.157555 master-0 kubenswrapper[7465]: I0320 08:38:15.157455 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7bd846bfc4-mt454_ad692349-5089-4afc-85b2-9b6e7997567c/network-operator/0.log" Mar 20 08:38:15.157555 master-0 kubenswrapper[7465]: I0320 08:38:15.157537 7465 generic.go:334] "Generic (PLEG): container finished" podID="ad692349-5089-4afc-85b2-9b6e7997567c" containerID="871b2216305e9883e9345351545dfa140768d0b1a9975612361ac0d841fda34c" exitCode=255 Mar 20 08:38:15.978608 master-0 kubenswrapper[7465]: I0320 08:38:15.978534 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:15.978608 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:15.978608 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:15.978608 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:15.979023 master-0 kubenswrapper[7465]: I0320 08:38:15.978636 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:16.054023 master-0 kubenswrapper[7465]: E0320 08:38:16.053951 7465 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 20 08:38:16.173921 master-0 kubenswrapper[7465]: I0320 08:38:16.173840 7465 generic.go:334] "Generic (PLEG): container finished" podID="325f0a83-d56d-4b62-977b-088a7d5f0e00" containerID="d498d1943e73378963ddf1fcb87e4851f496c741db4b494c02cde7a06a7405b7" exitCode=0 Mar 20 
08:38:16.470755 master-0 kubenswrapper[7465]: E0320 08:38:16.470588 7465 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:38:16.978576 master-0 kubenswrapper[7465]: I0320 08:38:16.978489 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:16.978576 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:16.978576 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:16.978576 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:16.979402 master-0 kubenswrapper[7465]: I0320 08:38:16.979332 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:17.183504 master-0 kubenswrapper[7465]: I0320 08:38:17.183383 7465 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="a2139218314ea5d5d1e04c37be758e7a9f90c106dd3c470737be6550fb6322a9" exitCode=0 Mar 20 08:38:17.632858 master-0 kubenswrapper[7465]: E0320 08:38:17.632770 7465 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:38:17.978079 master-0 kubenswrapper[7465]: I0320 08:38:17.977841 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:17.978079 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:17.978079 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:17.978079 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:17.978079 master-0 kubenswrapper[7465]: I0320 08:38:17.977941 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:18.475228 master-0 kubenswrapper[7465]: I0320 08:38:18.475123 7465 patch_prober.go:28] interesting pod/authentication-operator-5885bfd7f4-62zrx container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.25:8443/healthz\": dial tcp 10.128.0.25:8443: connect: connection refused" start-of-body= Mar 20 08:38:18.475964 master-0 kubenswrapper[7465]: I0320 08:38:18.475255 7465 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" podUID="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.25:8443/healthz\": dial tcp 10.128.0.25:8443: connect: connection refused" Mar 20 08:38:18.978913 master-0 kubenswrapper[7465]: I0320 08:38:18.978820 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:18.978913 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:18.978913 master-0 
kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:18.978913 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:18.979356 master-0 kubenswrapper[7465]: I0320 08:38:18.978963 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:19.198269 master-0 kubenswrapper[7465]: I0320 08:38:19.198178 7465 generic.go:334] "Generic (PLEG): container finished" podID="c2a23d24-9e09-431e-8c3b-8456ff51a8d0" containerID="4f3597589c3cef89a127e7d3fec06145ebc7ceb92ec5a686f505064f491c35a3" exitCode=0 Mar 20 08:38:19.977613 master-0 kubenswrapper[7465]: I0320 08:38:19.977515 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:19.977613 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:19.977613 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:19.977613 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:19.978475 master-0 kubenswrapper[7465]: I0320 08:38:19.977633 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:20.979162 master-0 kubenswrapper[7465]: I0320 08:38:20.978851 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:20.979162 master-0 kubenswrapper[7465]: [-]has-synced failed: 
reason withheld Mar 20 08:38:20.979162 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:20.979162 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:20.979854 master-0 kubenswrapper[7465]: I0320 08:38:20.979222 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:21.978914 master-0 kubenswrapper[7465]: I0320 08:38:21.978810 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:21.978914 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:21.978914 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:21.978914 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:21.979826 master-0 kubenswrapper[7465]: I0320 08:38:21.978940 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:22.978786 master-0 kubenswrapper[7465]: I0320 08:38:22.978710 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:22.978786 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:22.978786 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:22.978786 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:22.979754 master-0 kubenswrapper[7465]: 
I0320 08:38:22.978801 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:23.978858 master-0 kubenswrapper[7465]: I0320 08:38:23.978756 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:23.978858 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:23.978858 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:23.978858 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:23.979280 master-0 kubenswrapper[7465]: I0320 08:38:23.978899 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:24.979052 master-0 kubenswrapper[7465]: I0320 08:38:24.978876 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:24.979052 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:24.979052 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:24.979052 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:24.979052 master-0 kubenswrapper[7465]: I0320 08:38:24.978997 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:25.978416 master-0 kubenswrapper[7465]: I0320 08:38:25.978315 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:25.978416 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:25.978416 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:25.978416 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:25.979088 master-0 kubenswrapper[7465]: I0320 08:38:25.979026 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:26.251735 master-0 kubenswrapper[7465]: I0320 08:38:26.251556 7465 generic.go:334] "Generic (PLEG): container finished" podID="f046860d-2d54-4746-8ba2-f8e90fa55e38" containerID="0bd5b879aadefbf9a44c6c6dcf866736c73041c10aa61588b2570b45b21eb4d2" exitCode=0 Mar 20 08:38:26.471494 master-0 kubenswrapper[7465]: E0320 08:38:26.471399 7465 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" Mar 20 08:38:26.471860 master-0 kubenswrapper[7465]: I0320 08:38:26.471839 7465 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 08:38:26.978426 master-0 kubenswrapper[7465]: I0320 08:38:26.978364 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:26.978426 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:26.978426 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:26.978426 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:26.978837 master-0 kubenswrapper[7465]: I0320 08:38:26.978452 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:27.634176 master-0 kubenswrapper[7465]: E0320 08:38:27.634111 7465 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:38:27.978282 master-0 kubenswrapper[7465]: I0320 08:38:27.978079 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:27.978282 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:27.978282 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:27.978282 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:27.978282 master-0 kubenswrapper[7465]: I0320 08:38:27.978214 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:28.475373 master-0 kubenswrapper[7465]: I0320 08:38:28.475288 7465 patch_prober.go:28] interesting 
pod/authentication-operator-5885bfd7f4-62zrx container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.25:8443/healthz\": dial tcp 10.128.0.25:8443: connect: connection refused" start-of-body= Mar 20 08:38:28.475744 master-0 kubenswrapper[7465]: I0320 08:38:28.475385 7465 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" podUID="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.25:8443/healthz\": dial tcp 10.128.0.25:8443: connect: connection refused" Mar 20 08:38:28.979141 master-0 kubenswrapper[7465]: I0320 08:38:28.979048 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:28.979141 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:28.979141 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:28.979141 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:28.979913 master-0 kubenswrapper[7465]: I0320 08:38:28.979169 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:29.977917 master-0 kubenswrapper[7465]: I0320 08:38:29.977833 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:29.977917 master-0 kubenswrapper[7465]: [-]has-synced failed: reason 
withheld Mar 20 08:38:29.977917 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:29.977917 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:29.978476 master-0 kubenswrapper[7465]: I0320 08:38:29.977938 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:30.190708 master-0 kubenswrapper[7465]: E0320 08:38:30.190593 7465 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 20 08:38:30.977964 master-0 kubenswrapper[7465]: I0320 08:38:30.977924 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:30.977964 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:30.977964 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:30.977964 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:30.978168 master-0 kubenswrapper[7465]: I0320 08:38:30.978004 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:31.290086 master-0 kubenswrapper[7465]: I0320 08:38:31.289994 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-6t5vb_fb0fc10f-5796-4cd5-b8f5-72d678054c24/approver/0.log" Mar 20 08:38:31.290702 master-0 kubenswrapper[7465]: I0320 08:38:31.290483 7465 generic.go:334] "Generic 
(PLEG): container finished" podID="fb0fc10f-5796-4cd5-b8f5-72d678054c24" containerID="11fb9901a3c95280d4ada671324191ca15e473462e8fdb0e196a3a680e2d44a1" exitCode=1 Mar 20 08:38:31.978754 master-0 kubenswrapper[7465]: I0320 08:38:31.978648 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:31.978754 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:31.978754 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:31.978754 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:31.979271 master-0 kubenswrapper[7465]: I0320 08:38:31.978784 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:32.083873 master-0 kubenswrapper[7465]: I0320 08:38:32.083784 7465 patch_prober.go:28] interesting pod/etcd-operator-8544cbcf9c-brhw4 container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.28:8443/healthz\": dial tcp 10.128.0.28:8443: connect: connection refused" start-of-body= Mar 20 08:38:32.083873 master-0 kubenswrapper[7465]: I0320 08:38:32.083850 7465 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" podUID="f046860d-2d54-4746-8ba2-f8e90fa55e38" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.28:8443/healthz\": dial tcp 10.128.0.28:8443: connect: connection refused" Mar 20 08:38:32.979367 master-0 kubenswrapper[7465]: I0320 08:38:32.979229 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:32.979367 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:32.979367 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:32.979367 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:32.980577 master-0 kubenswrapper[7465]: I0320 08:38:32.979364 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:33.978884 master-0 kubenswrapper[7465]: I0320 08:38:33.978758 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:33.978884 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:33.978884 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:33.978884 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:33.980367 master-0 kubenswrapper[7465]: I0320 08:38:33.979590 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:35.015334 master-0 kubenswrapper[7465]: I0320 08:38:35.015090 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:35.015334 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:35.015334 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:35.015334 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:35.016282 master-0 kubenswrapper[7465]: I0320 08:38:35.015315 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:35.978529 master-0 kubenswrapper[7465]: I0320 08:38:35.978454 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:35.978529 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:35.978529 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:35.978529 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:35.978973 master-0 kubenswrapper[7465]: I0320 08:38:35.978557 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:36.472939 master-0 kubenswrapper[7465]: E0320 08:38:36.472841 7465 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms"
Mar 20 08:38:36.536910 master-0 kubenswrapper[7465]: I0320 08:38:36.536806 7465 status_manager.go:851] "Failed to get status for pod" podUID="e180bf9a-03f7-405b-90c3-b2e46008213e" pod="openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods route-controller-manager-7ffc895647-6j97v)"
Mar 20 08:38:36.977382 master-0 kubenswrapper[7465]: I0320 08:38:36.977292 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:36.977382 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:36.977382 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:36.977382 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:36.977900 master-0 kubenswrapper[7465]: I0320 08:38:36.977392 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:37.635316 master-0 kubenswrapper[7465]: E0320 08:38:37.635063 7465 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 08:38:37.635316 master-0 kubenswrapper[7465]: E0320 08:38:37.635153 7465 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 08:38:37.979362 master-0 kubenswrapper[7465]: I0320 08:38:37.979155 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:37.979362 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:37.979362 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:37.979362 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:37.979714 master-0 kubenswrapper[7465]: I0320 08:38:37.979340 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:38.977885 master-0 kubenswrapper[7465]: I0320 08:38:38.977788 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:38.977885 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:38.977885 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:38.977885 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:38.977885 master-0 kubenswrapper[7465]: I0320 08:38:38.977879 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:39.053841 master-0 kubenswrapper[7465]: E0320 08:38:39.053692 7465 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 20 08:38:39.053841 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-autoscaler-operator-866dc4744-xwxg7_openshift-machine-api_21bebade-17fa-444e-92a9-eea53d6cd673_0(eea5c777fab069276340fb8b8e8bc932a48b37c137af14866a4f948ceb435c50): error adding pod openshift-machine-api_cluster-autoscaler-operator-866dc4744-xwxg7 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"eea5c777fab069276340fb8b8e8bc932a48b37c137af14866a4f948ceb435c50" Netns:"/var/run/netns/cf3886d6-d600-4c57-baf9-25c23fb5edd0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=cluster-autoscaler-operator-866dc4744-xwxg7;K8S_POD_INFRA_CONTAINER_ID=eea5c777fab069276340fb8b8e8bc932a48b37c137af14866a4f948ceb435c50;K8S_POD_UID=21bebade-17fa-444e-92a9-eea53d6cd673" Path:"" ERRORED: error configuring pod [openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7] networking: Multus: [openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7/21bebade-17fa-444e-92a9-eea53d6cd673]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-autoscaler-operator-866dc4744-xwxg7 in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-autoscaler-operator-866dc4744-xwxg7 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-autoscaler-operator-866dc4744-xwxg7?timeout=1m0s": context deadline exceeded
Mar 20 08:38:39.053841 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 08:38:39.053841 master-0 kubenswrapper[7465]: >
Mar 20 08:38:39.055442 master-0 kubenswrapper[7465]: E0320 08:38:39.055347 7465 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 20 08:38:39.055442 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-autoscaler-operator-866dc4744-xwxg7_openshift-machine-api_21bebade-17fa-444e-92a9-eea53d6cd673_0(eea5c777fab069276340fb8b8e8bc932a48b37c137af14866a4f948ceb435c50): error adding pod openshift-machine-api_cluster-autoscaler-operator-866dc4744-xwxg7 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"eea5c777fab069276340fb8b8e8bc932a48b37c137af14866a4f948ceb435c50" Netns:"/var/run/netns/cf3886d6-d600-4c57-baf9-25c23fb5edd0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=cluster-autoscaler-operator-866dc4744-xwxg7;K8S_POD_INFRA_CONTAINER_ID=eea5c777fab069276340fb8b8e8bc932a48b37c137af14866a4f948ceb435c50;K8S_POD_UID=21bebade-17fa-444e-92a9-eea53d6cd673" Path:"" ERRORED: error configuring pod [openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7] networking: Multus: [openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7/21bebade-17fa-444e-92a9-eea53d6cd673]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-autoscaler-operator-866dc4744-xwxg7 in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-autoscaler-operator-866dc4744-xwxg7 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-autoscaler-operator-866dc4744-xwxg7?timeout=1m0s": context deadline exceeded
Mar 20 08:38:39.055442 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 08:38:39.055442 master-0 kubenswrapper[7465]: > pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7"
Mar 20 08:38:39.055602 master-0 kubenswrapper[7465]: E0320 08:38:39.055485 7465 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Mar 20 08:38:39.055602 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-autoscaler-operator-866dc4744-xwxg7_openshift-machine-api_21bebade-17fa-444e-92a9-eea53d6cd673_0(eea5c777fab069276340fb8b8e8bc932a48b37c137af14866a4f948ceb435c50): error adding pod openshift-machine-api_cluster-autoscaler-operator-866dc4744-xwxg7 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"eea5c777fab069276340fb8b8e8bc932a48b37c137af14866a4f948ceb435c50" Netns:"/var/run/netns/cf3886d6-d600-4c57-baf9-25c23fb5edd0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=cluster-autoscaler-operator-866dc4744-xwxg7;K8S_POD_INFRA_CONTAINER_ID=eea5c777fab069276340fb8b8e8bc932a48b37c137af14866a4f948ceb435c50;K8S_POD_UID=21bebade-17fa-444e-92a9-eea53d6cd673" Path:"" ERRORED: error configuring pod [openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7] networking: Multus: [openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7/21bebade-17fa-444e-92a9-eea53d6cd673]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-autoscaler-operator-866dc4744-xwxg7 in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-autoscaler-operator-866dc4744-xwxg7 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-autoscaler-operator-866dc4744-xwxg7?timeout=1m0s": context deadline exceeded
Mar 20 08:38:39.055602 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 08:38:39.055602 master-0 kubenswrapper[7465]: > pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7"
Mar 20 08:38:39.055742 master-0 kubenswrapper[7465]: E0320 08:38:39.055621 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cluster-autoscaler-operator-866dc4744-xwxg7_openshift-machine-api(21bebade-17fa-444e-92a9-eea53d6cd673)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cluster-autoscaler-operator-866dc4744-xwxg7_openshift-machine-api(21bebade-17fa-444e-92a9-eea53d6cd673)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-autoscaler-operator-866dc4744-xwxg7_openshift-machine-api_21bebade-17fa-444e-92a9-eea53d6cd673_0(eea5c777fab069276340fb8b8e8bc932a48b37c137af14866a4f948ceb435c50): error adding pod openshift-machine-api_cluster-autoscaler-operator-866dc4744-xwxg7 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"eea5c777fab069276340fb8b8e8bc932a48b37c137af14866a4f948ceb435c50\\\" Netns:\\\"/var/run/netns/cf3886d6-d600-4c57-baf9-25c23fb5edd0\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=cluster-autoscaler-operator-866dc4744-xwxg7;K8S_POD_INFRA_CONTAINER_ID=eea5c777fab069276340fb8b8e8bc932a48b37c137af14866a4f948ceb435c50;K8S_POD_UID=21bebade-17fa-444e-92a9-eea53d6cd673\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7] networking: Multus: [openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7/21bebade-17fa-444e-92a9-eea53d6cd673]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-autoscaler-operator-866dc4744-xwxg7 in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-autoscaler-operator-866dc4744-xwxg7 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-autoscaler-operator-866dc4744-xwxg7?timeout=1m0s\\\": context deadline exceeded\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" podUID="21bebade-17fa-444e-92a9-eea53d6cd673"
Mar 20 08:38:39.202865 master-0 kubenswrapper[7465]: E0320 08:38:39.202790 7465 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 20 08:38:39.202865 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-samples-operator-85f7577d78-mwfgx_openshift-cluster-samples-operator_e3bf8eaf-5f6c-41a6-aaeb-6c921d789466_0(ca850c7655dc9ec8e9a1fbbcce67531da5122d6e0d326f12b4ff2fa81952ebba): error adding pod openshift-cluster-samples-operator_cluster-samples-operator-85f7577d78-mwfgx to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ca850c7655dc9ec8e9a1fbbcce67531da5122d6e0d326f12b4ff2fa81952ebba" Netns:"/var/run/netns/87752d08-ed34-4f8b-a26a-f4ed4ca16e8e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cluster-samples-operator;K8S_POD_NAME=cluster-samples-operator-85f7577d78-mwfgx;K8S_POD_INFRA_CONTAINER_ID=ca850c7655dc9ec8e9a1fbbcce67531da5122d6e0d326f12b4ff2fa81952ebba;K8S_POD_UID=e3bf8eaf-5f6c-41a6-aaeb-6c921d789466" Path:"" ERRORED: error configuring pod [openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx] networking: Multus: [openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-samples-operator-85f7577d78-mwfgx in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-samples-operator-85f7577d78-mwfgx in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-85f7577d78-mwfgx?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 20 08:38:39.202865 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 08:38:39.202865 master-0 kubenswrapper[7465]: >
Mar 20 08:38:39.203052 master-0 kubenswrapper[7465]: E0320 08:38:39.202897 7465 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 20 08:38:39.203052 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-samples-operator-85f7577d78-mwfgx_openshift-cluster-samples-operator_e3bf8eaf-5f6c-41a6-aaeb-6c921d789466_0(ca850c7655dc9ec8e9a1fbbcce67531da5122d6e0d326f12b4ff2fa81952ebba): error adding pod openshift-cluster-samples-operator_cluster-samples-operator-85f7577d78-mwfgx to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ca850c7655dc9ec8e9a1fbbcce67531da5122d6e0d326f12b4ff2fa81952ebba" Netns:"/var/run/netns/87752d08-ed34-4f8b-a26a-f4ed4ca16e8e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cluster-samples-operator;K8S_POD_NAME=cluster-samples-operator-85f7577d78-mwfgx;K8S_POD_INFRA_CONTAINER_ID=ca850c7655dc9ec8e9a1fbbcce67531da5122d6e0d326f12b4ff2fa81952ebba;K8S_POD_UID=e3bf8eaf-5f6c-41a6-aaeb-6c921d789466" Path:"" ERRORED: error configuring pod [openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx] networking: Multus: [openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-samples-operator-85f7577d78-mwfgx in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-samples-operator-85f7577d78-mwfgx in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-85f7577d78-mwfgx?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 20 08:38:39.203052 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 08:38:39.203052 master-0 kubenswrapper[7465]: > pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx"
Mar 20 08:38:39.203052 master-0 kubenswrapper[7465]: E0320 08:38:39.202922 7465 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Mar 20 08:38:39.203052 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-samples-operator-85f7577d78-mwfgx_openshift-cluster-samples-operator_e3bf8eaf-5f6c-41a6-aaeb-6c921d789466_0(ca850c7655dc9ec8e9a1fbbcce67531da5122d6e0d326f12b4ff2fa81952ebba): error adding pod openshift-cluster-samples-operator_cluster-samples-operator-85f7577d78-mwfgx to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ca850c7655dc9ec8e9a1fbbcce67531da5122d6e0d326f12b4ff2fa81952ebba" Netns:"/var/run/netns/87752d08-ed34-4f8b-a26a-f4ed4ca16e8e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cluster-samples-operator;K8S_POD_NAME=cluster-samples-operator-85f7577d78-mwfgx;K8S_POD_INFRA_CONTAINER_ID=ca850c7655dc9ec8e9a1fbbcce67531da5122d6e0d326f12b4ff2fa81952ebba;K8S_POD_UID=e3bf8eaf-5f6c-41a6-aaeb-6c921d789466" Path:"" ERRORED: error configuring pod [openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx] networking: Multus: [openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-samples-operator-85f7577d78-mwfgx in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-samples-operator-85f7577d78-mwfgx in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-85f7577d78-mwfgx?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 20 08:38:39.203052 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 08:38:39.203052 master-0 kubenswrapper[7465]: > pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx"
Mar 20 08:38:39.203052 master-0 kubenswrapper[7465]: E0320 08:38:39.203000 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cluster-samples-operator-85f7577d78-mwfgx_openshift-cluster-samples-operator(e3bf8eaf-5f6c-41a6-aaeb-6c921d789466)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cluster-samples-operator-85f7577d78-mwfgx_openshift-cluster-samples-operator(e3bf8eaf-5f6c-41a6-aaeb-6c921d789466)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-samples-operator-85f7577d78-mwfgx_openshift-cluster-samples-operator_e3bf8eaf-5f6c-41a6-aaeb-6c921d789466_0(ca850c7655dc9ec8e9a1fbbcce67531da5122d6e0d326f12b4ff2fa81952ebba): error adding pod openshift-cluster-samples-operator_cluster-samples-operator-85f7577d78-mwfgx to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"ca850c7655dc9ec8e9a1fbbcce67531da5122d6e0d326f12b4ff2fa81952ebba\\\" Netns:\\\"/var/run/netns/87752d08-ed34-4f8b-a26a-f4ed4ca16e8e\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cluster-samples-operator;K8S_POD_NAME=cluster-samples-operator-85f7577d78-mwfgx;K8S_POD_INFRA_CONTAINER_ID=ca850c7655dc9ec8e9a1fbbcce67531da5122d6e0d326f12b4ff2fa81952ebba;K8S_POD_UID=e3bf8eaf-5f6c-41a6-aaeb-6c921d789466\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx] networking: Multus: [openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-samples-operator-85f7577d78-mwfgx in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-samples-operator-85f7577d78-mwfgx in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-85f7577d78-mwfgx?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" podUID="e3bf8eaf-5f6c-41a6-aaeb-6c921d789466"
Mar 20 08:38:39.222520 master-0 kubenswrapper[7465]: E0320 08:38:39.222404 7465 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 20 08:38:39.222520 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_control-plane-machine-set-operator-6f97756bc8-7t5qv_openshift-machine-api_e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e_0(9dbc34ec166af95a7853428b41cee350fe85c8c54be4aa6e328035aa02204cfc): error adding pod openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-7t5qv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9dbc34ec166af95a7853428b41cee350fe85c8c54be4aa6e328035aa02204cfc" Netns:"/var/run/netns/528b4df4-56d0-4831-a96c-32b3710c2f88" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=control-plane-machine-set-operator-6f97756bc8-7t5qv;K8S_POD_INFRA_CONTAINER_ID=9dbc34ec166af95a7853428b41cee350fe85c8c54be4aa6e328035aa02204cfc;K8S_POD_UID=e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e" Path:"" ERRORED: error configuring pod [openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv] networking: Multus: [openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod control-plane-machine-set-operator-6f97756bc8-7t5qv in out of cluster comm: SetNetworkStatus: failed to update the pod control-plane-machine-set-operator-6f97756bc8-7t5qv in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/control-plane-machine-set-operator-6f97756bc8-7t5qv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 20 08:38:39.222520 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 08:38:39.222520 master-0 kubenswrapper[7465]: >
Mar 20 08:38:39.222732 master-0 kubenswrapper[7465]: E0320 08:38:39.222574 7465 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 20 08:38:39.222732 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_control-plane-machine-set-operator-6f97756bc8-7t5qv_openshift-machine-api_e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e_0(9dbc34ec166af95a7853428b41cee350fe85c8c54be4aa6e328035aa02204cfc): error adding pod openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-7t5qv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9dbc34ec166af95a7853428b41cee350fe85c8c54be4aa6e328035aa02204cfc" Netns:"/var/run/netns/528b4df4-56d0-4831-a96c-32b3710c2f88" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=control-plane-machine-set-operator-6f97756bc8-7t5qv;K8S_POD_INFRA_CONTAINER_ID=9dbc34ec166af95a7853428b41cee350fe85c8c54be4aa6e328035aa02204cfc;K8S_POD_UID=e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e" Path:"" ERRORED: error configuring pod [openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv] networking: Multus: [openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod control-plane-machine-set-operator-6f97756bc8-7t5qv in out of cluster comm: SetNetworkStatus: failed to update the pod control-plane-machine-set-operator-6f97756bc8-7t5qv in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/control-plane-machine-set-operator-6f97756bc8-7t5qv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 20 08:38:39.222732 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 08:38:39.222732 master-0 kubenswrapper[7465]: > pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv"
Mar 20 08:38:39.222732 master-0 kubenswrapper[7465]: E0320 08:38:39.222631 7465 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Mar 20 08:38:39.222732 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_control-plane-machine-set-operator-6f97756bc8-7t5qv_openshift-machine-api_e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e_0(9dbc34ec166af95a7853428b41cee350fe85c8c54be4aa6e328035aa02204cfc): error adding pod openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-7t5qv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9dbc34ec166af95a7853428b41cee350fe85c8c54be4aa6e328035aa02204cfc" Netns:"/var/run/netns/528b4df4-56d0-4831-a96c-32b3710c2f88" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=control-plane-machine-set-operator-6f97756bc8-7t5qv;K8S_POD_INFRA_CONTAINER_ID=9dbc34ec166af95a7853428b41cee350fe85c8c54be4aa6e328035aa02204cfc;K8S_POD_UID=e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e" Path:"" ERRORED: error configuring pod [openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv] networking: Multus: [openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod control-plane-machine-set-operator-6f97756bc8-7t5qv in out of cluster comm: SetNetworkStatus: failed to update the pod control-plane-machine-set-operator-6f97756bc8-7t5qv in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/control-plane-machine-set-operator-6f97756bc8-7t5qv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 20 08:38:39.222732 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 08:38:39.222732 master-0 kubenswrapper[7465]: > pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv"
Mar 20 08:38:39.223113 master-0 kubenswrapper[7465]: E0320 08:38:39.222793 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"control-plane-machine-set-operator-6f97756bc8-7t5qv_openshift-machine-api(e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"control-plane-machine-set-operator-6f97756bc8-7t5qv_openshift-machine-api(e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_control-plane-machine-set-operator-6f97756bc8-7t5qv_openshift-machine-api_e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e_0(9dbc34ec166af95a7853428b41cee350fe85c8c54be4aa6e328035aa02204cfc): error adding pod openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-7t5qv to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"9dbc34ec166af95a7853428b41cee350fe85c8c54be4aa6e328035aa02204cfc\\\" Netns:\\\"/var/run/netns/528b4df4-56d0-4831-a96c-32b3710c2f88\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=control-plane-machine-set-operator-6f97756bc8-7t5qv;K8S_POD_INFRA_CONTAINER_ID=9dbc34ec166af95a7853428b41cee350fe85c8c54be4aa6e328035aa02204cfc;K8S_POD_UID=e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv] networking: Multus: [openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod control-plane-machine-set-operator-6f97756bc8-7t5qv in out of cluster comm: SetNetworkStatus: failed to update the pod control-plane-machine-set-operator-6f97756bc8-7t5qv in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/control-plane-machine-set-operator-6f97756bc8-7t5qv?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" podUID="e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e"
Mar 20 08:38:39.354033 master-0 kubenswrapper[7465]: I0320 08:38:39.353849 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx"
Mar 20 08:38:39.354518 master-0 kubenswrapper[7465]: I0320 08:38:39.354460 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7"
Mar 20 08:38:39.354615 master-0 kubenswrapper[7465]: I0320 08:38:39.354588 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx"
Mar 20 08:38:39.355401 master-0 kubenswrapper[7465]: I0320 08:38:39.355362 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7"
Mar 20 08:38:39.403811 master-0 kubenswrapper[7465]: E0320 08:38:39.403637 7465 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 20 08:38:39.403811 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_insights-operator-68bf6ff9d6-mvfn5_openshift-insights_f2217de0-7805-4f5f-8ea5-93b81b7e0236_0(6e679c83d9e07e8539cd4ee0b9a66261e4e8939cdf7af15cbfbb7e5cfaad82f9): error adding pod openshift-insights_insights-operator-68bf6ff9d6-mvfn5 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6e679c83d9e07e8539cd4ee0b9a66261e4e8939cdf7af15cbfbb7e5cfaad82f9" Netns:"/var/run/netns/06f0a4c8-4a3a-400f-a230-8a6ef7f51225" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-insights;K8S_POD_NAME=insights-operator-68bf6ff9d6-mvfn5;K8S_POD_INFRA_CONTAINER_ID=6e679c83d9e07e8539cd4ee0b9a66261e4e8939cdf7af15cbfbb7e5cfaad82f9;K8S_POD_UID=f2217de0-7805-4f5f-8ea5-93b81b7e0236" Path:"" ERRORED: error configuring pod [openshift-insights/insights-operator-68bf6ff9d6-mvfn5] networking: Multus: [openshift-insights/insights-operator-68bf6ff9d6-mvfn5/f2217de0-7805-4f5f-8ea5-93b81b7e0236]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod insights-operator-68bf6ff9d6-mvfn5 in out of cluster comm: SetNetworkStatus: failed to update the pod insights-operator-68bf6ff9d6-mvfn5 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-68bf6ff9d6-mvfn5?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 20 08:38:39.403811 master-0 kubenswrapper[7465]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:38:39.403811 master-0 kubenswrapper[7465]: > Mar 20 08:38:39.404017 master-0 kubenswrapper[7465]: E0320 08:38:39.403896 7465 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 08:38:39.404017 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_insights-operator-68bf6ff9d6-mvfn5_openshift-insights_f2217de0-7805-4f5f-8ea5-93b81b7e0236_0(6e679c83d9e07e8539cd4ee0b9a66261e4e8939cdf7af15cbfbb7e5cfaad82f9): error adding pod openshift-insights_insights-operator-68bf6ff9d6-mvfn5 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6e679c83d9e07e8539cd4ee0b9a66261e4e8939cdf7af15cbfbb7e5cfaad82f9" Netns:"/var/run/netns/06f0a4c8-4a3a-400f-a230-8a6ef7f51225" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-insights;K8S_POD_NAME=insights-operator-68bf6ff9d6-mvfn5;K8S_POD_INFRA_CONTAINER_ID=6e679c83d9e07e8539cd4ee0b9a66261e4e8939cdf7af15cbfbb7e5cfaad82f9;K8S_POD_UID=f2217de0-7805-4f5f-8ea5-93b81b7e0236" Path:"" ERRORED: error configuring pod [openshift-insights/insights-operator-68bf6ff9d6-mvfn5] networking: Multus: [openshift-insights/insights-operator-68bf6ff9d6-mvfn5/f2217de0-7805-4f5f-8ea5-93b81b7e0236]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod insights-operator-68bf6ff9d6-mvfn5 in out of cluster comm: SetNetworkStatus: failed to update the pod insights-operator-68bf6ff9d6-mvfn5 in out of cluster comm: status update failed for pod /: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-68bf6ff9d6-mvfn5?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:38:39.404017 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:38:39.404017 master-0 kubenswrapper[7465]: > pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:38:39.404017 master-0 kubenswrapper[7465]: E0320 08:38:39.403949 7465 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 08:38:39.404017 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_insights-operator-68bf6ff9d6-mvfn5_openshift-insights_f2217de0-7805-4f5f-8ea5-93b81b7e0236_0(6e679c83d9e07e8539cd4ee0b9a66261e4e8939cdf7af15cbfbb7e5cfaad82f9): error adding pod openshift-insights_insights-operator-68bf6ff9d6-mvfn5 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6e679c83d9e07e8539cd4ee0b9a66261e4e8939cdf7af15cbfbb7e5cfaad82f9" Netns:"/var/run/netns/06f0a4c8-4a3a-400f-a230-8a6ef7f51225" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-insights;K8S_POD_NAME=insights-operator-68bf6ff9d6-mvfn5;K8S_POD_INFRA_CONTAINER_ID=6e679c83d9e07e8539cd4ee0b9a66261e4e8939cdf7af15cbfbb7e5cfaad82f9;K8S_POD_UID=f2217de0-7805-4f5f-8ea5-93b81b7e0236" Path:"" ERRORED: error configuring pod [openshift-insights/insights-operator-68bf6ff9d6-mvfn5] networking: Multus: 
[openshift-insights/insights-operator-68bf6ff9d6-mvfn5/f2217de0-7805-4f5f-8ea5-93b81b7e0236]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod insights-operator-68bf6ff9d6-mvfn5 in out of cluster comm: SetNetworkStatus: failed to update the pod insights-operator-68bf6ff9d6-mvfn5 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-68bf6ff9d6-mvfn5?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:38:39.404017 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:38:39.404017 master-0 kubenswrapper[7465]: > pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:38:39.404304 master-0 kubenswrapper[7465]: E0320 08:38:39.404071 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"insights-operator-68bf6ff9d6-mvfn5_openshift-insights(f2217de0-7805-4f5f-8ea5-93b81b7e0236)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"insights-operator-68bf6ff9d6-mvfn5_openshift-insights(f2217de0-7805-4f5f-8ea5-93b81b7e0236)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_insights-operator-68bf6ff9d6-mvfn5_openshift-insights_f2217de0-7805-4f5f-8ea5-93b81b7e0236_0(6e679c83d9e07e8539cd4ee0b9a66261e4e8939cdf7af15cbfbb7e5cfaad82f9): error adding pod openshift-insights_insights-operator-68bf6ff9d6-mvfn5 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): 
CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"6e679c83d9e07e8539cd4ee0b9a66261e4e8939cdf7af15cbfbb7e5cfaad82f9\\\" Netns:\\\"/var/run/netns/06f0a4c8-4a3a-400f-a230-8a6ef7f51225\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-insights;K8S_POD_NAME=insights-operator-68bf6ff9d6-mvfn5;K8S_POD_INFRA_CONTAINER_ID=6e679c83d9e07e8539cd4ee0b9a66261e4e8939cdf7af15cbfbb7e5cfaad82f9;K8S_POD_UID=f2217de0-7805-4f5f-8ea5-93b81b7e0236\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-insights/insights-operator-68bf6ff9d6-mvfn5] networking: Multus: [openshift-insights/insights-operator-68bf6ff9d6-mvfn5/f2217de0-7805-4f5f-8ea5-93b81b7e0236]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod insights-operator-68bf6ff9d6-mvfn5 in out of cluster comm: SetNetworkStatus: failed to update the pod insights-operator-68bf6ff9d6-mvfn5 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-68bf6ff9d6-mvfn5?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" podUID="f2217de0-7805-4f5f-8ea5-93b81b7e0236" Mar 20 08:38:39.424557 master-0 kubenswrapper[7465]: E0320 08:38:39.424464 7465 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 08:38:39.424557 master-0 kubenswrapper[7465]: rpc error: 
code = Unknown desc = failed to create pod network sandbox k8s_cloud-credential-operator-744f9dbf77-r4qvh_openshift-cloud-credential-operator_469183dd-dc54-467d-82a1-611132ae8ec4_0(41dcb504e61c8c9db109ecb73d18ade9019dd987172fffe0bc5955fea5e2bd2a): error adding pod openshift-cloud-credential-operator_cloud-credential-operator-744f9dbf77-r4qvh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"41dcb504e61c8c9db109ecb73d18ade9019dd987172fffe0bc5955fea5e2bd2a" Netns:"/var/run/netns/9ad75b80-3551-465a-a3ca-a85581db5fef" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cloud-credential-operator;K8S_POD_NAME=cloud-credential-operator-744f9dbf77-r4qvh;K8S_POD_INFRA_CONTAINER_ID=41dcb504e61c8c9db109ecb73d18ade9019dd987172fffe0bc5955fea5e2bd2a;K8S_POD_UID=469183dd-dc54-467d-82a1-611132ae8ec4" Path:"" ERRORED: error configuring pod [openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh] networking: Multus: [openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh/469183dd-dc54-467d-82a1-611132ae8ec4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cloud-credential-operator-744f9dbf77-r4qvh in out of cluster comm: SetNetworkStatus: failed to update the pod cloud-credential-operator-744f9dbf77-r4qvh in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/pods/cloud-credential-operator-744f9dbf77-r4qvh?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:38:39.424557 master-0 kubenswrapper[7465]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:38:39.424557 master-0 kubenswrapper[7465]: > Mar 20 08:38:39.425034 master-0 kubenswrapper[7465]: E0320 08:38:39.424596 7465 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 08:38:39.425034 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cloud-credential-operator-744f9dbf77-r4qvh_openshift-cloud-credential-operator_469183dd-dc54-467d-82a1-611132ae8ec4_0(41dcb504e61c8c9db109ecb73d18ade9019dd987172fffe0bc5955fea5e2bd2a): error adding pod openshift-cloud-credential-operator_cloud-credential-operator-744f9dbf77-r4qvh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"41dcb504e61c8c9db109ecb73d18ade9019dd987172fffe0bc5955fea5e2bd2a" Netns:"/var/run/netns/9ad75b80-3551-465a-a3ca-a85581db5fef" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cloud-credential-operator;K8S_POD_NAME=cloud-credential-operator-744f9dbf77-r4qvh;K8S_POD_INFRA_CONTAINER_ID=41dcb504e61c8c9db109ecb73d18ade9019dd987172fffe0bc5955fea5e2bd2a;K8S_POD_UID=469183dd-dc54-467d-82a1-611132ae8ec4" Path:"" ERRORED: error configuring pod [openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh] networking: Multus: [openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh/469183dd-dc54-467d-82a1-611132ae8ec4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cloud-credential-operator-744f9dbf77-r4qvh in out of cluster comm: SetNetworkStatus: 
failed to update the pod cloud-credential-operator-744f9dbf77-r4qvh in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/pods/cloud-credential-operator-744f9dbf77-r4qvh?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:38:39.425034 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:38:39.425034 master-0 kubenswrapper[7465]: > pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:38:39.425034 master-0 kubenswrapper[7465]: E0320 08:38:39.424643 7465 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 08:38:39.425034 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cloud-credential-operator-744f9dbf77-r4qvh_openshift-cloud-credential-operator_469183dd-dc54-467d-82a1-611132ae8ec4_0(41dcb504e61c8c9db109ecb73d18ade9019dd987172fffe0bc5955fea5e2bd2a): error adding pod openshift-cloud-credential-operator_cloud-credential-operator-744f9dbf77-r4qvh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"41dcb504e61c8c9db109ecb73d18ade9019dd987172fffe0bc5955fea5e2bd2a" Netns:"/var/run/netns/9ad75b80-3551-465a-a3ca-a85581db5fef" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cloud-credential-operator;K8S_POD_NAME=cloud-credential-operator-744f9dbf77-r4qvh;K8S_POD_INFRA_CONTAINER_ID=41dcb504e61c8c9db109ecb73d18ade9019dd987172fffe0bc5955fea5e2bd2a;K8S_POD_UID=469183dd-dc54-467d-82a1-611132ae8ec4" Path:"" ERRORED: error configuring pod [openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh] networking: Multus: [openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh/469183dd-dc54-467d-82a1-611132ae8ec4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cloud-credential-operator-744f9dbf77-r4qvh in out of cluster comm: SetNetworkStatus: failed to update the pod cloud-credential-operator-744f9dbf77-r4qvh in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/pods/cloud-credential-operator-744f9dbf77-r4qvh?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:38:39.425034 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:38:39.425034 master-0 kubenswrapper[7465]: > pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:38:39.425034 master-0 kubenswrapper[7465]: E0320 08:38:39.424800 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cloud-credential-operator-744f9dbf77-r4qvh_openshift-cloud-credential-operator(469183dd-dc54-467d-82a1-611132ae8ec4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"cloud-credential-operator-744f9dbf77-r4qvh_openshift-cloud-credential-operator(469183dd-dc54-467d-82a1-611132ae8ec4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cloud-credential-operator-744f9dbf77-r4qvh_openshift-cloud-credential-operator_469183dd-dc54-467d-82a1-611132ae8ec4_0(41dcb504e61c8c9db109ecb73d18ade9019dd987172fffe0bc5955fea5e2bd2a): error adding pod openshift-cloud-credential-operator_cloud-credential-operator-744f9dbf77-r4qvh to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"41dcb504e61c8c9db109ecb73d18ade9019dd987172fffe0bc5955fea5e2bd2a\\\" Netns:\\\"/var/run/netns/9ad75b80-3551-465a-a3ca-a85581db5fef\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cloud-credential-operator;K8S_POD_NAME=cloud-credential-operator-744f9dbf77-r4qvh;K8S_POD_INFRA_CONTAINER_ID=41dcb504e61c8c9db109ecb73d18ade9019dd987172fffe0bc5955fea5e2bd2a;K8S_POD_UID=469183dd-dc54-467d-82a1-611132ae8ec4\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh] networking: Multus: [openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh/469183dd-dc54-467d-82a1-611132ae8ec4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cloud-credential-operator-744f9dbf77-r4qvh in out of cluster comm: SetNetworkStatus: failed to update the pod cloud-credential-operator-744f9dbf77-r4qvh in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/pods/cloud-credential-operator-744f9dbf77-r4qvh?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" podUID="469183dd-dc54-467d-82a1-611132ae8ec4" Mar 20 08:38:39.468805 master-0 kubenswrapper[7465]: E0320 08:38:39.468736 7465 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 08:38:39.468805 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_8b1c7a56-5d00-468a-bb8d-dbaf8e854951_0(c7f364ac5ac343cc49d7eb09efc4fc8a8e9a655adeeba4e984a9c0ca3e7b9153): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c7f364ac5ac343cc49d7eb09efc4fc8a8e9a655adeeba4e984a9c0ca3e7b9153" Netns:"/var/run/netns/60c6990c-6415-4a0b-ba8b-e62820d5aea5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=c7f364ac5ac343cc49d7eb09efc4fc8a8e9a655adeeba4e984a9c0ca3e7b9153;K8S_POD_UID=8b1c7a56-5d00-468a-bb8d-dbaf8e854951" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/8b1c7a56-5d00-468a-bb8d-dbaf8e854951]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out 
of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": context deadline exceeded Mar 20 08:38:39.468805 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:38:39.468805 master-0 kubenswrapper[7465]: > Mar 20 08:38:39.469004 master-0 kubenswrapper[7465]: E0320 08:38:39.468859 7465 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 08:38:39.469004 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_8b1c7a56-5d00-468a-bb8d-dbaf8e854951_0(c7f364ac5ac343cc49d7eb09efc4fc8a8e9a655adeeba4e984a9c0ca3e7b9153): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c7f364ac5ac343cc49d7eb09efc4fc8a8e9a655adeeba4e984a9c0ca3e7b9153" Netns:"/var/run/netns/60c6990c-6415-4a0b-ba8b-e62820d5aea5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=c7f364ac5ac343cc49d7eb09efc4fc8a8e9a655adeeba4e984a9c0ca3e7b9153;K8S_POD_UID=8b1c7a56-5d00-468a-bb8d-dbaf8e854951" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: 
[openshift-kube-controller-manager/installer-2-master-0/8b1c7a56-5d00-468a-bb8d-dbaf8e854951]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": context deadline exceeded Mar 20 08:38:39.469004 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:38:39.469004 master-0 kubenswrapper[7465]: > pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:38:39.469004 master-0 kubenswrapper[7465]: E0320 08:38:39.468900 7465 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 08:38:39.469004 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_8b1c7a56-5d00-468a-bb8d-dbaf8e854951_0(c7f364ac5ac343cc49d7eb09efc4fc8a8e9a655adeeba4e984a9c0ca3e7b9153): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c7f364ac5ac343cc49d7eb09efc4fc8a8e9a655adeeba4e984a9c0ca3e7b9153" Netns:"/var/run/netns/60c6990c-6415-4a0b-ba8b-e62820d5aea5" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=c7f364ac5ac343cc49d7eb09efc4fc8a8e9a655adeeba4e984a9c0ca3e7b9153;K8S_POD_UID=8b1c7a56-5d00-468a-bb8d-dbaf8e854951" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/8b1c7a56-5d00-468a-bb8d-dbaf8e854951]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": context deadline exceeded Mar 20 08:38:39.469004 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:38:39.469004 master-0 kubenswrapper[7465]: > pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:38:39.469208 master-0 kubenswrapper[7465]: E0320 08:38:39.469037 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"installer-2-master-0_openshift-kube-controller-manager(8b1c7a56-5d00-468a-bb8d-dbaf8e854951)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"installer-2-master-0_openshift-kube-controller-manager(8b1c7a56-5d00-468a-bb8d-dbaf8e854951)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_installer-2-master-0_openshift-kube-controller-manager_8b1c7a56-5d00-468a-bb8d-dbaf8e854951_0(c7f364ac5ac343cc49d7eb09efc4fc8a8e9a655adeeba4e984a9c0ca3e7b9153): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"c7f364ac5ac343cc49d7eb09efc4fc8a8e9a655adeeba4e984a9c0ca3e7b9153\\\" Netns:\\\"/var/run/netns/60c6990c-6415-4a0b-ba8b-e62820d5aea5\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=c7f364ac5ac343cc49d7eb09efc4fc8a8e9a655adeeba4e984a9c0ca3e7b9153;K8S_POD_UID=8b1c7a56-5d00-468a-bb8d-dbaf8e854951\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/8b1c7a56-5d00-468a-bb8d-dbaf8e854951]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s\\\": context deadline exceeded\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-kube-controller-manager/installer-2-master-0" podUID="8b1c7a56-5d00-468a-bb8d-dbaf8e854951"
Mar 20 08:38:39.521743 master-0 kubenswrapper[7465]: E0320 08:38:39.521676 7465 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 20 08:38:39.521743 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_machine-api-operator-6fbb6cf6f9-n8tnn_openshift-machine-api_c7f5e6cd-e093-409a-8758-d3db7a7eb32c_0(a48640ea906a4e679db30d8ed63617d252b5836d193a04691bdbafc25f9a7a5d): error adding pod openshift-machine-api_machine-api-operator-6fbb6cf6f9-n8tnn to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a48640ea906a4e679db30d8ed63617d252b5836d193a04691bdbafc25f9a7a5d" Netns:"/var/run/netns/60ecd76f-554b-4846-9272-ed180fe57e60" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=machine-api-operator-6fbb6cf6f9-n8tnn;K8S_POD_INFRA_CONTAINER_ID=a48640ea906a4e679db30d8ed63617d252b5836d193a04691bdbafc25f9a7a5d;K8S_POD_UID=c7f5e6cd-e093-409a-8758-d3db7a7eb32c" Path:"" ERRORED: error configuring pod [openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn] networking: Multus: [openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn/c7f5e6cd-e093-409a-8758-d3db7a7eb32c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod machine-api-operator-6fbb6cf6f9-n8tnn in out of cluster comm: SetNetworkStatus: failed to update the pod machine-api-operator-6fbb6cf6f9-n8tnn in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/machine-api-operator-6fbb6cf6f9-n8tnn?timeout=1m0s": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
Mar 20 08:38:39.521743 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 08:38:39.521743 master-0 kubenswrapper[7465]: >
Mar 20 08:38:39.521962 master-0 kubenswrapper[7465]: E0320 08:38:39.521777 7465 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 20 08:38:39.521962 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_machine-api-operator-6fbb6cf6f9-n8tnn_openshift-machine-api_c7f5e6cd-e093-409a-8758-d3db7a7eb32c_0(a48640ea906a4e679db30d8ed63617d252b5836d193a04691bdbafc25f9a7a5d): error adding pod openshift-machine-api_machine-api-operator-6fbb6cf6f9-n8tnn to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a48640ea906a4e679db30d8ed63617d252b5836d193a04691bdbafc25f9a7a5d" Netns:"/var/run/netns/60ecd76f-554b-4846-9272-ed180fe57e60" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=machine-api-operator-6fbb6cf6f9-n8tnn;K8S_POD_INFRA_CONTAINER_ID=a48640ea906a4e679db30d8ed63617d252b5836d193a04691bdbafc25f9a7a5d;K8S_POD_UID=c7f5e6cd-e093-409a-8758-d3db7a7eb32c" Path:"" ERRORED: error configuring pod [openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn] networking: Multus: [openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn/c7f5e6cd-e093-409a-8758-d3db7a7eb32c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod machine-api-operator-6fbb6cf6f9-n8tnn in out of cluster comm: SetNetworkStatus: failed to update the pod machine-api-operator-6fbb6cf6f9-n8tnn in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/machine-api-operator-6fbb6cf6f9-n8tnn?timeout=1m0s": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
Mar 20 08:38:39.521962 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 08:38:39.521962 master-0 kubenswrapper[7465]: > pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn"
Mar 20 08:38:39.521962 master-0 kubenswrapper[7465]: E0320 08:38:39.521803 7465 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Mar 20 08:38:39.521962 master-0 kubenswrapper[7465]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_machine-api-operator-6fbb6cf6f9-n8tnn_openshift-machine-api_c7f5e6cd-e093-409a-8758-d3db7a7eb32c_0(a48640ea906a4e679db30d8ed63617d252b5836d193a04691bdbafc25f9a7a5d): error adding pod openshift-machine-api_machine-api-operator-6fbb6cf6f9-n8tnn to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a48640ea906a4e679db30d8ed63617d252b5836d193a04691bdbafc25f9a7a5d" Netns:"/var/run/netns/60ecd76f-554b-4846-9272-ed180fe57e60" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=machine-api-operator-6fbb6cf6f9-n8tnn;K8S_POD_INFRA_CONTAINER_ID=a48640ea906a4e679db30d8ed63617d252b5836d193a04691bdbafc25f9a7a5d;K8S_POD_UID=c7f5e6cd-e093-409a-8758-d3db7a7eb32c" Path:"" ERRORED: error configuring pod [openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn] networking: Multus: [openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn/c7f5e6cd-e093-409a-8758-d3db7a7eb32c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod machine-api-operator-6fbb6cf6f9-n8tnn in out of cluster comm: SetNetworkStatus: failed to update the pod machine-api-operator-6fbb6cf6f9-n8tnn in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/machine-api-operator-6fbb6cf6f9-n8tnn?timeout=1m0s": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
Mar 20 08:38:39.521962 master-0 kubenswrapper[7465]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 08:38:39.521962 master-0 kubenswrapper[7465]: > pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn"
Mar 20 08:38:39.521962 master-0 kubenswrapper[7465]: E0320 08:38:39.521882 7465 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"machine-api-operator-6fbb6cf6f9-n8tnn_openshift-machine-api(c7f5e6cd-e093-409a-8758-d3db7a7eb32c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"machine-api-operator-6fbb6cf6f9-n8tnn_openshift-machine-api(c7f5e6cd-e093-409a-8758-d3db7a7eb32c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_machine-api-operator-6fbb6cf6f9-n8tnn_openshift-machine-api_c7f5e6cd-e093-409a-8758-d3db7a7eb32c_0(a48640ea906a4e679db30d8ed63617d252b5836d193a04691bdbafc25f9a7a5d): error adding pod openshift-machine-api_machine-api-operator-6fbb6cf6f9-n8tnn to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"a48640ea906a4e679db30d8ed63617d252b5836d193a04691bdbafc25f9a7a5d\\\" Netns:\\\"/var/run/netns/60ecd76f-554b-4846-9272-ed180fe57e60\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=machine-api-operator-6fbb6cf6f9-n8tnn;K8S_POD_INFRA_CONTAINER_ID=a48640ea906a4e679db30d8ed63617d252b5836d193a04691bdbafc25f9a7a5d;K8S_POD_UID=c7f5e6cd-e093-409a-8758-d3db7a7eb32c\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn] networking: Multus: [openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn/c7f5e6cd-e093-409a-8758-d3db7a7eb32c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod machine-api-operator-6fbb6cf6f9-n8tnn in out of cluster comm: SetNetworkStatus: failed to update the pod machine-api-operator-6fbb6cf6f9-n8tnn in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/machine-api-operator-6fbb6cf6f9-n8tnn?timeout=1m0s\\\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" podUID="c7f5e6cd-e093-409a-8758-d3db7a7eb32c"
Mar 20 08:38:39.977273 master-0 kubenswrapper[7465]: I0320 08:38:39.977168 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:39.977273 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:39.977273 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:39.977273 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:39.977696 master-0 kubenswrapper[7465]: I0320 08:38:39.977279 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:40.360079 master-0 kubenswrapper[7465]: I0320 08:38:40.360028 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn"
Mar 20 08:38:40.360769 master-0 kubenswrapper[7465]: I0320 08:38:40.360028 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh"
Mar 20 08:38:40.360769 master-0 kubenswrapper[7465]: I0320 08:38:40.360038 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 20 08:38:40.360769 master-0 kubenswrapper[7465]: I0320 08:38:40.360621 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn"
Mar 20 08:38:40.360769 master-0 kubenswrapper[7465]: I0320 08:38:40.360638 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh"
Mar 20 08:38:40.360998 master-0 kubenswrapper[7465]: I0320 08:38:40.360978 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5"
Mar 20 08:38:40.361141 master-0 kubenswrapper[7465]: I0320 08:38:40.361101 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 20 08:38:40.361833 master-0 kubenswrapper[7465]: I0320 08:38:40.361818 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5"
Mar 20 08:38:40.554816 master-0 kubenswrapper[7465]: E0320 08:38:40.554732 7465 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0"
Mar 20 08:38:40.555463 master-0 kubenswrapper[7465]: E0320 08:38:40.555438 7465 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.018s"
Mar 20 08:38:40.559934 master-0 kubenswrapper[7465]: I0320 08:38:40.558801 7465 scope.go:117] "RemoveContainer" containerID="11fb9901a3c95280d4ada671324191ca15e473462e8fdb0e196a3a680e2d44a1"
Mar 20 08:38:40.566125 master-0 kubenswrapper[7465]: I0320 08:38:40.566093 7465 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 20 08:38:40.978369 master-0 kubenswrapper[7465]: I0320 08:38:40.978161 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:40.978369 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:40.978369 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:40.978369 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:40.978369 master-0 kubenswrapper[7465]: I0320 08:38:40.978308 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:41.369025 master-0 kubenswrapper[7465]: I0320 08:38:41.368962 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-6t5vb_fb0fc10f-5796-4cd5-b8f5-72d678054c24/approver/0.log"
Mar 20 08:38:41.978429 master-0 kubenswrapper[7465]: I0320 08:38:41.978300 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:41.978429 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:41.978429 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:41.978429 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:41.978429 master-0 kubenswrapper[7465]: I0320 08:38:41.978391 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:42.554591 master-0 kubenswrapper[7465]: E0320 08:38:42.554145 7465 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{router-default-7dcf5569b5-xmvwz.189e7fd50f5861ae openshift-ingress 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-ingress,Name:router-default-7dcf5569b5-xmvwz,UID:91b2899e-8d24-41a0-bec8-d11c67b8f955,APIVersion:v1,ResourceVersion:8055,FieldPath:spec.containers{router},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\" in 28.25s (28.25s including waiting). Image size: 487159945 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:37:36.292766126 +0000 UTC m=+81.936081616,LastTimestamp:2026-03-20 08:37:36.292766126 +0000 UTC m=+81.936081616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 20 08:38:42.978454 master-0 kubenswrapper[7465]: I0320 08:38:42.978340 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:42.978454 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:42.978454 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:42.978454 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:42.979034 master-0 kubenswrapper[7465]: I0320 08:38:42.978466 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:43.384663 master-0 kubenswrapper[7465]: I0320 08:38:43.384556 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-gzg9m_ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/ingress-operator/0.log"
Mar 20 08:38:43.384663 master-0 kubenswrapper[7465]: I0320 08:38:43.384651 7465 generic.go:334] "Generic (PLEG): container finished" podID="ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02" containerID="0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79" exitCode=1
Mar 20 08:38:43.978334 master-0 kubenswrapper[7465]: I0320 08:38:43.978235 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:43.978334 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:43.978334 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:43.978334 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:43.979633 master-0 kubenswrapper[7465]: I0320 08:38:43.978362 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:44.977167 master-0 kubenswrapper[7465]: I0320 08:38:44.976972 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:44.977167 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:44.977167 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:44.977167 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:44.977167 master-0 kubenswrapper[7465]: I0320 08:38:44.977118 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:45.979675 master-0 kubenswrapper[7465]: I0320 08:38:45.979570 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:45.979675 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:45.979675 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:45.979675 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:45.980790 master-0 kubenswrapper[7465]: I0320 08:38:45.979706 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:46.570961 master-0 kubenswrapper[7465]: E0320 08:38:46.570898 7465 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="6.015s"
Mar 20 08:38:46.571294 master-0 kubenswrapper[7465]: I0320 08:38:46.571030 7465 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:38:46.571294 master-0 kubenswrapper[7465]: I0320 08:38:46.571051 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"e1d21f11-7386-4a04-a82e-5a03f3602a3b","Type":"ContainerDied","Data":"fc9fcf2245b5e00e0473ecdf9c16e18d2e148c7fa6e4f86bf8df81bc8b274006"}
Mar 20 08:38:46.571294 master-0 kubenswrapper[7465]: I0320 08:38:46.571082 7465 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc9fcf2245b5e00e0473ecdf9c16e18d2e148c7fa6e4f86bf8df81bc8b274006"
Mar 20 08:38:46.571294 master-0 kubenswrapper[7465]: I0320 08:38:46.571102 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:38:46.572894 master-0 kubenswrapper[7465]: I0320 08:38:46.572720 7465 scope.go:117] "RemoveContainer" containerID="d498d1943e73378963ddf1fcb87e4851f496c741db4b494c02cde7a06a7405b7"
Mar 20 08:38:46.573702 master-0 kubenswrapper[7465]: I0320 08:38:46.572977 7465 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"8009103455f6e5b4ff5637d33a8290182816ff88ad007a0262e9e5800739bab4"} pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" containerMessage="Container authentication-operator failed liveness probe, will be restarted"
Mar 20 08:38:46.573702 master-0 kubenswrapper[7465]: I0320 08:38:46.573052 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" podUID="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" containerName="authentication-operator" containerID="cri-o://8009103455f6e5b4ff5637d33a8290182816ff88ad007a0262e9e5800739bab4" gracePeriod=30
Mar 20 08:38:46.573702 master-0 kubenswrapper[7465]: I0320 08:38:46.573372 7465 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"17402333a3400904f60b3f059728c9faa13bc6bb53c95e63d5e8325a42104e2f"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 20 08:38:46.573702 master-0 kubenswrapper[7465]: I0320 08:38:46.573476 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" containerID="cri-o://17402333a3400904f60b3f059728c9faa13bc6bb53c95e63d5e8325a42104e2f" gracePeriod=30
Mar 20 08:38:46.596035 master-0 kubenswrapper[7465]: I0320 08:38:46.590805 7465 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.602738 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" event={"ID":"fa759777-de22-4440-a3d3-ad429a3b8e7b","Type":"ContainerDied","Data":"9fc8932c5b9da1ed7961a714ed5b905683dbfe51746690c485563d67ef3b0b29"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.602807 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" event={"ID":"ad692349-5089-4afc-85b2-9b6e7997567c","Type":"ContainerDied","Data":"871b2216305e9883e9345351545dfa140768d0b1a9975612361ac0d841fda34c"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.602830 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" event={"ID":"325f0a83-d56d-4b62-977b-088a7d5f0e00","Type":"ContainerDied","Data":"d498d1943e73378963ddf1fcb87e4851f496c741db4b494c02cde7a06a7405b7"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.602849 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"a2139218314ea5d5d1e04c37be758e7a9f90c106dd3c470737be6550fb6322a9"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.602867 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" event={"ID":"c2a23d24-9e09-431e-8c3b-8456ff51a8d0","Type":"ContainerDied","Data":"4f3597589c3cef89a127e7d3fec06145ebc7ceb92ec5a686f505064f491c35a3"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.602892 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" event={"ID":"f046860d-2d54-4746-8ba2-f8e90fa55e38","Type":"ContainerDied","Data":"0bd5b879aadefbf9a44c6c6dcf866736c73041c10aa61588b2570b45b21eb4d2"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.602916 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"267f2d3e5624276bc815692d6f63750c35fab88bf4fad9637c60210f294ab470"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.602930 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"5de11809fbb3db5b6981fddb634a5dbf7f162fcbe9eede8cb63026b2ff7e2a3e"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.602942 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"91928dd4bf037a74fc3110c950269e5b4ae8998e3616107aa1170ce1d3fede55"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.602955 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"c0c0eafff8c825fc9c4a32593e8d54d61ac68f27a7fde59d8dfb857aeb1580f0"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.602968 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-6t5vb" event={"ID":"fb0fc10f-5796-4cd5-b8f5-72d678054c24","Type":"ContainerDied","Data":"11fb9901a3c95280d4ada671324191ca15e473462e8fdb0e196a3a680e2d44a1"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.602985 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"336ee2eca239c702b23f8eadc224486f445c6fd4853f373a73d423d9f64cfcac"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.602999 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-6t5vb" event={"ID":"fb0fc10f-5796-4cd5-b8f5-72d678054c24","Type":"ContainerStarted","Data":"b38f5242ee8fec0a4fb77638e3088bf483c8bc44e65e9e4b954af76f0ae77a90"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.603012 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" event={"ID":"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02","Type":"ContainerDied","Data":"0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79"}
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.603794 7465 scope.go:117] "RemoveContainer" containerID="0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79"
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.604900 7465 scope.go:117] "RemoveContainer" containerID="9fc8932c5b9da1ed7961a714ed5b905683dbfe51746690c485563d67ef3b0b29"
Mar 20 08:38:46.606402 master-0 kubenswrapper[7465]: I0320 08:38:46.605628 7465 scope.go:117] "RemoveContainer" containerID="871b2216305e9883e9345351545dfa140768d0b1a9975612361ac0d841fda34c"
Mar 20 08:38:46.613846 master-0 kubenswrapper[7465]: I0320 08:38:46.609740 7465 scope.go:117] "RemoveContainer" containerID="4f3597589c3cef89a127e7d3fec06145ebc7ceb92ec5a686f505064f491c35a3"
Mar 20 08:38:46.616469 master-0 kubenswrapper[7465]: I0320 08:38:46.616143 7465 scope.go:117] "RemoveContainer" containerID="0bd5b879aadefbf9a44c6c6dcf866736c73041c10aa61588b2570b45b21eb4d2"
Mar 20 08:38:46.625282 master-0 kubenswrapper[7465]: W0320 08:38:46.623253 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21bebade_17fa_444e_92a9_eea53d6cd673.slice/crio-0e8cf476b590f62cf998ca26bf5d07d7ec24559896e92dec128d4a67e132990a WatchSource:0}: Error finding container 0e8cf476b590f62cf998ca26bf5d07d7ec24559896e92dec128d4a67e132990a: Status 404 returned error can't find the container with id 0e8cf476b590f62cf998ca26bf5d07d7ec24559896e92dec128d4a67e132990a
Mar 20 08:38:46.632083 master-0 kubenswrapper[7465]: W0320 08:38:46.631934 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8b1c7a56_5d00_468a_bb8d_dbaf8e854951.slice/crio-ff84ae96bfd96fbd48f779854c556cddaa04ed8b8b78d40e920af9da64968681 WatchSource:0}: Error finding container ff84ae96bfd96fbd48f779854c556cddaa04ed8b8b78d40e920af9da64968681: Status 404 returned error can't find the container with id ff84ae96bfd96fbd48f779854c556cddaa04ed8b8b78d40e920af9da64968681
Mar 20 08:38:46.646619 master-0 kubenswrapper[7465]: I0320 08:38:46.642840 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 20 08:38:46.646619 master-0 kubenswrapper[7465]: I0320 08:38:46.642869 7465 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="52261137-67d7-4c16-b3a6-125a22c5eab5"
Mar 20 08:38:46.656303 master-0 kubenswrapper[7465]: I0320 08:38:46.649923 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 20 08:38:46.656303 master-0 kubenswrapper[7465]: I0320 08:38:46.649959 7465 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="52261137-67d7-4c16-b3a6-125a22c5eab5"
Mar 20 08:38:46.656303 master-0 kubenswrapper[7465]: W0320 08:38:46.655457 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7f5e6cd_e093_409a_8758_d3db7a7eb32c.slice/crio-60bbe3130a4d3341abf02d519702749eab204c658db33e134e22b746126c5516 WatchSource:0}: Error finding container 60bbe3130a4d3341abf02d519702749eab204c658db33e134e22b746126c5516: Status 404 returned error can't find the container with id 60bbe3130a4d3341abf02d519702749eab204c658db33e134e22b746126c5516
Mar 20 08:38:46.660801 master-0 kubenswrapper[7465]: I0320 08:38:46.659374 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 20 08:38:46.663333 master-0 kubenswrapper[7465]: I0320 08:38:46.663298 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn"]
Mar 20 08:38:46.667367 master-0 kubenswrapper[7465]: I0320 08:38:46.667315 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh"]
Mar 20 08:38:46.669708 master-0 kubenswrapper[7465]: I0320 08:38:46.669658 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7"]
Mar 20 08:38:46.672710 master-0 kubenswrapper[7465]: I0320 08:38:46.672510 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-mvfn5"]
Mar 20 08:38:46.676962 master-0 kubenswrapper[7465]: I0320 08:38:46.676931 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx"]
Mar 20 08:38:46.682321 master-0 kubenswrapper[7465]: I0320 08:38:46.682163 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=94.682130808 podStartE2EDuration="1m34.682130808s" podCreationTimestamp="2026-03-20 08:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:38:46.623218238 +0000 UTC m=+152.266533728" watchObservedRunningTime="2026-03-20 08:38:46.682130808 +0000 UTC m=+152.325446308"
Mar 20 08:38:46.687221 master-0 kubenswrapper[7465]: I0320 08:38:46.687121 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" podStartSLOduration=78.923464295 podStartE2EDuration="1m24.687103003s" podCreationTimestamp="2026-03-20 08:37:22 +0000 UTC" firstStartedPulling="2026-03-20 08:37:39.080911442 +0000 UTC m=+84.724226932" lastFinishedPulling="2026-03-20 08:37:44.84455012 +0000 UTC m=+90.487865640" observedRunningTime="2026-03-20 08:38:46.654715187 +0000 UTC m=+152.298030677" watchObservedRunningTime="2026-03-20 08:38:46.687103003 +0000 UTC m=+152.330418493"
Mar 20 08:38:46.688401 master-0 kubenswrapper[7465]: I0320 08:38:46.688343 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"]
Mar 20 08:38:46.690975 master-0 kubenswrapper[7465]: I0320 08:38:46.690913 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podStartSLOduration=96.440686097 podStartE2EDuration="2m4.690901713s" podCreationTimestamp="2026-03-20 08:36:42 +0000 UTC" firstStartedPulling="2026-03-20 08:37:08.04253354 +0000 UTC m=+53.685849020" lastFinishedPulling="2026-03-20 08:37:36.292749146 +0000 UTC m=+81.936064636" observedRunningTime="2026-03-20 08:38:46.681458858 +0000 UTC m=+152.324774348" watchObservedRunningTime="2026-03-20 08:38:46.690901713 +0000 UTC m=+152.334217203"
Mar 20 08:38:46.730298 master-0 kubenswrapper[7465]: I0320 08:38:46.730245 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 20 08:38:46.737234 master-0 kubenswrapper[7465]: I0320 08:38:46.737161 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 20 08:38:46.889077 master-0 kubenswrapper[7465]: I0320 08:38:46.887786 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6486d766f9-5b77h"]
Mar 20 08:38:46.891357 master-0 kubenswrapper[7465]: I0320 08:38:46.890413 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6486d766f9-5b77h"]
Mar 20 08:38:46.914995 master-0 kubenswrapper[7465]: I0320 08:38:46.914920 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j" podStartSLOduration=74.920296951 podStartE2EDuration="1m43.91489788s" podCreationTimestamp="2026-03-20 08:37:03 +0000 UTC" firstStartedPulling="2026-03-20 08:37:09.099864606 +0000 UTC m=+54.743180086" lastFinishedPulling="2026-03-20 08:37:38.094465525 +0000 UTC m=+83.737781015" observedRunningTime="2026-03-20 08:38:46.91179821 +0000 UTC m=+152.555113700" watchObservedRunningTime="2026-03-20 08:38:46.91489788 +0000 UTC m=+152.558213370"
Mar 20 08:38:46.957802 master-0 kubenswrapper[7465]: I0320 08:38:46.957725 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-srjqw" podStartSLOduration=64.88027492 podStartE2EDuration="1m42.95770322s" podCreationTimestamp="2026-03-20 08:37:04 +0000 UTC" firstStartedPulling="2026-03-20 08:37:06.696275832 +0000 UTC m=+52.339591332" lastFinishedPulling="2026-03-20 08:37:44.773704102 +0000 UTC m=+90.417019632" observedRunningTime="2026-03-20 08:38:46.955423963 +0000 UTC m=+152.598739453" watchObservedRunningTime="2026-03-20 08:38:46.95770322 +0000 UTC m=+152.601018710"
Mar 20 08:38:46.977399 master-0 kubenswrapper[7465]: I0320 08:38:46.977352 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:38:46.977399 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:38:46.977399 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:38:46.977399 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:38:46.977585 master-0 kubenswrapper[7465]: I0320 08:38:46.977428 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:38:47.003711 master-0 kubenswrapper[7465]: I0320 08:38:47.002671 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" podStartSLOduration=71.450029399 podStartE2EDuration="1m18.002647021s" podCreationTimestamp="2026-03-20 08:37:29 +0000 UTC" firstStartedPulling="2026-03-20 08:37:38.290674191 +0000 UTC m=+83.933989681" lastFinishedPulling="2026-03-20 08:37:44.843291813 +0000 UTC m=+90.486607303" observedRunningTime="2026-03-20 08:38:46.981090502 +0000 UTC m=+152.624406002" watchObservedRunningTime="2026-03-20 08:38:47.002647021 +0000 UTC m=+152.645962511"
Mar 20 08:38:47.045383 master-0 kubenswrapper[7465]: I0320 08:38:47.045291 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zspn5" podStartSLOduration=67.470971571 podStartE2EDuration="1m40.045266365s" podCreationTimestamp="2026-03-20 08:37:07 +0000 UTC" firstStartedPulling="2026-03-20 08:37:11.923816626 +0000 UTC
m=+57.567132116" lastFinishedPulling="2026-03-20 08:37:44.49811138 +0000 UTC m=+90.141426910" observedRunningTime="2026-03-20 08:38:47.043271447 +0000 UTC m=+152.686586937" watchObservedRunningTime="2026-03-20 08:38:47.045266365 +0000 UTC m=+152.688581855" Mar 20 08:38:47.122751 master-0 kubenswrapper[7465]: I0320 08:38:47.122685 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 20 08:38:47.135910 master-0 kubenswrapper[7465]: I0320 08:38:47.135855 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 20 08:38:47.155587 master-0 kubenswrapper[7465]: I0320 08:38:47.155509 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mgcb9" podStartSLOduration=63.401334203 podStartE2EDuration="1m42.155483751s" podCreationTimestamp="2026-03-20 08:37:05 +0000 UTC" firstStartedPulling="2026-03-20 08:37:07.741595988 +0000 UTC m=+53.384911468" lastFinishedPulling="2026-03-20 08:37:46.495745526 +0000 UTC m=+92.139061016" observedRunningTime="2026-03-20 08:38:47.15268496 +0000 UTC m=+152.796000460" watchObservedRunningTime="2026-03-20 08:38:47.155483751 +0000 UTC m=+152.798799241" Mar 20 08:38:47.182662 master-0 kubenswrapper[7465]: I0320 08:38:47.182596 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-86g9n" podStartSLOduration=66.213253235 podStartE2EDuration="1m41.182573572s" podCreationTimestamp="2026-03-20 08:37:06 +0000 UTC" firstStartedPulling="2026-03-20 08:37:09.876479899 +0000 UTC m=+55.519795389" lastFinishedPulling="2026-03-20 08:37:44.845800226 +0000 UTC m=+90.489115726" observedRunningTime="2026-03-20 08:38:47.181846971 +0000 UTC m=+152.825162461" watchObservedRunningTime="2026-03-20 08:38:47.182573572 +0000 UTC m=+152.825889052" Mar 20 08:38:47.254604 master-0 kubenswrapper[7465]: I0320 08:38:47.254131 7465 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"] Mar 20 08:38:47.275127 master-0 kubenswrapper[7465]: I0320 08:38:47.269944 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7ffc895647-6j97v"] Mar 20 08:38:47.421561 master-0 kubenswrapper[7465]: I0320 08:38:47.421454 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" event={"ID":"fa759777-de22-4440-a3d3-ad429a3b8e7b","Type":"ContainerStarted","Data":"ef88f829645fabb894212937333305cd3d87e5b021830721d4e5e9b594609690"} Mar 20 08:38:47.424803 master-0 kubenswrapper[7465]: I0320 08:38:47.423695 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" event={"ID":"21bebade-17fa-444e-92a9-eea53d6cd673","Type":"ContainerStarted","Data":"a74f65fd79254bc2069d4d58186204f182bbdda409cb8c9a6055b2b1614423bb"} Mar 20 08:38:47.424803 master-0 kubenswrapper[7465]: I0320 08:38:47.424022 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" event={"ID":"21bebade-17fa-444e-92a9-eea53d6cd673","Type":"ContainerStarted","Data":"0e8cf476b590f62cf998ca26bf5d07d7ec24559896e92dec128d4a67e132990a"} Mar 20 08:38:47.425992 master-0 kubenswrapper[7465]: I0320 08:38:47.425950 7465 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="17402333a3400904f60b3f059728c9faa13bc6bb53c95e63d5e8325a42104e2f" exitCode=2 Mar 20 08:38:47.426055 master-0 kubenswrapper[7465]: I0320 08:38:47.426036 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" 
event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"17402333a3400904f60b3f059728c9faa13bc6bb53c95e63d5e8325a42104e2f"} Mar 20 08:38:47.426378 master-0 kubenswrapper[7465]: I0320 08:38:47.426347 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"8cdf7ffa9625537bd484b3cd72f3ca62a1fbd66303b800564461ec0e3e2735c7"} Mar 20 08:38:47.426425 master-0 kubenswrapper[7465]: I0320 08:38:47.426390 7465 scope.go:117] "RemoveContainer" containerID="8f140ecbb839c536b9c01be26f60184770878f8b44764513599810ad9344fdb7" Mar 20 08:38:47.428623 master-0 kubenswrapper[7465]: I0320 08:38:47.428579 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" event={"ID":"c2a23d24-9e09-431e-8c3b-8456ff51a8d0","Type":"ContainerStarted","Data":"bf7423bac144bcaaf3719ed8e76389e5f2ec9717aa4868ad4761ed7cc6782d76"} Mar 20 08:38:47.431399 master-0 kubenswrapper[7465]: I0320 08:38:47.431323 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" event={"ID":"f2217de0-7805-4f5f-8ea5-93b81b7e0236","Type":"ContainerStarted","Data":"1038dded4ac6146a3ef7e05fc425b32ac120e0351ec2aaee7b8ebe45679034dd"} Mar 20 08:38:47.438878 master-0 kubenswrapper[7465]: I0320 08:38:47.438819 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" event={"ID":"c7f5e6cd-e093-409a-8758-d3db7a7eb32c","Type":"ContainerStarted","Data":"b78496ecf995c35d24dfd3908193418c538ce4e684ecf45bdc674a187caf26f7"} Mar 20 08:38:47.438955 master-0 kubenswrapper[7465]: I0320 08:38:47.438883 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" 
event={"ID":"c7f5e6cd-e093-409a-8758-d3db7a7eb32c","Type":"ContainerStarted","Data":"60bbe3130a4d3341abf02d519702749eab204c658db33e134e22b746126c5516"} Mar 20 08:38:47.445815 master-0 kubenswrapper[7465]: I0320 08:38:47.445739 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" event={"ID":"469183dd-dc54-467d-82a1-611132ae8ec4","Type":"ContainerStarted","Data":"47b82c3aabac1e522a2b9825a1bdcce46331b472c4fd92d80f06797cd3c1f73f"} Mar 20 08:38:47.445934 master-0 kubenswrapper[7465]: I0320 08:38:47.445825 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" event={"ID":"469183dd-dc54-467d-82a1-611132ae8ec4","Type":"ContainerStarted","Data":"df1df4af888713c77332d729a24c1e1fdb472ce369b8165f8ad6dfbe7c60bbd6"} Mar 20 08:38:47.446641 master-0 kubenswrapper[7465]: I0320 08:38:47.446547 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=1.446520035 podStartE2EDuration="1.446520035s" podCreationTimestamp="2026-03-20 08:38:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:38:47.445176415 +0000 UTC m=+153.088491925" watchObservedRunningTime="2026-03-20 08:38:47.446520035 +0000 UTC m=+153.089835525" Mar 20 08:38:47.461547 master-0 kubenswrapper[7465]: I0320 08:38:47.461250 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-gzg9m_ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/ingress-operator/0.log" Mar 20 08:38:47.461547 master-0 kubenswrapper[7465]: I0320 08:38:47.461398 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" 
event={"ID":"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02","Type":"ContainerStarted","Data":"32acfc021b8f8071fac0cc1a8b0129efcea8236c65c56620ec15567dda3b37db"} Mar 20 08:38:47.463550 master-0 kubenswrapper[7465]: I0320 08:38:47.463369 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" event={"ID":"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466","Type":"ContainerStarted","Data":"ead213f06d6e13b0b8afce02cff25edfe82c583b53f661ee9bdc498f394f53a9"} Mar 20 08:38:47.474680 master-0 kubenswrapper[7465]: I0320 08:38:47.474602 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" event={"ID":"325f0a83-d56d-4b62-977b-088a7d5f0e00","Type":"ContainerStarted","Data":"3fb2d44fc3d06ba7cfb01123d6eb14daa319280841df97c7fb0370eae6efe992"} Mar 20 08:38:47.478804 master-0 kubenswrapper[7465]: I0320 08:38:47.478743 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" event={"ID":"f046860d-2d54-4746-8ba2-f8e90fa55e38","Type":"ContainerStarted","Data":"c3742feb1f4aa394282e45f9e7e1ad5a78209b23e0c120a4f3b31f9fa95097bc"} Mar 20 08:38:47.483335 master-0 kubenswrapper[7465]: I0320 08:38:47.483285 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"8b1c7a56-5d00-468a-bb8d-dbaf8e854951","Type":"ContainerStarted","Data":"ff84ae96bfd96fbd48f779854c556cddaa04ed8b8b78d40e920af9da64968681"} Mar 20 08:38:47.491913 master-0 kubenswrapper[7465]: I0320 08:38:47.491864 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7bd846bfc4-mt454_ad692349-5089-4afc-85b2-9b6e7997567c/network-operator/0.log" Mar 20 08:38:47.492972 master-0 kubenswrapper[7465]: I0320 08:38:47.492924 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" event={"ID":"ad692349-5089-4afc-85b2-9b6e7997567c","Type":"ContainerStarted","Data":"b10e547edcdc3314e5e478ee6b910f608083e53cf3a4277550ec5bfade59f20f"} Mar 20 08:38:47.618256 master-0 kubenswrapper[7465]: I0320 08:38:47.618146 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 20 08:38:47.626215 master-0 kubenswrapper[7465]: I0320 08:38:47.625705 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 20 08:38:47.656551 master-0 kubenswrapper[7465]: I0320 08:38:47.656267 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 20 08:38:47.716882 master-0 kubenswrapper[7465]: I0320 08:38:47.716790 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=91.716767031 podStartE2EDuration="1m31.716767031s" podCreationTimestamp="2026-03-20 08:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:38:47.712749694 +0000 UTC m=+153.356065184" watchObservedRunningTime="2026-03-20 08:38:47.716767031 +0000 UTC m=+153.360082541" Mar 20 08:38:47.978698 master-0 kubenswrapper[7465]: I0320 08:38:47.978333 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:47.978698 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:47.978698 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:47.978698 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:47.978698 master-0 kubenswrapper[7465]: I0320 
08:38:47.978466 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:48.522355 master-0 kubenswrapper[7465]: I0320 08:38:48.522295 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"8b1c7a56-5d00-468a-bb8d-dbaf8e854951","Type":"ContainerStarted","Data":"5b4a47b78349fa5185bcf45526d28c821dd34bc78966a86b575a5f0037835565"} Mar 20 08:38:48.536448 master-0 kubenswrapper[7465]: I0320 08:38:48.535830 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 20 08:38:48.548901 master-0 kubenswrapper[7465]: I0320 08:38:48.548840 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="590dd533-c8db-42e3-9485-1c9df719773f" path="/var/lib/kubelet/pods/590dd533-c8db-42e3-9485-1c9df719773f/volumes" Mar 20 08:38:48.549645 master-0 kubenswrapper[7465]: I0320 08:38:48.549611 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8682b669-c173-4b96-80f6-029292f5c25b" path="/var/lib/kubelet/pods/8682b669-c173-4b96-80f6-029292f5c25b/volumes" Mar 20 08:38:48.550308 master-0 kubenswrapper[7465]: I0320 08:38:48.550275 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d17f953e-3ca4-4bd5-ad89-678447774687" path="/var/lib/kubelet/pods/d17f953e-3ca4-4bd5-ad89-678447774687/volumes" Mar 20 08:38:48.551083 master-0 kubenswrapper[7465]: I0320 08:38:48.551042 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e180bf9a-03f7-405b-90c3-b2e46008213e" path="/var/lib/kubelet/pods/e180bf9a-03f7-405b-90c3-b2e46008213e/volumes" Mar 20 08:38:48.977310 master-0 kubenswrapper[7465]: I0320 08:38:48.977249 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:48.977310 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:48.977310 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:48.977310 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:48.977644 master-0 kubenswrapper[7465]: I0320 08:38:48.977334 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:49.981216 master-0 kubenswrapper[7465]: I0320 08:38:49.979354 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:49.981216 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:49.981216 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:49.981216 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:49.981216 master-0 kubenswrapper[7465]: I0320 08:38:49.979435 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:50.982703 master-0 kubenswrapper[7465]: I0320 08:38:50.982631 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:50.982703 master-0 
kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:50.982703 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:50.982703 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:50.983467 master-0 kubenswrapper[7465]: I0320 08:38:50.982711 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:51.537056 master-0 kubenswrapper[7465]: I0320 08:38:51.536827 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" Mar 20 08:38:51.538310 master-0 kubenswrapper[7465]: I0320 08:38:51.538277 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" Mar 20 08:38:51.605698 master-0 kubenswrapper[7465]: I0320 08:38:51.605666 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:38:51.617786 master-0 kubenswrapper[7465]: I0320 08:38:51.617726 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:38:51.777119 master-0 kubenswrapper[7465]: I0320 08:38:51.777085 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-62zrx_29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/authentication-operator/1.log" Mar 20 08:38:51.778791 master-0 kubenswrapper[7465]: I0320 08:38:51.778743 7465 generic.go:334] "Generic (PLEG): container finished" podID="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" containerID="8009103455f6e5b4ff5637d33a8290182816ff88ad007a0262e9e5800739bab4" exitCode=255 Mar 20 08:38:51.780530 
master-0 kubenswrapper[7465]: I0320 08:38:51.780502 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" event={"ID":"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6","Type":"ContainerDied","Data":"8009103455f6e5b4ff5637d33a8290182816ff88ad007a0262e9e5800739bab4"} Mar 20 08:38:51.780605 master-0 kubenswrapper[7465]: I0320 08:38:51.780547 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:38:51.780605 master-0 kubenswrapper[7465]: I0320 08:38:51.780562 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" event={"ID":"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6","Type":"ContainerStarted","Data":"df7ec56bc0dc6a5103a746a24bbb9fc1482c902df08dcd67e4b6e70f5d055d5f"} Mar 20 08:38:51.780675 master-0 kubenswrapper[7465]: I0320 08:38:51.780610 7465 scope.go:117] "RemoveContainer" containerID="b2c7cbe5708ed7a3530e1dc35eccab2ac0970444664ce50722925f65c5f61474" Mar 20 08:38:51.983245 master-0 kubenswrapper[7465]: I0320 08:38:51.983097 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:51.983245 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:51.983245 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:51.983245 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:51.983858 master-0 kubenswrapper[7465]: I0320 08:38:51.983233 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 
08:38:52.053762 master-0 kubenswrapper[7465]: I0320 08:38:52.053694 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv"] Mar 20 08:38:52.796074 master-0 kubenswrapper[7465]: I0320 08:38:52.795971 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" event={"ID":"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466","Type":"ContainerStarted","Data":"3ed993094a62661a225afe193a23c8a2caab31f2640837dfe5f3c3a7f7e685b0"} Mar 20 08:38:52.796074 master-0 kubenswrapper[7465]: I0320 08:38:52.796052 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" event={"ID":"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466","Type":"ContainerStarted","Data":"c7891d6c35a440903f4bcbd4a40f2e1fa48f1550526df732c517bd9b6c44c0c9"} Mar 20 08:38:52.800691 master-0 kubenswrapper[7465]: I0320 08:38:52.800625 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" event={"ID":"21bebade-17fa-444e-92a9-eea53d6cd673","Type":"ContainerStarted","Data":"8b198f10122a271a46d1da5f2f799d55468d2123b4b2ad74d6f0cb05641e6136"} Mar 20 08:38:52.803039 master-0 kubenswrapper[7465]: I0320 08:38:52.803003 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-62zrx_29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/authentication-operator/1.log" Mar 20 08:38:52.809536 master-0 kubenswrapper[7465]: I0320 08:38:52.809468 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" event={"ID":"f2217de0-7805-4f5f-8ea5-93b81b7e0236","Type":"ContainerStarted","Data":"54740cdff38742d3ffc49ed74ce0bcac4131631c1e86f7422b93ffc4f5462afe"} Mar 20 08:38:52.811725 master-0 kubenswrapper[7465]: I0320 08:38:52.811648 7465 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" event={"ID":"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e","Type":"ContainerStarted","Data":"f8c21c05090492f9afafa02ead2a469af0d1260ed484823064a0610864bf15d8"} Mar 20 08:38:52.819902 master-0 kubenswrapper[7465]: I0320 08:38:52.819745 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" podStartSLOduration=83.181590487 podStartE2EDuration="1m27.81971952s" podCreationTimestamp="2026-03-20 08:37:25 +0000 UTC" firstStartedPulling="2026-03-20 08:38:46.963501299 +0000 UTC m=+152.606816789" lastFinishedPulling="2026-03-20 08:38:51.601630332 +0000 UTC m=+157.244945822" observedRunningTime="2026-03-20 08:38:52.817045942 +0000 UTC m=+158.460361452" watchObservedRunningTime="2026-03-20 08:38:52.81971952 +0000 UTC m=+158.463035020" Mar 20 08:38:52.842540 master-0 kubenswrapper[7465]: I0320 08:38:52.842451 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" podStartSLOduration=81.218991467 podStartE2EDuration="1m25.842429992s" podCreationTimestamp="2026-03-20 08:37:27 +0000 UTC" firstStartedPulling="2026-03-20 08:38:46.982211815 +0000 UTC m=+152.625527305" lastFinishedPulling="2026-03-20 08:38:51.60565034 +0000 UTC m=+157.248965830" observedRunningTime="2026-03-20 08:38:52.837818028 +0000 UTC m=+158.481133528" watchObservedRunningTime="2026-03-20 08:38:52.842429992 +0000 UTC m=+158.485745482" Mar 20 08:38:52.857487 master-0 kubenswrapper[7465]: I0320 08:38:52.857397 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" podStartSLOduration=79.917362606 podStartE2EDuration="1m24.857377899s" podCreationTimestamp="2026-03-20 08:37:28 +0000 UTC" firstStartedPulling="2026-03-20 
08:38:46.694437987 +0000 UTC m=+152.337753477" lastFinishedPulling="2026-03-20 08:38:51.63445328 +0000 UTC m=+157.277768770" observedRunningTime="2026-03-20 08:38:52.854872066 +0000 UTC m=+158.498187556" watchObservedRunningTime="2026-03-20 08:38:52.857377899 +0000 UTC m=+158.500693389" Mar 20 08:38:52.977749 master-0 kubenswrapper[7465]: I0320 08:38:52.977657 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:52.977749 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:52.977749 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:52.977749 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:52.978140 master-0 kubenswrapper[7465]: I0320 08:38:52.977780 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:53.978044 master-0 kubenswrapper[7465]: I0320 08:38:53.977943 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:53.978044 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:53.978044 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:53.978044 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:53.978044 master-0 kubenswrapper[7465]: I0320 08:38:53.978029 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:54.978431 master-0 kubenswrapper[7465]: I0320 08:38:54.978274 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:54.978431 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:54.978431 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:54.978431 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:54.978431 master-0 kubenswrapper[7465]: I0320 08:38:54.978379 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:55.978492 master-0 kubenswrapper[7465]: I0320 08:38:55.978401 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:55.978492 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:55.978492 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:55.978492 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:55.978492 master-0 kubenswrapper[7465]: I0320 08:38:55.978493 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:56.978127 master-0 kubenswrapper[7465]: I0320 08:38:56.978024 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:56.978127 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:56.978127 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:56.978127 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:56.978481 master-0 kubenswrapper[7465]: I0320 08:38:56.978244 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:57.977747 master-0 kubenswrapper[7465]: I0320 08:38:57.977672 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:57.977747 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:57.977747 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:57.977747 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:57.978450 master-0 kubenswrapper[7465]: I0320 08:38:57.977756 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:58.978848 master-0 kubenswrapper[7465]: I0320 08:38:58.978735 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:58.978848 master-0 
kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:58.978848 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:58.978848 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:58.980034 master-0 kubenswrapper[7465]: I0320 08:38:58.978854 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:38:59.868224 master-0 kubenswrapper[7465]: I0320 08:38:59.868130 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" event={"ID":"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e","Type":"ContainerStarted","Data":"322d73e8b151dad5452501bac1f7dfab899c0c317c5ec70fec1dfe654509113e"} Mar 20 08:38:59.869972 master-0 kubenswrapper[7465]: I0320 08:38:59.869885 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" event={"ID":"c7f5e6cd-e093-409a-8758-d3db7a7eb32c","Type":"ContainerStarted","Data":"43664e36cb7b60519ab710dfbfcb9bd2c63951d962e394659ce8bb21e98ebbb9"} Mar 20 08:38:59.871625 master-0 kubenswrapper[7465]: I0320 08:38:59.871576 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" event={"ID":"469183dd-dc54-467d-82a1-611132ae8ec4","Type":"ContainerStarted","Data":"476d7d163398e477e8e01c588ecf93d6f7b1021117a57e97b3cab709add5591d"} Mar 20 08:38:59.885595 master-0 kubenswrapper[7465]: I0320 08:38:59.885494 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" podStartSLOduration=92.417785659 podStartE2EDuration="1m39.885471967s" podCreationTimestamp="2026-03-20 08:37:20 +0000 UTC" firstStartedPulling="2026-03-20 
08:38:52.074929964 +0000 UTC m=+157.718245444" lastFinishedPulling="2026-03-20 08:38:59.542616262 +0000 UTC m=+165.185931752" observedRunningTime="2026-03-20 08:38:59.88420651 +0000 UTC m=+165.527522020" watchObservedRunningTime="2026-03-20 08:38:59.885471967 +0000 UTC m=+165.528787457" Mar 20 08:38:59.918800 master-0 kubenswrapper[7465]: I0320 08:38:59.918621 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" podStartSLOduration=77.469888697 podStartE2EDuration="1m29.918583564s" podCreationTimestamp="2026-03-20 08:37:30 +0000 UTC" firstStartedPulling="2026-03-20 08:38:47.086244941 +0000 UTC m=+152.729560431" lastFinishedPulling="2026-03-20 08:38:59.534939818 +0000 UTC m=+165.178255298" observedRunningTime="2026-03-20 08:38:59.913952738 +0000 UTC m=+165.557268218" watchObservedRunningTime="2026-03-20 08:38:59.918583564 +0000 UTC m=+165.561899104" Mar 20 08:38:59.940654 master-0 kubenswrapper[7465]: I0320 08:38:59.940542 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" podStartSLOduration=82.396943509 podStartE2EDuration="1m34.940515824s" podCreationTimestamp="2026-03-20 08:37:25 +0000 UTC" firstStartedPulling="2026-03-20 08:38:47.027644521 +0000 UTC m=+152.670960011" lastFinishedPulling="2026-03-20 08:38:59.571216836 +0000 UTC m=+165.214532326" observedRunningTime="2026-03-20 08:38:59.938911507 +0000 UTC m=+165.582226997" watchObservedRunningTime="2026-03-20 08:38:59.940515824 +0000 UTC m=+165.583831314" Mar 20 08:38:59.979414 master-0 kubenswrapper[7465]: I0320 08:38:59.979333 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:38:59.979414 master-0 
kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:38:59.979414 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:38:59.979414 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:38:59.980225 master-0 kubenswrapper[7465]: I0320 08:38:59.979456 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:00.978019 master-0 kubenswrapper[7465]: I0320 08:39:00.977917 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:00.978019 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:00.978019 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:00.978019 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:00.978019 master-0 kubenswrapper[7465]: I0320 08:39:00.977997 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:01.978853 master-0 kubenswrapper[7465]: I0320 08:39:01.978778 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:01.978853 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:01.978853 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:01.978853 master-0 kubenswrapper[7465]: healthz check failed Mar 20 
08:39:01.979952 master-0 kubenswrapper[7465]: I0320 08:39:01.978894 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:02.977105 master-0 kubenswrapper[7465]: I0320 08:39:02.977033 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:02.977105 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:02.977105 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:02.977105 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:02.977462 master-0 kubenswrapper[7465]: I0320 08:39:02.977114 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:03.734171 master-0 kubenswrapper[7465]: I0320 08:39:03.734109 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 20 08:39:03.978763 master-0 kubenswrapper[7465]: I0320 08:39:03.978636 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:03.978763 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:03.978763 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:03.978763 master-0 kubenswrapper[7465]: healthz check failed Mar 20 
08:39:03.978763 master-0 kubenswrapper[7465]: I0320 08:39:03.978721 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:04.978780 master-0 kubenswrapper[7465]: I0320 08:39:04.978570 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:04.978780 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:04.978780 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:04.978780 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:04.978780 master-0 kubenswrapper[7465]: I0320 08:39:04.978657 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:05.290063 master-0 kubenswrapper[7465]: I0320 08:39:05.289900 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-62zrx_29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/authentication-operator/1.log" Mar 20 08:39:05.488705 master-0 kubenswrapper[7465]: I0320 08:39:05.488620 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-62zrx_29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/authentication-operator/2.log" Mar 20 08:39:05.688875 master-0 kubenswrapper[7465]: I0320 08:39:05.688792 7465 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-7dcf5569b5-xmvwz_91b2899e-8d24-41a0-bec8-d11c67b8f955/router/0.log" Mar 20 08:39:05.887410 master-0 kubenswrapper[7465]: I0320 08:39:05.887316 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-bc9b556d6-vdnq2_3de37144-a9ab-45fb-a23f-2287a5198edf/fix-audit-permissions/0.log" Mar 20 08:39:05.978455 master-0 kubenswrapper[7465]: I0320 08:39:05.978243 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:05.978455 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:05.978455 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:05.978455 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:05.978455 master-0 kubenswrapper[7465]: I0320 08:39:05.978347 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:06.099220 master-0 kubenswrapper[7465]: I0320 08:39:06.099123 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-bc9b556d6-vdnq2_3de37144-a9ab-45fb-a23f-2287a5198edf/oauth-apiserver/0.log" Mar 20 08:39:06.224958 master-0 kubenswrapper[7465]: I0320 08:39:06.224897 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"] Mar 20 08:39:06.225315 master-0 kubenswrapper[7465]: E0320 08:39:06.225296 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d21f11-7386-4a04-a82e-5a03f3602a3b" containerName="installer" Mar 20 08:39:06.225386 master-0 kubenswrapper[7465]: I0320 
08:39:06.225318 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d21f11-7386-4a04-a82e-5a03f3602a3b" containerName="installer" Mar 20 08:39:06.225386 master-0 kubenswrapper[7465]: E0320 08:39:06.225341 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4490a747-da2d-4f1a-8986-bc2c1c58424b" containerName="installer" Mar 20 08:39:06.225386 master-0 kubenswrapper[7465]: I0320 08:39:06.225350 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="4490a747-da2d-4f1a-8986-bc2c1c58424b" containerName="installer" Mar 20 08:39:06.225386 master-0 kubenswrapper[7465]: E0320 08:39:06.225373 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17f953e-3ca4-4bd5-ad89-678447774687" containerName="installer" Mar 20 08:39:06.225386 master-0 kubenswrapper[7465]: I0320 08:39:06.225383 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17f953e-3ca4-4bd5-ad89-678447774687" containerName="installer" Mar 20 08:39:06.225603 master-0 kubenswrapper[7465]: E0320 08:39:06.225398 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8682b669-c173-4b96-80f6-029292f5c25b" containerName="installer" Mar 20 08:39:06.225603 master-0 kubenswrapper[7465]: I0320 08:39:06.225406 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="8682b669-c173-4b96-80f6-029292f5c25b" containerName="installer" Mar 20 08:39:06.225603 master-0 kubenswrapper[7465]: E0320 08:39:06.225422 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590dd533-c8db-42e3-9485-1c9df719773f" containerName="controller-manager" Mar 20 08:39:06.225603 master-0 kubenswrapper[7465]: I0320 08:39:06.225430 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="590dd533-c8db-42e3-9485-1c9df719773f" containerName="controller-manager" Mar 20 08:39:06.225603 master-0 kubenswrapper[7465]: E0320 08:39:06.225445 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e180bf9a-03f7-405b-90c3-b2e46008213e" 
containerName="route-controller-manager" Mar 20 08:39:06.225603 master-0 kubenswrapper[7465]: I0320 08:39:06.225454 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="e180bf9a-03f7-405b-90c3-b2e46008213e" containerName="route-controller-manager" Mar 20 08:39:06.225603 master-0 kubenswrapper[7465]: I0320 08:39:06.225596 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17f953e-3ca4-4bd5-ad89-678447774687" containerName="installer" Mar 20 08:39:06.225863 master-0 kubenswrapper[7465]: I0320 08:39:06.225622 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="590dd533-c8db-42e3-9485-1c9df719773f" containerName="controller-manager" Mar 20 08:39:06.225863 master-0 kubenswrapper[7465]: I0320 08:39:06.225636 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="4490a747-da2d-4f1a-8986-bc2c1c58424b" containerName="installer" Mar 20 08:39:06.225863 master-0 kubenswrapper[7465]: I0320 08:39:06.225647 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1d21f11-7386-4a04-a82e-5a03f3602a3b" containerName="installer" Mar 20 08:39:06.225863 master-0 kubenswrapper[7465]: I0320 08:39:06.225666 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="e180bf9a-03f7-405b-90c3-b2e46008213e" containerName="route-controller-manager" Mar 20 08:39:06.225863 master-0 kubenswrapper[7465]: I0320 08:39:06.225683 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="8682b669-c173-4b96-80f6-029292f5c25b" containerName="installer" Mar 20 08:39:06.226362 master-0 kubenswrapper[7465]: I0320 08:39:06.226343 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:39:06.227328 master-0 kubenswrapper[7465]: I0320 08:39:06.227267 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"] Mar 20 08:39:06.228136 master-0 kubenswrapper[7465]: I0320 08:39:06.228110 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.229505 master-0 kubenswrapper[7465]: I0320 08:39:06.229426 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 08:39:06.229968 master-0 kubenswrapper[7465]: I0320 08:39:06.229942 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 08:39:06.230828 master-0 kubenswrapper[7465]: I0320 08:39:06.230768 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 08:39:06.232878 master-0 kubenswrapper[7465]: I0320 08:39:06.232849 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 08:39:06.233051 master-0 kubenswrapper[7465]: I0320 08:39:06.233034 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 08:39:06.233281 master-0 kubenswrapper[7465]: I0320 08:39:06.233264 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 08:39:06.233393 master-0 kubenswrapper[7465]: I0320 08:39:06.233335 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 08:39:06.233469 master-0 kubenswrapper[7465]: I0320 08:39:06.233421 7465 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 08:39:06.233525 master-0 kubenswrapper[7465]: I0320 08:39:06.233460 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 08:39:06.234256 master-0 kubenswrapper[7465]: I0320 08:39:06.234230 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 08:39:06.255898 master-0 kubenswrapper[7465]: I0320 08:39:06.255807 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 08:39:06.259712 master-0 kubenswrapper[7465]: I0320 08:39:06.259649 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"] Mar 20 08:39:06.273619 master-0 kubenswrapper[7465]: I0320 08:39:06.273166 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"] Mar 20 08:39:06.290822 master-0 kubenswrapper[7465]: I0320 08:39:06.290727 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-brhw4_f046860d-2d54-4746-8ba2-f8e90fa55e38/etcd-operator/0.log" Mar 20 08:39:06.423335 master-0 kubenswrapper[7465]: I0320 08:39:06.423214 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a638c468-010c-4da3-ad62-26f5f2bbdbb9-serving-cert\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:39:06.423662 master-0 kubenswrapper[7465]: I0320 08:39:06.423356 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-client-ca\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.423662 master-0 kubenswrapper[7465]: I0320 08:39:06.423406 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5v9f\" (UniqueName: \"kubernetes.io/projected/a9a9ecf2-c476-4962-8333-21f242dbcb89-kube-api-access-d5v9f\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.423662 master-0 kubenswrapper[7465]: I0320 08:39:06.423512 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-client-ca\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:39:06.423870 master-0 kubenswrapper[7465]: I0320 08:39:06.423673 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdbds\" (UniqueName: \"kubernetes.io/projected/a638c468-010c-4da3-ad62-26f5f2bbdbb9-kube-api-access-sdbds\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:39:06.423870 master-0 kubenswrapper[7465]: I0320 08:39:06.423747 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-config\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: 
\"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.423870 master-0 kubenswrapper[7465]: I0320 08:39:06.423857 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-proxy-ca-bundles\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.424052 master-0 kubenswrapper[7465]: I0320 08:39:06.423974 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-config\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:39:06.424052 master-0 kubenswrapper[7465]: I0320 08:39:06.424009 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9a9ecf2-c476-4962-8333-21f242dbcb89-serving-cert\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.489825 master-0 kubenswrapper[7465]: I0320 08:39:06.489631 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-brhw4_f046860d-2d54-4746-8ba2-f8e90fa55e38/etcd-operator/1.log" Mar 20 08:39:06.525720 master-0 kubenswrapper[7465]: I0320 08:39:06.525609 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-config\") pod 
\"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:39:06.525720 master-0 kubenswrapper[7465]: I0320 08:39:06.525710 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9a9ecf2-c476-4962-8333-21f242dbcb89-serving-cert\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.526083 master-0 kubenswrapper[7465]: I0320 08:39:06.525793 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a638c468-010c-4da3-ad62-26f5f2bbdbb9-serving-cert\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:39:06.526083 master-0 kubenswrapper[7465]: I0320 08:39:06.525840 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-client-ca\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.526083 master-0 kubenswrapper[7465]: I0320 08:39:06.525978 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5v9f\" (UniqueName: \"kubernetes.io/projected/a9a9ecf2-c476-4962-8333-21f242dbcb89-kube-api-access-d5v9f\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.526358 master-0 kubenswrapper[7465]: I0320 08:39:06.526122 7465 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-client-ca\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:39:06.526358 master-0 kubenswrapper[7465]: I0320 08:39:06.526246 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdbds\" (UniqueName: \"kubernetes.io/projected/a638c468-010c-4da3-ad62-26f5f2bbdbb9-kube-api-access-sdbds\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:39:06.526358 master-0 kubenswrapper[7465]: I0320 08:39:06.526305 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-config\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.526673 master-0 kubenswrapper[7465]: I0320 08:39:06.526397 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-proxy-ca-bundles\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.527915 master-0 kubenswrapper[7465]: I0320 08:39:06.527778 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-config\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: 
\"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:39:06.531301 master-0 kubenswrapper[7465]: I0320 08:39:06.529723 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-client-ca\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:39:06.531301 master-0 kubenswrapper[7465]: I0320 08:39:06.530518 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-proxy-ca-bundles\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.531301 master-0 kubenswrapper[7465]: I0320 08:39:06.530910 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-config\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.531301 master-0 kubenswrapper[7465]: I0320 08:39:06.531285 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9a9ecf2-c476-4962-8333-21f242dbcb89-serving-cert\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:39:06.532062 master-0 kubenswrapper[7465]: I0320 08:39:06.532028 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-client-ca\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:39:06.535375 master-0 kubenswrapper[7465]: I0320 08:39:06.534593 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a638c468-010c-4da3-ad62-26f5f2bbdbb9-serving-cert\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:39:06.547599 master-0 kubenswrapper[7465]: I0320 08:39:06.547517 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdbds\" (UniqueName: \"kubernetes.io/projected/a638c468-010c-4da3-ad62-26f5f2bbdbb9-kube-api-access-sdbds\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:39:06.552091 master-0 kubenswrapper[7465]: I0320 08:39:06.551999 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:39:06.564651 master-0 kubenswrapper[7465]: I0320 08:39:06.564580 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5v9f\" (UniqueName: \"kubernetes.io/projected/a9a9ecf2-c476-4962-8333-21f242dbcb89-kube-api-access-d5v9f\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:39:06.577912 master-0 kubenswrapper[7465]: I0320 08:39:06.577831 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:39:06.711602 master-0 kubenswrapper[7465]: I0320 08:39:06.711478 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/setup/0.log"
Mar 20 08:39:06.884850 master-0 kubenswrapper[7465]: I0320 08:39:06.884763 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-ensure-env-vars/0.log"
Mar 20 08:39:06.978477 master-0 kubenswrapper[7465]: I0320 08:39:06.978407 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:39:06.978477 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:39:06.978477 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:39:06.978477 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:39:06.979587 master-0 kubenswrapper[7465]: I0320 08:39:06.978483 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:39:07.066965 master-0 kubenswrapper[7465]: I0320 08:39:07.066856 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"]
Mar 20 08:39:07.073514 master-0 kubenswrapper[7465]: W0320 08:39:07.073422 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda638c468_010c_4da3_ad62_26f5f2bbdbb9.slice/crio-e9d1c009ab8bfc1b5d9e5ffc8e765a713719b93c5edffb019bac7a72776dcb9e WatchSource:0}: Error finding container e9d1c009ab8bfc1b5d9e5ffc8e765a713719b93c5edffb019bac7a72776dcb9e: Status 404 returned error can't find the container with id e9d1c009ab8bfc1b5d9e5ffc8e765a713719b93c5edffb019bac7a72776dcb9e
Mar 20 08:39:07.088429 master-0 kubenswrapper[7465]: I0320 08:39:07.088368 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-resources-copy/0.log"
Mar 20 08:39:07.136160 master-0 kubenswrapper[7465]: I0320 08:39:07.135759 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"]
Mar 20 08:39:07.147523 master-0 kubenswrapper[7465]: W0320 08:39:07.147425 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9a9ecf2_c476_4962_8333_21f242dbcb89.slice/crio-0885038f20b8301926b3fbf1704c402c0cf75d3a9c64af977d4466bc2f1fe1b6 WatchSource:0}: Error finding container 0885038f20b8301926b3fbf1704c402c0cf75d3a9c64af977d4466bc2f1fe1b6: Status 404 returned error can't find the container with id 0885038f20b8301926b3fbf1704c402c0cf75d3a9c64af977d4466bc2f1fe1b6
Mar 20 08:39:07.293120 master-0 kubenswrapper[7465]: I0320 08:39:07.291750 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcdctl/0.log"
Mar 20 08:39:07.500721 master-0 kubenswrapper[7465]: I0320 08:39:07.500559 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd/0.log"
Mar 20 08:39:07.695396 master-0 kubenswrapper[7465]: I0320 08:39:07.695331 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log"
Mar 20 08:39:07.884559 master-0 kubenswrapper[7465]: I0320 08:39:07.884506 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-readyz/0.log"
Mar 20 08:39:07.937075 master-0 kubenswrapper[7465]: I0320 08:39:07.936983 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" event={"ID":"a638c468-010c-4da3-ad62-26f5f2bbdbb9","Type":"ContainerStarted","Data":"ba86f095c655a63b2bbaa35e0d6ce4a38b529d13f07b295310237aeaa7d4bc4f"}
Mar 20 08:39:07.937075 master-0 kubenswrapper[7465]: I0320 08:39:07.937058 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" event={"ID":"a638c468-010c-4da3-ad62-26f5f2bbdbb9","Type":"ContainerStarted","Data":"e9d1c009ab8bfc1b5d9e5ffc8e765a713719b93c5edffb019bac7a72776dcb9e"}
Mar 20 08:39:07.938919 master-0 kubenswrapper[7465]: I0320 08:39:07.938422 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:39:07.943607 master-0 kubenswrapper[7465]: I0320 08:39:07.943428 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" event={"ID":"a9a9ecf2-c476-4962-8333-21f242dbcb89","Type":"ContainerStarted","Data":"c11d7390f44d6be5bc3fcba45513f31241f9e0af79908ed280e1f11a5db8a9db"}
Mar 20 08:39:07.943607 master-0 kubenswrapper[7465]: I0320 08:39:07.943484 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" event={"ID":"a9a9ecf2-c476-4962-8333-21f242dbcb89","Type":"ContainerStarted","Data":"0885038f20b8301926b3fbf1704c402c0cf75d3a9c64af977d4466bc2f1fe1b6"}
Mar 20 08:39:07.946531 master-0 kubenswrapper[7465]: I0320 08:39:07.944547 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:39:07.946531 master-0 kubenswrapper[7465]: I0320 08:39:07.944610 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:39:07.948878 master-0 kubenswrapper[7465]: I0320 08:39:07.948836 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:39:07.976410 master-0 kubenswrapper[7465]: I0320 08:39:07.976328 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" podStartSLOduration=107.97630858 podStartE2EDuration="1m47.97630858s" podCreationTimestamp="2026-03-20 08:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:39:07.97392516 +0000 UTC m=+173.617240670" watchObservedRunningTime="2026-03-20 08:39:07.97630858 +0000 UTC m=+173.619624070"
Mar 20 08:39:07.977620 master-0 kubenswrapper[7465]: I0320 08:39:07.977575 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:39:07.977620 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:39:07.977620 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:39:07.977620 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:39:07.977807 master-0 kubenswrapper[7465]: I0320 08:39:07.977626 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:39:08.021928 master-0 kubenswrapper[7465]: I0320 08:39:08.021850 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" podStartSLOduration=108.021830927 podStartE2EDuration="1m48.021830927s" podCreationTimestamp="2026-03-20 08:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:39:08.019810158 +0000 UTC m=+173.663125648" watchObservedRunningTime="2026-03-20 08:39:08.021830927 +0000 UTC m=+173.665146417"
Mar 20 08:39:08.094611 master-0 kubenswrapper[7465]: I0320 08:39:08.094556 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log"
Mar 20 08:39:08.290659 master-0 kubenswrapper[7465]: I0320 08:39:08.290508 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_4490a747-da2d-4f1a-8986-bc2c1c58424b/installer/0.log"
Mar 20 08:39:08.498281 master-0 kubenswrapper[7465]: I0320 08:39:08.498210 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-8b68b9d9b-c6vkz_c1854ea4-c8e2-4289-84b6-1f18b2ac684f/kube-apiserver-operator/0.log"
Mar 20 08:39:08.693600 master-0 kubenswrapper[7465]: I0320 08:39:08.693538 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-8b68b9d9b-c6vkz_c1854ea4-c8e2-4289-84b6-1f18b2ac684f/kube-apiserver-operator/1.log"
Mar 20 08:39:08.775099 master-0 kubenswrapper[7465]: I0320 08:39:08.775024 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-86g9n"]
Mar 20 08:39:08.775481 master-0 kubenswrapper[7465]: I0320 08:39:08.775408 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-86g9n" podUID="55cefc56-c008-45c1-a6ac-b1d3c8778c7b" containerName="registry-server" containerID="cri-o://068275d1cd841a9bf5f79cb0540e343d651716d7934e461f10b2346a851f5cbb" gracePeriod=2
Mar 20 08:39:08.891150 master-0 kubenswrapper[7465]: I0320 08:39:08.891084 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_49fac1b46a11e49501805e891baae4a9/setup/0.log"
Mar 20 08:39:08.960426 master-0 kubenswrapper[7465]: I0320 08:39:08.960252 7465 generic.go:334] "Generic (PLEG): container finished" podID="55cefc56-c008-45c1-a6ac-b1d3c8778c7b" containerID="068275d1cd841a9bf5f79cb0540e343d651716d7934e461f10b2346a851f5cbb" exitCode=0
Mar 20 08:39:08.960626 master-0 kubenswrapper[7465]: I0320 08:39:08.960466 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86g9n" event={"ID":"55cefc56-c008-45c1-a6ac-b1d3c8778c7b","Type":"ContainerDied","Data":"068275d1cd841a9bf5f79cb0540e343d651716d7934e461f10b2346a851f5cbb"}
Mar 20 08:39:08.968561 master-0 kubenswrapper[7465]: I0320 08:39:08.968243 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zspn5"]
Mar 20 08:39:08.968561 master-0 kubenswrapper[7465]: I0320 08:39:08.968561 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zspn5" podUID="c649d964-ba32-44e4-a3cb-06a285972d97" containerName="registry-server" containerID="cri-o://12530f8cc7a96159435ca72d952505623c195a3951b56e70dfb55c96d0ec3d09" gracePeriod=2
Mar 20 08:39:08.987246 master-0 kubenswrapper[7465]: I0320 08:39:08.982225 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:39:08.987246 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:39:08.987246 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:39:08.987246 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:39:08.987246 master-0 kubenswrapper[7465]: I0320 08:39:08.982299 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:39:09.090001 master-0 kubenswrapper[7465]: I0320 08:39:09.088224 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_49fac1b46a11e49501805e891baae4a9/kube-apiserver/0.log"
Mar 20 08:39:09.196590 master-0 kubenswrapper[7465]: I0320 08:39:09.196539 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cc955"]
Mar 20 08:39:09.197994 master-0 kubenswrapper[7465]: I0320 08:39:09.197973 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:09.203953 master-0 kubenswrapper[7465]: I0320 08:39:09.203700 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-wq6zb"
Mar 20 08:39:09.218299 master-0 kubenswrapper[7465]: I0320 08:39:09.216975 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cc955"]
Mar 20 08:39:09.299592 master-0 kubenswrapper[7465]: I0320 08:39:09.298419 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-86g9n"
Mar 20 08:39:09.309539 master-0 kubenswrapper[7465]: I0320 08:39:09.309483 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_49fac1b46a11e49501805e891baae4a9/kube-apiserver-insecure-readyz/0.log"
Mar 20 08:39:09.391678 master-0 kubenswrapper[7465]: I0320 08:39:09.390406 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758"]
Mar 20 08:39:09.391678 master-0 kubenswrapper[7465]: I0320 08:39:09.390769 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" podUID="4117cb69-45cf-4966-82b6-a31340c7db11" containerName="kube-rbac-proxy" containerID="cri-o://3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1" gracePeriod=30
Mar 20 08:39:09.391678 master-0 kubenswrapper[7465]: I0320 08:39:09.391316 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" podUID="4117cb69-45cf-4966-82b6-a31340c7db11" containerName="machine-approver-controller" containerID="cri-o://49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166" gracePeriod=30
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: I0320 08:39:09.393112 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29"]
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: I0320 08:39:09.393388 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" podUID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerName="cluster-cloud-controller-manager" containerID="cri-o://aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496" gracePeriod=30
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: I0320 08:39:09.396546 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" podUID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerName="config-sync-controllers" containerID="cri-o://8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9" gracePeriod=30
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: I0320 08:39:09.396687 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" podUID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerName="kube-rbac-proxy" containerID="cri-o://c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295" gracePeriod=30
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: I0320 08:39:09.399424 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf485\" (UniqueName: \"kubernetes.io/projected/c0a17669-a122-44aa-bdda-581bf1fc4649-kube-api-access-xf485\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: I0320 08:39:09.399637 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a17669-a122-44aa-bdda-581bf1fc4649-utilities\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: I0320 08:39:09.399719 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a17669-a122-44aa-bdda-581bf1fc4649-catalog-content\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: I0320 08:39:09.408946 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dtqgc"]
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: E0320 08:39:09.409216 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cefc56-c008-45c1-a6ac-b1d3c8778c7b" containerName="extract-content"
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: I0320 08:39:09.409231 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cefc56-c008-45c1-a6ac-b1d3c8778c7b" containerName="extract-content"
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: E0320 08:39:09.409251 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cefc56-c008-45c1-a6ac-b1d3c8778c7b" containerName="extract-utilities"
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: I0320 08:39:09.409259 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cefc56-c008-45c1-a6ac-b1d3c8778c7b" containerName="extract-utilities"
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: E0320 08:39:09.409273 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cefc56-c008-45c1-a6ac-b1d3c8778c7b" containerName="registry-server"
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: I0320 08:39:09.409279 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cefc56-c008-45c1-a6ac-b1d3c8778c7b" containerName="registry-server"
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: I0320 08:39:09.409496 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cefc56-c008-45c1-a6ac-b1d3c8778c7b" containerName="registry-server"
Mar 20 08:39:09.415224 master-0 kubenswrapper[7465]: I0320 08:39:09.410355 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:09.416073 master-0 kubenswrapper[7465]: I0320 08:39:09.415609 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-vd4cn"
Mar 20 08:39:09.473791 master-0 kubenswrapper[7465]: I0320 08:39:09.472033 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dtqgc"]
Mar 20 08:39:09.495268 master-0 kubenswrapper[7465]: I0320 08:39:09.491750 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_e1d21f11-7386-4a04-a82e-5a03f3602a3b/installer/0.log"
Mar 20 08:39:09.501224 master-0 kubenswrapper[7465]: I0320 08:39:09.501159 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxf7x\" (UniqueName: \"kubernetes.io/projected/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-kube-api-access-gxf7x\") pod \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\" (UID: \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\") "
Mar 20 08:39:09.501832 master-0 kubenswrapper[7465]: I0320 08:39:09.501245 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-utilities\") pod \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\" (UID: \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\") "
Mar 20 08:39:09.501832 master-0 kubenswrapper[7465]: I0320 08:39:09.501330 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-catalog-content\") pod \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\" (UID: \"55cefc56-c008-45c1-a6ac-b1d3c8778c7b\") "
Mar 20 08:39:09.501832 master-0 kubenswrapper[7465]: I0320 08:39:09.501672 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a17669-a122-44aa-bdda-581bf1fc4649-catalog-content\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:09.501832 master-0 kubenswrapper[7465]: I0320 08:39:09.501766 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b639e578-628e-404d-b759-8b6e84e771d9-catalog-content\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:09.501832 master-0 kubenswrapper[7465]: I0320 08:39:09.501825 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf485\" (UniqueName: \"kubernetes.io/projected/c0a17669-a122-44aa-bdda-581bf1fc4649-kube-api-access-xf485\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:09.501994 master-0 kubenswrapper[7465]: I0320 08:39:09.501874 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b639e578-628e-404d-b759-8b6e84e771d9-utilities\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:09.501994 master-0 kubenswrapper[7465]: I0320 08:39:09.501915 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a17669-a122-44aa-bdda-581bf1fc4649-utilities\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:09.501994 master-0 kubenswrapper[7465]: I0320 08:39:09.501947 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zp8f\" (UniqueName: \"kubernetes.io/projected/b639e578-628e-404d-b759-8b6e84e771d9-kube-api-access-9zp8f\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:09.502906 master-0 kubenswrapper[7465]: I0320 08:39:09.502867 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-utilities" (OuterVolumeSpecName: "utilities") pod "55cefc56-c008-45c1-a6ac-b1d3c8778c7b" (UID: "55cefc56-c008-45c1-a6ac-b1d3c8778c7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:39:09.503891 master-0 kubenswrapper[7465]: I0320 08:39:09.503741 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a17669-a122-44aa-bdda-581bf1fc4649-catalog-content\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:09.503891 master-0 kubenswrapper[7465]: I0320 08:39:09.503832 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a17669-a122-44aa-bdda-581bf1fc4649-utilities\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:09.510927 master-0 kubenswrapper[7465]: I0320 08:39:09.509244 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-kube-api-access-gxf7x" (OuterVolumeSpecName: "kube-api-access-gxf7x") pod "55cefc56-c008-45c1-a6ac-b1d3c8778c7b" (UID: "55cefc56-c008-45c1-a6ac-b1d3c8778c7b"). InnerVolumeSpecName "kube-api-access-gxf7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:39:09.528666 master-0 kubenswrapper[7465]: I0320 08:39:09.528589 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf485\" (UniqueName: \"kubernetes.io/projected/c0a17669-a122-44aa-bdda-581bf1fc4649-kube-api-access-xf485\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:09.567089 master-0 kubenswrapper[7465]: I0320 08:39:09.566489 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:09.591220 master-0 kubenswrapper[7465]: I0320 08:39:09.591125 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55cefc56-c008-45c1-a6ac-b1d3c8778c7b" (UID: "55cefc56-c008-45c1-a6ac-b1d3c8778c7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:39:09.591985 master-0 kubenswrapper[7465]: I0320 08:39:09.591945 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zspn5"
Mar 20 08:39:09.598822 master-0 kubenswrapper[7465]: I0320 08:39:09.598775 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29"
Mar 20 08:39:09.603246 master-0 kubenswrapper[7465]: I0320 08:39:09.603200 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b639e578-628e-404d-b759-8b6e84e771d9-utilities\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:09.603396 master-0 kubenswrapper[7465]: I0320 08:39:09.603263 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zp8f\" (UniqueName: \"kubernetes.io/projected/b639e578-628e-404d-b759-8b6e84e771d9-kube-api-access-9zp8f\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:09.603799 master-0 kubenswrapper[7465]: I0320 08:39:09.603501 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b639e578-628e-404d-b759-8b6e84e771d9-catalog-content\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:09.605859 master-0 kubenswrapper[7465]: I0320 08:39:09.603875 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b639e578-628e-404d-b759-8b6e84e771d9-utilities\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:09.605859 master-0 kubenswrapper[7465]: I0320 08:39:09.603982 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxf7x\" (UniqueName: \"kubernetes.io/projected/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-kube-api-access-gxf7x\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:09.605859 master-0 kubenswrapper[7465]: I0320 08:39:09.604062 7465 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-utilities\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:09.605859 master-0 kubenswrapper[7465]: I0320 08:39:09.604139 7465 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cefc56-c008-45c1-a6ac-b1d3c8778c7b-catalog-content\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:09.605859 master-0 kubenswrapper[7465]: I0320 08:39:09.604560 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b639e578-628e-404d-b759-8b6e84e771d9-catalog-content\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:09.610032 master-0 kubenswrapper[7465]: I0320 08:39:09.609712 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758"
Mar 20 08:39:09.644479 master-0 kubenswrapper[7465]: I0320 08:39:09.644374 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zp8f\" (UniqueName: \"kubernetes.io/projected/b639e578-628e-404d-b759-8b6e84e771d9-kube-api-access-9zp8f\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:09.700376 master-0 kubenswrapper[7465]: I0320 08:39:09.700316 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_8b1c7a56-5d00-468a-bb8d-dbaf8e854951/installer/0.log"
Mar 20 08:39:09.706046 master-0 kubenswrapper[7465]: I0320 08:39:09.705977 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c649d964-ba32-44e4-a3cb-06a285972d97-catalog-content\") pod \"c649d964-ba32-44e4-a3cb-06a285972d97\" (UID: \"c649d964-ba32-44e4-a3cb-06a285972d97\") "
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.706069 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-697rh\" (UniqueName: \"kubernetes.io/projected/c649d964-ba32-44e4-a3cb-06a285972d97-kube-api-access-697rh\") pod \"c649d964-ba32-44e4-a3cb-06a285972d97\" (UID: \"c649d964-ba32-44e4-a3cb-06a285972d97\") "
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.706106 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/75e80b57-a0f3-4f6a-a022-457944c8f59b-host-etc-kube\") pod \"75e80b57-a0f3-4f6a-a022-457944c8f59b\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") "
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.706149 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvf9w\" (UniqueName: \"kubernetes.io/projected/75e80b57-a0f3-4f6a-a022-457944c8f59b-kube-api-access-dvf9w\") pod \"75e80b57-a0f3-4f6a-a022-457944c8f59b\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") "
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.706236 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75e80b57-a0f3-4f6a-a022-457944c8f59b-auth-proxy-config\") pod \"75e80b57-a0f3-4f6a-a022-457944c8f59b\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") "
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.706288 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/75e80b57-a0f3-4f6a-a022-457944c8f59b-images\") pod \"75e80b57-a0f3-4f6a-a022-457944c8f59b\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") "
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.706386 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75e80b57-a0f3-4f6a-a022-457944c8f59b-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "75e80b57-a0f3-4f6a-a022-457944c8f59b" (UID: "75e80b57-a0f3-4f6a-a022-457944c8f59b"). InnerVolumeSpecName "host-etc-kube". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.706428 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c649d964-ba32-44e4-a3cb-06a285972d97-utilities\") pod \"c649d964-ba32-44e4-a3cb-06a285972d97\" (UID: \"c649d964-ba32-44e4-a3cb-06a285972d97\") "
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.706529 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4117cb69-45cf-4966-82b6-a31340c7db11-auth-proxy-config\") pod \"4117cb69-45cf-4966-82b6-a31340c7db11\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") "
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.706564 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/75e80b57-a0f3-4f6a-a022-457944c8f59b-cloud-controller-manager-operator-tls\") pod \"75e80b57-a0f3-4f6a-a022-457944c8f59b\" (UID: \"75e80b57-a0f3-4f6a-a022-457944c8f59b\") "
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.706605 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4117cb69-45cf-4966-82b6-a31340c7db11-machine-approver-tls\") pod \"4117cb69-45cf-4966-82b6-a31340c7db11\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") "
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.706663 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bwxq\" (UniqueName: \"kubernetes.io/projected/4117cb69-45cf-4966-82b6-a31340c7db11-kube-api-access-9bwxq\") pod \"4117cb69-45cf-4966-82b6-a31340c7db11\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") "
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.707456 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4117cb69-45cf-4966-82b6-a31340c7db11-config\") pod \"4117cb69-45cf-4966-82b6-a31340c7db11\" (UID: \"4117cb69-45cf-4966-82b6-a31340c7db11\") "
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.707479 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e80b57-a0f3-4f6a-a022-457944c8f59b-images" (OuterVolumeSpecName: "images") pod "75e80b57-a0f3-4f6a-a022-457944c8f59b" (UID: "75e80b57-a0f3-4f6a-a022-457944c8f59b"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.708204 7465 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/75e80b57-a0f3-4f6a-a022-457944c8f59b-host-etc-kube\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.708230 7465 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/75e80b57-a0f3-4f6a-a022-457944c8f59b-images\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.709338 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e80b57-a0f3-4f6a-a022-457944c8f59b-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "75e80b57-a0f3-4f6a-a022-457944c8f59b" (UID: "75e80b57-a0f3-4f6a-a022-457944c8f59b"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.709609 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4117cb69-45cf-4966-82b6-a31340c7db11-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "4117cb69-45cf-4966-82b6-a31340c7db11" (UID: "4117cb69-45cf-4966-82b6-a31340c7db11"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.710154 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c649d964-ba32-44e4-a3cb-06a285972d97-utilities" (OuterVolumeSpecName: "utilities") pod "c649d964-ba32-44e4-a3cb-06a285972d97" (UID: "c649d964-ba32-44e4-a3cb-06a285972d97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.710380 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4117cb69-45cf-4966-82b6-a31340c7db11-config" (OuterVolumeSpecName: "config") pod "4117cb69-45cf-4966-82b6-a31340c7db11" (UID: "4117cb69-45cf-4966-82b6-a31340c7db11"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.714149 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4117cb69-45cf-4966-82b6-a31340c7db11-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "4117cb69-45cf-4966-82b6-a31340c7db11" (UID: "4117cb69-45cf-4966-82b6-a31340c7db11"). InnerVolumeSpecName "machine-approver-tls".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.714318 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e80b57-a0f3-4f6a-a022-457944c8f59b-kube-api-access-dvf9w" (OuterVolumeSpecName: "kube-api-access-dvf9w") pod "75e80b57-a0f3-4f6a-a022-457944c8f59b" (UID: "75e80b57-a0f3-4f6a-a022-457944c8f59b"). InnerVolumeSpecName "kube-api-access-dvf9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:39:09.714509 master-0 kubenswrapper[7465]: I0320 08:39:09.714341 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e80b57-a0f3-4f6a-a022-457944c8f59b-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "75e80b57-a0f3-4f6a-a022-457944c8f59b" (UID: "75e80b57-a0f3-4f6a-a022-457944c8f59b"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:39:09.717238 master-0 kubenswrapper[7465]: I0320 08:39:09.715691 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4117cb69-45cf-4966-82b6-a31340c7db11-kube-api-access-9bwxq" (OuterVolumeSpecName: "kube-api-access-9bwxq") pod "4117cb69-45cf-4966-82b6-a31340c7db11" (UID: "4117cb69-45cf-4966-82b6-a31340c7db11"). InnerVolumeSpecName "kube-api-access-9bwxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:39:09.717238 master-0 kubenswrapper[7465]: I0320 08:39:09.717145 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c649d964-ba32-44e4-a3cb-06a285972d97-kube-api-access-697rh" (OuterVolumeSpecName: "kube-api-access-697rh") pod "c649d964-ba32-44e4-a3cb-06a285972d97" (UID: "c649d964-ba32-44e4-a3cb-06a285972d97"). InnerVolumeSpecName "kube-api-access-697rh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:39:09.809962 master-0 kubenswrapper[7465]: I0320 08:39:09.808839 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bwxq\" (UniqueName: \"kubernetes.io/projected/4117cb69-45cf-4966-82b6-a31340c7db11-kube-api-access-9bwxq\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:09.809962 master-0 kubenswrapper[7465]: I0320 08:39:09.808894 7465 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4117cb69-45cf-4966-82b6-a31340c7db11-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:09.809962 master-0 kubenswrapper[7465]: I0320 08:39:09.808905 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-697rh\" (UniqueName: \"kubernetes.io/projected/c649d964-ba32-44e4-a3cb-06a285972d97-kube-api-access-697rh\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:09.809962 master-0 kubenswrapper[7465]: I0320 08:39:09.808915 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvf9w\" (UniqueName: \"kubernetes.io/projected/75e80b57-a0f3-4f6a-a022-457944c8f59b-kube-api-access-dvf9w\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:09.809962 master-0 kubenswrapper[7465]: I0320 08:39:09.808927 7465 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75e80b57-a0f3-4f6a-a022-457944c8f59b-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:09.809962 master-0 kubenswrapper[7465]: I0320 08:39:09.808940 7465 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c649d964-ba32-44e4-a3cb-06a285972d97-utilities\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:09.809962 master-0 kubenswrapper[7465]: I0320 08:39:09.808949 7465 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/4117cb69-45cf-4966-82b6-a31340c7db11-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:09.809962 master-0 kubenswrapper[7465]: I0320 08:39:09.808959 7465 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/75e80b57-a0f3-4f6a-a022-457944c8f59b-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:09.809962 master-0 kubenswrapper[7465]: I0320 08:39:09.808972 7465 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4117cb69-45cf-4966-82b6-a31340c7db11-machine-approver-tls\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:09.811745 master-0 kubenswrapper[7465]: I0320 08:39:09.811575 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c649d964-ba32-44e4-a3cb-06a285972d97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c649d964-ba32-44e4-a3cb-06a285972d97" (UID: "c649d964-ba32-44e4-a3cb-06a285972d97"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:39:09.881955 master-0 kubenswrapper[7465]: I0320 08:39:09.881147 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dtqgc" Mar 20 08:39:09.887546 master-0 kubenswrapper[7465]: I0320 08:39:09.887500 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-qcpb4_fa759777-de22-4440-a3d3-ad429a3b8e7b/kube-controller-manager-operator/0.log" Mar 20 08:39:09.910769 master-0 kubenswrapper[7465]: I0320 08:39:09.910684 7465 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c649d964-ba32-44e4-a3cb-06a285972d97-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.938484 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-5m857"] Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: E0320 08:39:09.939464 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerName="config-sync-controllers" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.939481 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerName="config-sync-controllers" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: E0320 08:39:09.939498 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4117cb69-45cf-4966-82b6-a31340c7db11" containerName="machine-approver-controller" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.939506 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="4117cb69-45cf-4966-82b6-a31340c7db11" containerName="machine-approver-controller" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: E0320 08:39:09.939517 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c649d964-ba32-44e4-a3cb-06a285972d97" containerName="extract-utilities" Mar 20 08:39:09.941881 master-0 
kubenswrapper[7465]: I0320 08:39:09.939554 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="c649d964-ba32-44e4-a3cb-06a285972d97" containerName="extract-utilities" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: E0320 08:39:09.939572 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c649d964-ba32-44e4-a3cb-06a285972d97" containerName="registry-server" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.939581 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="c649d964-ba32-44e4-a3cb-06a285972d97" containerName="registry-server" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: E0320 08:39:09.939592 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerName="kube-rbac-proxy" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.939600 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerName="kube-rbac-proxy" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: E0320 08:39:09.939642 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4117cb69-45cf-4966-82b6-a31340c7db11" containerName="kube-rbac-proxy" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.939652 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="4117cb69-45cf-4966-82b6-a31340c7db11" containerName="kube-rbac-proxy" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: E0320 08:39:09.939660 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerName="cluster-cloud-controller-manager" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.939668 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerName="cluster-cloud-controller-manager" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: E0320 08:39:09.939679 7465 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c649d964-ba32-44e4-a3cb-06a285972d97" containerName="extract-content" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.939684 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="c649d964-ba32-44e4-a3cb-06a285972d97" containerName="extract-content" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.939809 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="4117cb69-45cf-4966-82b6-a31340c7db11" containerName="kube-rbac-proxy" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.939825 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerName="cluster-cloud-controller-manager" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.939837 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="c649d964-ba32-44e4-a3cb-06a285972d97" containerName="registry-server" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.939848 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerName="config-sync-controllers" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.939862 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="4117cb69-45cf-4966-82b6-a31340c7db11" containerName="machine-approver-controller" Mar 20 08:39:09.941881 master-0 kubenswrapper[7465]: I0320 08:39:09.939870 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerName="kube-rbac-proxy" Mar 20 08:39:09.942816 master-0 kubenswrapper[7465]: I0320 08:39:09.942400 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:09.947949 master-0 kubenswrapper[7465]: I0320 08:39:09.945840 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-5m857"] Mar 20 08:39:09.949437 master-0 kubenswrapper[7465]: I0320 08:39:09.949174 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-qvkkb" Mar 20 08:39:09.950777 master-0 kubenswrapper[7465]: I0320 08:39:09.949634 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 20 08:39:09.950777 master-0 kubenswrapper[7465]: I0320 08:39:09.949765 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 20 08:39:09.950777 master-0 kubenswrapper[7465]: I0320 08:39:09.950418 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 20 08:39:09.983491 master-0 kubenswrapper[7465]: I0320 08:39:09.983418 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:09.983491 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:09.983491 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:09.983491 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:09.984492 master-0 kubenswrapper[7465]: I0320 08:39:09.983503 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:09.984492 master-0 
kubenswrapper[7465]: I0320 08:39:09.983961 7465 generic.go:334] "Generic (PLEG): container finished" podID="c649d964-ba32-44e4-a3cb-06a285972d97" containerID="12530f8cc7a96159435ca72d952505623c195a3951b56e70dfb55c96d0ec3d09" exitCode=0 Mar 20 08:39:09.984492 master-0 kubenswrapper[7465]: I0320 08:39:09.984133 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zspn5" Mar 20 08:39:09.988117 master-0 kubenswrapper[7465]: I0320 08:39:09.984684 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zspn5" event={"ID":"c649d964-ba32-44e4-a3cb-06a285972d97","Type":"ContainerDied","Data":"12530f8cc7a96159435ca72d952505623c195a3951b56e70dfb55c96d0ec3d09"} Mar 20 08:39:09.988117 master-0 kubenswrapper[7465]: I0320 08:39:09.984739 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zspn5" event={"ID":"c649d964-ba32-44e4-a3cb-06a285972d97","Type":"ContainerDied","Data":"72ef06cb8ef1d212f0ed5b75f9026683a6bf969cd551915b43925cfb6e211dc1"} Mar 20 08:39:09.988117 master-0 kubenswrapper[7465]: I0320 08:39:09.984768 7465 scope.go:117] "RemoveContainer" containerID="12530f8cc7a96159435ca72d952505623c195a3951b56e70dfb55c96d0ec3d09" Mar 20 08:39:09.997884 master-0 kubenswrapper[7465]: I0320 08:39:09.996111 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86g9n" event={"ID":"55cefc56-c008-45c1-a6ac-b1d3c8778c7b","Type":"ContainerDied","Data":"07e40aa377bfe7a7fc8825ffe8c45483249a93c88f052dadd76fb2c790f314d3"} Mar 20 08:39:09.997884 master-0 kubenswrapper[7465]: I0320 08:39:09.996232 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-86g9n" Mar 20 08:39:10.012275 master-0 kubenswrapper[7465]: I0320 08:39:10.005847 7465 scope.go:117] "RemoveContainer" containerID="8d44586dc2f9d77b931bacec095df15c2bd3b3ec3048ced5896c388e58dacb0c" Mar 20 08:39:10.012275 master-0 kubenswrapper[7465]: I0320 08:39:10.011069 7465 generic.go:334] "Generic (PLEG): container finished" podID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerID="c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295" exitCode=0 Mar 20 08:39:10.012275 master-0 kubenswrapper[7465]: I0320 08:39:10.011116 7465 generic.go:334] "Generic (PLEG): container finished" podID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerID="8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9" exitCode=0 Mar 20 08:39:10.012275 master-0 kubenswrapper[7465]: I0320 08:39:10.011127 7465 generic.go:334] "Generic (PLEG): container finished" podID="75e80b57-a0f3-4f6a-a022-457944c8f59b" containerID="aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496" exitCode=0 Mar 20 08:39:10.012275 master-0 kubenswrapper[7465]: I0320 08:39:10.011219 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" event={"ID":"75e80b57-a0f3-4f6a-a022-457944c8f59b","Type":"ContainerDied","Data":"c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295"} Mar 20 08:39:10.012275 master-0 kubenswrapper[7465]: I0320 08:39:10.011279 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" event={"ID":"75e80b57-a0f3-4f6a-a022-457944c8f59b","Type":"ContainerDied","Data":"8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9"} Mar 20 08:39:10.012275 master-0 kubenswrapper[7465]: I0320 08:39:10.011293 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" event={"ID":"75e80b57-a0f3-4f6a-a022-457944c8f59b","Type":"ContainerDied","Data":"aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496"} Mar 20 08:39:10.012275 master-0 kubenswrapper[7465]: I0320 08:39:10.011312 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" event={"ID":"75e80b57-a0f3-4f6a-a022-457944c8f59b","Type":"ContainerDied","Data":"e6c1d5a99612e6d35505b3e74ccfbf34d01a1eaff1e58e1eab8e47b44ad28c82"} Mar 20 08:39:10.012275 master-0 kubenswrapper[7465]: I0320 08:39:10.011961 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29" Mar 20 08:39:10.159472 master-0 kubenswrapper[7465]: I0320 08:39:10.159303 7465 scope.go:117] "RemoveContainer" containerID="413513ca3ab43ee903b1af9b07a1e3a6753fe8f1a044a9c14cd7c2703547a117" Mar 20 08:39:10.163709 master-0 kubenswrapper[7465]: I0320 08:39:10.160737 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:10.163709 master-0 kubenswrapper[7465]: I0320 08:39:10.160812 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnm6c\" (UniqueName: \"kubernetes.io/projected/f91d1788-027d-432b-be33-ca952a95046a-kube-api-access-lnm6c\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " 
pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:10.163709 master-0 kubenswrapper[7465]: I0320 08:39:10.160837 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f91d1788-027d-432b-be33-ca952a95046a-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:10.163709 master-0 kubenswrapper[7465]: I0320 08:39:10.160861 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:10.182352 master-0 kubenswrapper[7465]: I0320 08:39:10.179538 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zspn5"] Mar 20 08:39:10.190395 master-0 kubenswrapper[7465]: I0320 08:39:10.185690 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-qcpb4_fa759777-de22-4440-a3d3-ad429a3b8e7b/kube-controller-manager-operator/1.log" Mar 20 08:39:10.200366 master-0 kubenswrapper[7465]: I0320 08:39:10.197915 7465 generic.go:334] "Generic (PLEG): container finished" podID="4117cb69-45cf-4966-82b6-a31340c7db11" containerID="49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166" exitCode=0 Mar 20 08:39:10.200366 master-0 kubenswrapper[7465]: I0320 08:39:10.197964 7465 generic.go:334] "Generic (PLEG): container finished" podID="4117cb69-45cf-4966-82b6-a31340c7db11" containerID="3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1" 
exitCode=0 Mar 20 08:39:10.200366 master-0 kubenswrapper[7465]: I0320 08:39:10.198932 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" Mar 20 08:39:10.200366 master-0 kubenswrapper[7465]: I0320 08:39:10.199395 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zspn5"] Mar 20 08:39:10.200366 master-0 kubenswrapper[7465]: I0320 08:39:10.199438 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" event={"ID":"4117cb69-45cf-4966-82b6-a31340c7db11","Type":"ContainerDied","Data":"49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166"} Mar 20 08:39:10.200366 master-0 kubenswrapper[7465]: I0320 08:39:10.199462 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" event={"ID":"4117cb69-45cf-4966-82b6-a31340c7db11","Type":"ContainerDied","Data":"3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1"} Mar 20 08:39:10.200366 master-0 kubenswrapper[7465]: I0320 08:39:10.199473 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758" event={"ID":"4117cb69-45cf-4966-82b6-a31340c7db11","Type":"ContainerDied","Data":"87748b853faf38704aba6691ffdb72f7909a87d3516c0b11cedf1d4b870a3219"} Mar 20 08:39:10.245060 master-0 kubenswrapper[7465]: I0320 08:39:10.244982 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cc955"] Mar 20 08:39:10.260154 master-0 kubenswrapper[7465]: I0320 08:39:10.260019 7465 scope.go:117] "RemoveContainer" containerID="12530f8cc7a96159435ca72d952505623c195a3951b56e70dfb55c96d0ec3d09" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: I0320 08:39:10.265370 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29"] Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: E0320 08:39:10.268047 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12530f8cc7a96159435ca72d952505623c195a3951b56e70dfb55c96d0ec3d09\": container with ID starting with 12530f8cc7a96159435ca72d952505623c195a3951b56e70dfb55c96d0ec3d09 not found: ID does not exist" containerID="12530f8cc7a96159435ca72d952505623c195a3951b56e70dfb55c96d0ec3d09" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: I0320 08:39:10.268111 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12530f8cc7a96159435ca72d952505623c195a3951b56e70dfb55c96d0ec3d09"} err="failed to get container status \"12530f8cc7a96159435ca72d952505623c195a3951b56e70dfb55c96d0ec3d09\": rpc error: code = NotFound desc = could not find container \"12530f8cc7a96159435ca72d952505623c195a3951b56e70dfb55c96d0ec3d09\": container with ID starting with 12530f8cc7a96159435ca72d952505623c195a3951b56e70dfb55c96d0ec3d09 not found: ID does not exist" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: I0320 08:39:10.268157 7465 scope.go:117] "RemoveContainer" containerID="8d44586dc2f9d77b931bacec095df15c2bd3b3ec3048ced5896c388e58dacb0c" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: E0320 08:39:10.271398 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d44586dc2f9d77b931bacec095df15c2bd3b3ec3048ced5896c388e58dacb0c\": container with ID starting with 8d44586dc2f9d77b931bacec095df15c2bd3b3ec3048ced5896c388e58dacb0c not found: ID does not exist" containerID="8d44586dc2f9d77b931bacec095df15c2bd3b3ec3048ced5896c388e58dacb0c" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: I0320 08:39:10.271456 7465 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"8d44586dc2f9d77b931bacec095df15c2bd3b3ec3048ced5896c388e58dacb0c"} err="failed to get container status \"8d44586dc2f9d77b931bacec095df15c2bd3b3ec3048ced5896c388e58dacb0c\": rpc error: code = NotFound desc = could not find container \"8d44586dc2f9d77b931bacec095df15c2bd3b3ec3048ced5896c388e58dacb0c\": container with ID starting with 8d44586dc2f9d77b931bacec095df15c2bd3b3ec3048ced5896c388e58dacb0c not found: ID does not exist" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: I0320 08:39:10.271493 7465 scope.go:117] "RemoveContainer" containerID="413513ca3ab43ee903b1af9b07a1e3a6753fe8f1a044a9c14cd7c2703547a117" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: I0320 08:39:10.272124 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnm6c\" (UniqueName: \"kubernetes.io/projected/f91d1788-027d-432b-be33-ca952a95046a-kube-api-access-lnm6c\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: I0320 08:39:10.272170 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f91d1788-027d-432b-be33-ca952a95046a-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: I0320 08:39:10.272229 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:10.282375 
master-0 kubenswrapper[7465]: I0320 08:39:10.273739 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: E0320 08:39:10.275308 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"413513ca3ab43ee903b1af9b07a1e3a6753fe8f1a044a9c14cd7c2703547a117\": container with ID starting with 413513ca3ab43ee903b1af9b07a1e3a6753fe8f1a044a9c14cd7c2703547a117 not found: ID does not exist" containerID="413513ca3ab43ee903b1af9b07a1e3a6753fe8f1a044a9c14cd7c2703547a117" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: I0320 08:39:10.275339 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"413513ca3ab43ee903b1af9b07a1e3a6753fe8f1a044a9c14cd7c2703547a117"} err="failed to get container status \"413513ca3ab43ee903b1af9b07a1e3a6753fe8f1a044a9c14cd7c2703547a117\": rpc error: code = NotFound desc = could not find container \"413513ca3ab43ee903b1af9b07a1e3a6753fe8f1a044a9c14cd7c2703547a117\": container with ID starting with 413513ca3ab43ee903b1af9b07a1e3a6753fe8f1a044a9c14cd7c2703547a117 not found: ID does not exist" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: I0320 08:39:10.275366 7465 scope.go:117] "RemoveContainer" containerID="068275d1cd841a9bf5f79cb0540e343d651716d7934e461f10b2346a851f5cbb" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: I0320 08:39:10.277502 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f91d1788-027d-432b-be33-ca952a95046a-metrics-client-ca\") pod 
\"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:10.282375 master-0 kubenswrapper[7465]: I0320 08:39:10.280451 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-xhq29"] Mar 20 08:39:10.283428 master-0 kubenswrapper[7465]: E0320 08:39:10.282527 7465 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Mar 20 08:39:10.283428 master-0 kubenswrapper[7465]: E0320 08:39:10.282601 7465 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-tls podName:f91d1788-027d-432b-be33-ca952a95046a nodeName:}" failed. No retries permitted until 2026-03-20 08:39:10.782575533 +0000 UTC m=+176.425891023 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-tls") pod "prometheus-operator-6c8df6d4b-5m857" (UID: "f91d1788-027d-432b-be33-ca952a95046a") : secret "prometheus-operator-tls" not found Mar 20 08:39:10.288836 master-0 kubenswrapper[7465]: I0320 08:39:10.288383 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:10.320308 master-0 kubenswrapper[7465]: I0320 08:39:10.317985 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-86g9n"] Mar 20 08:39:10.323329 master-0 
kubenswrapper[7465]: I0320 08:39:10.322073 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_46f265536aba6292ead501bc9b49f327/kube-controller-manager/1.log" Mar 20 08:39:10.333068 master-0 kubenswrapper[7465]: I0320 08:39:10.332980 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-86g9n"] Mar 20 08:39:10.344459 master-0 kubenswrapper[7465]: I0320 08:39:10.344359 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dtqgc"] Mar 20 08:39:10.348093 master-0 kubenswrapper[7465]: I0320 08:39:10.348065 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnm6c\" (UniqueName: \"kubernetes.io/projected/f91d1788-027d-432b-be33-ca952a95046a-kube-api-access-lnm6c\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:10.348293 master-0 kubenswrapper[7465]: I0320 08:39:10.348273 7465 scope.go:117] "RemoveContainer" containerID="826e6ad2813ac1102d49808d3d0e9d3cfd04bbcf8b2b1c66206cb30f43e2de58" Mar 20 08:39:10.349507 master-0 kubenswrapper[7465]: I0320 08:39:10.349451 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758"] Mar 20 08:39:10.363873 master-0 kubenswrapper[7465]: I0320 08:39:10.362267 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr"] Mar 20 08:39:10.364179 master-0 kubenswrapper[7465]: I0320 08:39:10.364073 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.367422 master-0 kubenswrapper[7465]: I0320 08:39:10.366606 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 20 08:39:10.367422 master-0 kubenswrapper[7465]: I0320 08:39:10.366893 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 08:39:10.367422 master-0 kubenswrapper[7465]: I0320 08:39:10.367012 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 20 08:39:10.367422 master-0 kubenswrapper[7465]: I0320 08:39:10.367042 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-d7bxn" Mar 20 08:39:10.367422 master-0 kubenswrapper[7465]: I0320 08:39:10.367340 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 20 08:39:10.367823 master-0 kubenswrapper[7465]: I0320 08:39:10.367565 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 20 08:39:10.371584 master-0 kubenswrapper[7465]: I0320 08:39:10.371538 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-mt758"] Mar 20 08:39:10.383685 master-0 kubenswrapper[7465]: I0320 08:39:10.383625 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw4np\" (UniqueName: \"kubernetes.io/projected/d88ba8e1-ee42-423f-9839-e71cb0265c6c-kube-api-access-lw4np\") pod 
\"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.383941 master-0 kubenswrapper[7465]: I0320 08:39:10.383824 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d88ba8e1-ee42-423f-9839-e71cb0265c6c-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.383977 master-0 kubenswrapper[7465]: I0320 08:39:10.383961 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.384410 master-0 kubenswrapper[7465]: I0320 08:39:10.384352 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/d88ba8e1-ee42-423f-9839-e71cb0265c6c-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.384474 master-0 kubenswrapper[7465]: I0320 08:39:10.384449 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.385596 master-0 kubenswrapper[7465]: I0320 08:39:10.385492 7465 scope.go:117] "RemoveContainer" containerID="eb9e9fd88203cc82f4f42778ae8752420346c9a6317474f729b19d61f2a0b11e" Mar 20 08:39:10.425848 master-0 kubenswrapper[7465]: I0320 08:39:10.425768 7465 scope.go:117] "RemoveContainer" containerID="c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295" Mar 20 08:39:10.429150 master-0 kubenswrapper[7465]: I0320 08:39:10.428039 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j"] Mar 20 08:39:10.442115 master-0 kubenswrapper[7465]: I0320 08:39:10.439102 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.447323 master-0 kubenswrapper[7465]: I0320 08:39:10.447292 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 08:39:10.448201 master-0 kubenswrapper[7465]: I0320 08:39:10.447548 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 08:39:10.448201 master-0 kubenswrapper[7465]: I0320 08:39:10.447707 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-rfqnk" Mar 20 08:39:10.448201 master-0 kubenswrapper[7465]: I0320 08:39:10.447866 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 08:39:10.448201 master-0 kubenswrapper[7465]: I0320 08:39:10.448055 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 08:39:10.448201 master-0 kubenswrapper[7465]: I0320 08:39:10.448171 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 08:39:10.459266 master-0 kubenswrapper[7465]: I0320 08:39:10.459050 7465 scope.go:117] "RemoveContainer" containerID="8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9" Mar 20 08:39:10.486629 master-0 kubenswrapper[7465]: I0320 08:39:10.485945 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/d88ba8e1-ee42-423f-9839-e71cb0265c6c-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.486629 master-0 kubenswrapper[7465]: I0320 08:39:10.486025 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxn2f\" (UniqueName: \"kubernetes.io/projected/ae39c09b-7aef-4615-8ced-0dcad39f23a5-kube-api-access-rxn2f\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.486629 master-0 kubenswrapper[7465]: I0320 08:39:10.486059 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.486629 master-0 kubenswrapper[7465]: I0320 08:39:10.486107 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4np\" (UniqueName: \"kubernetes.io/projected/d88ba8e1-ee42-423f-9839-e71cb0265c6c-kube-api-access-lw4np\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.486629 master-0 kubenswrapper[7465]: I0320 08:39:10.486137 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ae39c09b-7aef-4615-8ced-0dcad39f23a5-machine-approver-tls\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " 
pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.486629 master-0 kubenswrapper[7465]: I0320 08:39:10.486173 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d88ba8e1-ee42-423f-9839-e71cb0265c6c-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.486629 master-0 kubenswrapper[7465]: I0320 08:39:10.486242 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.486629 master-0 kubenswrapper[7465]: I0320 08:39:10.486559 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-auth-proxy-config\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.487122 master-0 kubenswrapper[7465]: I0320 08:39:10.486774 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-config\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.487122 master-0 kubenswrapper[7465]: 
I0320 08:39:10.486842 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d88ba8e1-ee42-423f-9839-e71cb0265c6c-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.487179 master-0 kubenswrapper[7465]: I0320 08:39:10.487160 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.488267 master-0 kubenswrapper[7465]: I0320 08:39:10.487583 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.491587 master-0 kubenswrapper[7465]: I0320 08:39:10.490892 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/d88ba8e1-ee42-423f-9839-e71cb0265c6c-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.512371 master-0 kubenswrapper[7465]: I0320 
08:39:10.512304 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw4np\" (UniqueName: \"kubernetes.io/projected/d88ba8e1-ee42-423f-9839-e71cb0265c6c-kube-api-access-lw4np\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.519925 master-0 kubenswrapper[7465]: I0320 08:39:10.519871 7465 scope.go:117] "RemoveContainer" containerID="aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496" Mar 20 08:39:10.557749 master-0 kubenswrapper[7465]: I0320 08:39:10.557666 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4117cb69-45cf-4966-82b6-a31340c7db11" path="/var/lib/kubelet/pods/4117cb69-45cf-4966-82b6-a31340c7db11/volumes" Mar 20 08:39:10.558389 master-0 kubenswrapper[7465]: I0320 08:39:10.558245 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55cefc56-c008-45c1-a6ac-b1d3c8778c7b" path="/var/lib/kubelet/pods/55cefc56-c008-45c1-a6ac-b1d3c8778c7b/volumes" Mar 20 08:39:10.558937 master-0 kubenswrapper[7465]: I0320 08:39:10.558909 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e80b57-a0f3-4f6a-a022-457944c8f59b" path="/var/lib/kubelet/pods/75e80b57-a0f3-4f6a-a022-457944c8f59b/volumes" Mar 20 08:39:10.560058 master-0 kubenswrapper[7465]: I0320 08:39:10.560020 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c649d964-ba32-44e4-a3cb-06a285972d97" path="/var/lib/kubelet/pods/c649d964-ba32-44e4-a3cb-06a285972d97/volumes" Mar 20 08:39:10.567430 master-0 kubenswrapper[7465]: I0320 08:39:10.567352 7465 scope.go:117] "RemoveContainer" containerID="c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295" Mar 20 08:39:10.569676 master-0 kubenswrapper[7465]: E0320 08:39:10.568994 7465 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295\": container with ID starting with c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295 not found: ID does not exist" containerID="c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295" Mar 20 08:39:10.569676 master-0 kubenswrapper[7465]: I0320 08:39:10.569060 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295"} err="failed to get container status \"c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295\": rpc error: code = NotFound desc = could not find container \"c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295\": container with ID starting with c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295 not found: ID does not exist" Mar 20 08:39:10.569676 master-0 kubenswrapper[7465]: I0320 08:39:10.569110 7465 scope.go:117] "RemoveContainer" containerID="8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9" Mar 20 08:39:10.572260 master-0 kubenswrapper[7465]: E0320 08:39:10.570009 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9\": container with ID starting with 8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9 not found: ID does not exist" containerID="8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9" Mar 20 08:39:10.572260 master-0 kubenswrapper[7465]: I0320 08:39:10.570112 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9"} err="failed to get container status \"8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9\": rpc error: code = 
NotFound desc = could not find container \"8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9\": container with ID starting with 8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9 not found: ID does not exist" Mar 20 08:39:10.572260 master-0 kubenswrapper[7465]: I0320 08:39:10.570164 7465 scope.go:117] "RemoveContainer" containerID="aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496" Mar 20 08:39:10.572571 master-0 kubenswrapper[7465]: E0320 08:39:10.572496 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496\": container with ID starting with aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496 not found: ID does not exist" containerID="aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496" Mar 20 08:39:10.572731 master-0 kubenswrapper[7465]: I0320 08:39:10.572587 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496"} err="failed to get container status \"aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496\": rpc error: code = NotFound desc = could not find container \"aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496\": container with ID starting with aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496 not found: ID does not exist" Mar 20 08:39:10.572731 master-0 kubenswrapper[7465]: I0320 08:39:10.572658 7465 scope.go:117] "RemoveContainer" containerID="c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295" Mar 20 08:39:10.573793 master-0 kubenswrapper[7465]: I0320 08:39:10.573157 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295"} err="failed to get container status 
\"c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295\": rpc error: code = NotFound desc = could not find container \"c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295\": container with ID starting with c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295 not found: ID does not exist" Mar 20 08:39:10.573793 master-0 kubenswrapper[7465]: I0320 08:39:10.573217 7465 scope.go:117] "RemoveContainer" containerID="8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9" Mar 20 08:39:10.574739 master-0 kubenswrapper[7465]: I0320 08:39:10.574673 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9"} err="failed to get container status \"8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9\": rpc error: code = NotFound desc = could not find container \"8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9\": container with ID starting with 8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9 not found: ID does not exist" Mar 20 08:39:10.574739 master-0 kubenswrapper[7465]: I0320 08:39:10.574734 7465 scope.go:117] "RemoveContainer" containerID="aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496" Mar 20 08:39:10.578028 master-0 kubenswrapper[7465]: I0320 08:39:10.577979 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496"} err="failed to get container status \"aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496\": rpc error: code = NotFound desc = could not find container \"aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496\": container with ID starting with aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496 not found: ID does not exist" Mar 20 08:39:10.578028 master-0 kubenswrapper[7465]: I0320 08:39:10.578017 7465 
scope.go:117] "RemoveContainer" containerID="c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295" Mar 20 08:39:10.578694 master-0 kubenswrapper[7465]: I0320 08:39:10.578337 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295"} err="failed to get container status \"c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295\": rpc error: code = NotFound desc = could not find container \"c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295\": container with ID starting with c6fe865954cef2a8e789bbec1fd6d66aad6244915c28be8e8bd338769a72e295 not found: ID does not exist" Mar 20 08:39:10.578694 master-0 kubenswrapper[7465]: I0320 08:39:10.578364 7465 scope.go:117] "RemoveContainer" containerID="8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9" Mar 20 08:39:10.579326 master-0 kubenswrapper[7465]: I0320 08:39:10.579272 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9"} err="failed to get container status \"8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9\": rpc error: code = NotFound desc = could not find container \"8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9\": container with ID starting with 8caf5bd3d6db5c6dbb12fe0d6e13927b48212cc7e47720e604d18c36dd1227a9 not found: ID does not exist" Mar 20 08:39:10.579326 master-0 kubenswrapper[7465]: I0320 08:39:10.579325 7465 scope.go:117] "RemoveContainer" containerID="aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496" Mar 20 08:39:10.579750 master-0 kubenswrapper[7465]: I0320 08:39:10.579708 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496"} err="failed to get container status 
\"aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496\": rpc error: code = NotFound desc = could not find container \"aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496\": container with ID starting with aadf242cbfbc00b46e4740bca796e7b6471d22bea6c5d98ef757deac78038496 not found: ID does not exist" Mar 20 08:39:10.579750 master-0 kubenswrapper[7465]: I0320 08:39:10.579735 7465 scope.go:117] "RemoveContainer" containerID="49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166" Mar 20 08:39:10.588471 master-0 kubenswrapper[7465]: I0320 08:39:10.588418 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-auth-proxy-config\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.588564 master-0 kubenswrapper[7465]: I0320 08:39:10.588521 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-config\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.588605 master-0 kubenswrapper[7465]: I0320 08:39:10.588562 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxn2f\" (UniqueName: \"kubernetes.io/projected/ae39c09b-7aef-4615-8ced-0dcad39f23a5-kube-api-access-rxn2f\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.588605 master-0 kubenswrapper[7465]: I0320 08:39:10.588602 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ae39c09b-7aef-4615-8ced-0dcad39f23a5-machine-approver-tls\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.589530 master-0 kubenswrapper[7465]: I0320 08:39:10.589491 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-config\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.590654 master-0 kubenswrapper[7465]: I0320 08:39:10.590616 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-auth-proxy-config\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.593963 master-0 kubenswrapper[7465]: I0320 08:39:10.593908 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ae39c09b-7aef-4615-8ced-0dcad39f23a5-machine-approver-tls\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.605360 master-0 kubenswrapper[7465]: I0320 08:39:10.605012 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxn2f\" (UniqueName: \"kubernetes.io/projected/ae39c09b-7aef-4615-8ced-0dcad39f23a5-kube-api-access-rxn2f\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " 
pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.615700 master-0 kubenswrapper[7465]: I0320 08:39:10.615633 7465 scope.go:117] "RemoveContainer" containerID="3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1" Mar 20 08:39:10.633177 master-0 kubenswrapper[7465]: I0320 08:39:10.632854 7465 scope.go:117] "RemoveContainer" containerID="49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166" Mar 20 08:39:10.633840 master-0 kubenswrapper[7465]: E0320 08:39:10.633785 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166\": container with ID starting with 49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166 not found: ID does not exist" containerID="49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166" Mar 20 08:39:10.633955 master-0 kubenswrapper[7465]: I0320 08:39:10.633832 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166"} err="failed to get container status \"49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166\": rpc error: code = NotFound desc = could not find container \"49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166\": container with ID starting with 49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166 not found: ID does not exist" Mar 20 08:39:10.633955 master-0 kubenswrapper[7465]: I0320 08:39:10.633864 7465 scope.go:117] "RemoveContainer" containerID="3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1" Mar 20 08:39:10.635515 master-0 kubenswrapper[7465]: E0320 08:39:10.635467 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1\": container with ID starting with 3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1 not found: ID does not exist" containerID="3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1" Mar 20 08:39:10.635593 master-0 kubenswrapper[7465]: I0320 08:39:10.635509 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1"} err="failed to get container status \"3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1\": rpc error: code = NotFound desc = could not find container \"3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1\": container with ID starting with 3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1 not found: ID does not exist" Mar 20 08:39:10.635593 master-0 kubenswrapper[7465]: I0320 08:39:10.635531 7465 scope.go:117] "RemoveContainer" containerID="49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166" Mar 20 08:39:10.638374 master-0 kubenswrapper[7465]: I0320 08:39:10.638321 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166"} err="failed to get container status \"49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166\": rpc error: code = NotFound desc = could not find container \"49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166\": container with ID starting with 49a233d9f4be97d10757bd91e10e5cdb9328ae43931cc3e24b52a9ec55b39166 not found: ID does not exist" Mar 20 08:39:10.638374 master-0 kubenswrapper[7465]: I0320 08:39:10.638358 7465 scope.go:117] "RemoveContainer" containerID="3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1" Mar 20 08:39:10.638881 master-0 kubenswrapper[7465]: I0320 08:39:10.638709 7465 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1"} err="failed to get container status \"3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1\": rpc error: code = NotFound desc = could not find container \"3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1\": container with ID starting with 3f742e27e2d8abc0bf85fe0e0e08d34f0d7c9b1dc6f3bb30732cbe69c148a9a1 not found: ID does not exist" Mar 20 08:39:10.693780 master-0 kubenswrapper[7465]: I0320 08:39:10.693729 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_46f265536aba6292ead501bc9b49f327/kube-controller-manager/2.log" Mar 20 08:39:10.708842 master-0 kubenswrapper[7465]: I0320 08:39:10.708743 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:39:10.726356 master-0 kubenswrapper[7465]: W0320 08:39:10.726301 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd88ba8e1_ee42_423f_9839_e71cb0265c6c.slice/crio-49f836c59184940ba77efca8e250a75cfca3ff592389502fe02c69990736b85f WatchSource:0}: Error finding container 49f836c59184940ba77efca8e250a75cfca3ff592389502fe02c69990736b85f: Status 404 returned error can't find the container with id 49f836c59184940ba77efca8e250a75cfca3ff592389502fe02c69990736b85f Mar 20 08:39:10.792036 master-0 kubenswrapper[7465]: I0320 08:39:10.791860 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 
08:39:10.795860 master-0 kubenswrapper[7465]: I0320 08:39:10.795759 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:10.868617 master-0 kubenswrapper[7465]: I0320 08:39:10.868548 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:39:10.873710 master-0 kubenswrapper[7465]: I0320 08:39:10.873664 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:39:10.894260 master-0 kubenswrapper[7465]: I0320 08:39:10.894210 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_46f265536aba6292ead501bc9b49f327/cluster-policy-controller/0.log" Mar 20 08:39:10.912379 master-0 kubenswrapper[7465]: W0320 08:39:10.912319 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae39c09b_7aef_4615_8ced_0dcad39f23a5.slice/crio-93a839a770be652f7d12315bd6dada21638d2838dbf7e7ad327b9d696e2d3e07 WatchSource:0}: Error finding container 93a839a770be652f7d12315bd6dada21638d2838dbf7e7ad327b9d696e2d3e07: Status 404 returned error can't find the container with id 93a839a770be652f7d12315bd6dada21638d2838dbf7e7ad327b9d696e2d3e07 Mar 20 08:39:10.980347 master-0 kubenswrapper[7465]: I0320 08:39:10.979603 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Mar 20 08:39:10.980347 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:10.980347 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:10.980347 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:10.980347 master-0 kubenswrapper[7465]: I0320 08:39:10.979715 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:11.096411 master-0 kubenswrapper[7465]: I0320 08:39:11.096359 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-scheduler-master-0_c83737980b9ee109184b1d78e942cf36/kube-scheduler/0.log" Mar 20 08:39:11.209671 master-0 kubenswrapper[7465]: I0320 08:39:11.209616 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" event={"ID":"ae39c09b-7aef-4615-8ced-0dcad39f23a5","Type":"ContainerStarted","Data":"93a839a770be652f7d12315bd6dada21638d2838dbf7e7ad327b9d696e2d3e07"} Mar 20 08:39:11.211990 master-0 kubenswrapper[7465]: I0320 08:39:11.211939 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" event={"ID":"d88ba8e1-ee42-423f-9839-e71cb0265c6c","Type":"ContainerStarted","Data":"40d3c58441549fd94b4fd06f62f9b9e1bdfe941a93f1f046de6cd048124dc220"} Mar 20 08:39:11.212064 master-0 kubenswrapper[7465]: I0320 08:39:11.212003 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" event={"ID":"d88ba8e1-ee42-423f-9839-e71cb0265c6c","Type":"ContainerStarted","Data":"49f836c59184940ba77efca8e250a75cfca3ff592389502fe02c69990736b85f"} Mar 20 08:39:11.219758 master-0 
kubenswrapper[7465]: I0320 08:39:11.219708 7465 generic.go:334] "Generic (PLEG): container finished" podID="b639e578-628e-404d-b759-8b6e84e771d9" containerID="1270d65f2b1bd2bb8e9f27e0d20a7b179ff2340a8f906b8f6439c6b5966d578b" exitCode=0 Mar 20 08:39:11.219942 master-0 kubenswrapper[7465]: I0320 08:39:11.219800 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtqgc" event={"ID":"b639e578-628e-404d-b759-8b6e84e771d9","Type":"ContainerDied","Data":"1270d65f2b1bd2bb8e9f27e0d20a7b179ff2340a8f906b8f6439c6b5966d578b"} Mar 20 08:39:11.219942 master-0 kubenswrapper[7465]: I0320 08:39:11.219887 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtqgc" event={"ID":"b639e578-628e-404d-b759-8b6e84e771d9","Type":"ContainerStarted","Data":"d48534fe1c98270494577c8d49aed8602c14ccc175395517708a7b89389db471"} Mar 20 08:39:11.227760 master-0 kubenswrapper[7465]: I0320 08:39:11.227725 7465 generic.go:334] "Generic (PLEG): container finished" podID="c0a17669-a122-44aa-bdda-581bf1fc4649" containerID="b03eca8e9b81865f87dea3515203478115ad1b39533d7a34515e851d32bd2010" exitCode=0 Mar 20 08:39:11.228205 master-0 kubenswrapper[7465]: I0320 08:39:11.228114 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc955" event={"ID":"c0a17669-a122-44aa-bdda-581bf1fc4649","Type":"ContainerDied","Data":"b03eca8e9b81865f87dea3515203478115ad1b39533d7a34515e851d32bd2010"} Mar 20 08:39:11.228339 master-0 kubenswrapper[7465]: I0320 08:39:11.228317 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc955" event={"ID":"c0a17669-a122-44aa-bdda-581bf1fc4649","Type":"ContainerStarted","Data":"ec3f7a57e8d7aa7239f51fc0b75ccf091bb42e503457a1919c637dd65b9da53e"} Mar 20 08:39:11.295406 master-0 kubenswrapper[7465]: I0320 08:39:11.295338 7465 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_bootstrap-kube-scheduler-master-0_c83737980b9ee109184b1d78e942cf36/kube-scheduler/1.log" Mar 20 08:39:11.314754 master-0 kubenswrapper[7465]: W0320 08:39:11.314668 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf91d1788_027d_432b_be33_ca952a95046a.slice/crio-49245723e92395f35c0b36240f7d6fbf94cef777cdd886e69b77ece8cf42ea5f WatchSource:0}: Error finding container 49245723e92395f35c0b36240f7d6fbf94cef777cdd886e69b77ece8cf42ea5f: Status 404 returned error can't find the container with id 49245723e92395f35c0b36240f7d6fbf94cef777cdd886e69b77ece8cf42ea5f Mar 20 08:39:11.323571 master-0 kubenswrapper[7465]: I0320 08:39:11.323510 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-5m857"] Mar 20 08:39:11.491444 master-0 kubenswrapper[7465]: I0320 08:39:11.491032 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_fac672fa-7660-449e-a0d1-244dc6282d76/installer/0.log" Mar 20 08:39:11.573215 master-0 kubenswrapper[7465]: I0320 08:39:11.572394 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-srjqw"] Mar 20 08:39:11.573215 master-0 kubenswrapper[7465]: I0320 08:39:11.572850 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-srjqw" podUID="956a8ee3-2618-4f04-8d1c-97c3a4595c94" containerName="registry-server" containerID="cri-o://6b2daff08bafe9a34203a7f9995e51650ea9112e321da5c7bb3cda5cf1b124e2" gracePeriod=2 Mar 20 08:39:11.692704 master-0 kubenswrapper[7465]: I0320 08:39:11.692629 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-dddff6458-vx5d7_a57854ac-809a-4745-aaa1-774f0a08a560/kube-scheduler-operator-container/0.log" Mar 20 08:39:11.775151 
master-0 kubenswrapper[7465]: I0320 08:39:11.774402 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mgcb9"] Mar 20 08:39:11.775495 master-0 kubenswrapper[7465]: I0320 08:39:11.775244 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mgcb9" podUID="06159643-e9b8-417f-ba79-ee1e1ca5c951" containerName="registry-server" containerID="cri-o://af9431084ad652fa67903c07dacd4b5fe50459c9cb6c9858baf0da64d22c51f3" gracePeriod=2 Mar 20 08:39:11.980950 master-0 kubenswrapper[7465]: I0320 08:39:11.979857 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:11.980950 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:11.980950 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:11.980950 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:11.980950 master-0 kubenswrapper[7465]: I0320 08:39:11.979933 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:11.985097 master-0 kubenswrapper[7465]: I0320 08:39:11.985040 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hqqrk"] Mar 20 08:39:11.987994 master-0 kubenswrapper[7465]: I0320 08:39:11.986293 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:11.992922 master-0 kubenswrapper[7465]: I0320 08:39:11.992878 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-frnfd" Mar 20 08:39:11.997748 master-0 kubenswrapper[7465]: I0320 08:39:11.997710 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqqrk"] Mar 20 08:39:12.017171 master-0 kubenswrapper[7465]: I0320 08:39:12.017129 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcp8t\" (UniqueName: \"kubernetes.io/projected/654b5b1c-2764-415c-bb13-aa06899f4076-kube-api-access-xcp8t\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:12.017380 master-0 kubenswrapper[7465]: I0320 08:39:12.017355 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/654b5b1c-2764-415c-bb13-aa06899f4076-catalog-content\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:12.017492 master-0 kubenswrapper[7465]: I0320 08:39:12.017473 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/654b5b1c-2764-415c-bb13-aa06899f4076-utilities\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:12.017723 master-0 kubenswrapper[7465]: I0320 08:39:12.017202 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:39:12.096550 master-0 kubenswrapper[7465]: I0320 08:39:12.096429 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 20 08:39:12.123839 master-0 kubenswrapper[7465]: I0320 08:39:12.123735 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcp8t\" (UniqueName: \"kubernetes.io/projected/654b5b1c-2764-415c-bb13-aa06899f4076-kube-api-access-xcp8t\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:12.127778 master-0 kubenswrapper[7465]: I0320 08:39:12.126354 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/654b5b1c-2764-415c-bb13-aa06899f4076-catalog-content\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:12.127778 master-0 kubenswrapper[7465]: I0320 08:39:12.126463 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/654b5b1c-2764-415c-bb13-aa06899f4076-utilities\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:12.134142 master-0 kubenswrapper[7465]: I0320 08:39:12.128103 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/654b5b1c-2764-415c-bb13-aa06899f4076-catalog-content\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:12.135395 master-0 
kubenswrapper[7465]: I0320 08:39:12.135369 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/654b5b1c-2764-415c-bb13-aa06899f4076-utilities\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:12.152469 master-0 kubenswrapper[7465]: I0320 08:39:12.149465 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcp8t\" (UniqueName: \"kubernetes.io/projected/654b5b1c-2764-415c-bb13-aa06899f4076-kube-api-access-xcp8t\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:12.187618 master-0 kubenswrapper[7465]: I0320 08:39:12.187556 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jstrn"] Mar 20 08:39:12.188671 master-0 kubenswrapper[7465]: E0320 08:39:12.188630 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956a8ee3-2618-4f04-8d1c-97c3a4595c94" containerName="registry-server" Mar 20 08:39:12.188671 master-0 kubenswrapper[7465]: I0320 08:39:12.188668 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="956a8ee3-2618-4f04-8d1c-97c3a4595c94" containerName="registry-server" Mar 20 08:39:12.188771 master-0 kubenswrapper[7465]: E0320 08:39:12.188689 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956a8ee3-2618-4f04-8d1c-97c3a4595c94" containerName="extract-utilities" Mar 20 08:39:12.188771 master-0 kubenswrapper[7465]: I0320 08:39:12.188697 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="956a8ee3-2618-4f04-8d1c-97c3a4595c94" containerName="extract-utilities" Mar 20 08:39:12.188771 master-0 kubenswrapper[7465]: E0320 08:39:12.188713 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956a8ee3-2618-4f04-8d1c-97c3a4595c94" 
containerName="extract-content" Mar 20 08:39:12.188771 master-0 kubenswrapper[7465]: I0320 08:39:12.188720 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="956a8ee3-2618-4f04-8d1c-97c3a4595c94" containerName="extract-content" Mar 20 08:39:12.188919 master-0 kubenswrapper[7465]: I0320 08:39:12.188870 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="956a8ee3-2618-4f04-8d1c-97c3a4595c94" containerName="registry-server" Mar 20 08:39:12.203096 master-0 kubenswrapper[7465]: I0320 08:39:12.202477 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jstrn"] Mar 20 08:39:12.203096 master-0 kubenswrapper[7465]: I0320 08:39:12.202646 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:39:12.206284 master-0 kubenswrapper[7465]: I0320 08:39:12.205217 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-bksjt" Mar 20 08:39:12.236788 master-0 kubenswrapper[7465]: I0320 08:39:12.236736 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956a8ee3-2618-4f04-8d1c-97c3a4595c94-catalog-content\") pod \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\" (UID: \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\") " Mar 20 08:39:12.246376 master-0 kubenswrapper[7465]: I0320 08:39:12.243265 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgmtk\" (UniqueName: \"kubernetes.io/projected/956a8ee3-2618-4f04-8d1c-97c3a4595c94-kube-api-access-xgmtk\") pod \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\" (UID: \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\") " Mar 20 08:39:12.246870 master-0 kubenswrapper[7465]: I0320 08:39:12.246854 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/956a8ee3-2618-4f04-8d1c-97c3a4595c94-utilities\") pod \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\" (UID: \"956a8ee3-2618-4f04-8d1c-97c3a4595c94\") " Mar 20 08:39:12.247944 master-0 kubenswrapper[7465]: I0320 08:39:12.247925 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxmkh\" (UniqueName: \"kubernetes.io/projected/c593e31d-82b5-4d42-992e-6b050ccf3019-kube-api-access-gxmkh\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:39:12.249930 master-0 kubenswrapper[7465]: I0320 08:39:12.249910 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c593e31d-82b5-4d42-992e-6b050ccf3019-catalog-content\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:39:12.250263 master-0 kubenswrapper[7465]: I0320 08:39:12.250249 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c593e31d-82b5-4d42-992e-6b050ccf3019-utilities\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:39:12.251269 master-0 kubenswrapper[7465]: I0320 08:39:12.251108 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956a8ee3-2618-4f04-8d1c-97c3a4595c94-kube-api-access-xgmtk" (OuterVolumeSpecName: "kube-api-access-xgmtk") pod "956a8ee3-2618-4f04-8d1c-97c3a4595c94" (UID: "956a8ee3-2618-4f04-8d1c-97c3a4595c94"). InnerVolumeSpecName "kube-api-access-xgmtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:39:12.254605 master-0 kubenswrapper[7465]: I0320 08:39:12.252434 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mgcb9" Mar 20 08:39:12.255128 master-0 kubenswrapper[7465]: I0320 08:39:12.255101 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956a8ee3-2618-4f04-8d1c-97c3a4595c94-utilities" (OuterVolumeSpecName: "utilities") pod "956a8ee3-2618-4f04-8d1c-97c3a4595c94" (UID: "956a8ee3-2618-4f04-8d1c-97c3a4595c94"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:39:12.257897 master-0 kubenswrapper[7465]: I0320 08:39:12.257853 7465 generic.go:334] "Generic (PLEG): container finished" podID="956a8ee3-2618-4f04-8d1c-97c3a4595c94" containerID="6b2daff08bafe9a34203a7f9995e51650ea9112e321da5c7bb3cda5cf1b124e2" exitCode=0 Mar 20 08:39:12.257986 master-0 kubenswrapper[7465]: I0320 08:39:12.257933 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srjqw" event={"ID":"956a8ee3-2618-4f04-8d1c-97c3a4595c94","Type":"ContainerDied","Data":"6b2daff08bafe9a34203a7f9995e51650ea9112e321da5c7bb3cda5cf1b124e2"} Mar 20 08:39:12.257986 master-0 kubenswrapper[7465]: I0320 08:39:12.257972 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srjqw" event={"ID":"956a8ee3-2618-4f04-8d1c-97c3a4595c94","Type":"ContainerDied","Data":"13ee22fe72630c657645fbce4809a9a3e59153a0a482338b9c4adf8c528528a1"} Mar 20 08:39:12.258083 master-0 kubenswrapper[7465]: I0320 08:39:12.257995 7465 scope.go:117] "RemoveContainer" containerID="6b2daff08bafe9a34203a7f9995e51650ea9112e321da5c7bb3cda5cf1b124e2" Mar 20 08:39:12.258174 master-0 kubenswrapper[7465]: I0320 08:39:12.258154 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srjqw" Mar 20 08:39:12.264955 master-0 kubenswrapper[7465]: I0320 08:39:12.263575 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtqgc" event={"ID":"b639e578-628e-404d-b759-8b6e84e771d9","Type":"ContainerStarted","Data":"70cbec433b4a5afb013d99a248cda222d66b2abdceddd1d72d46fce02f57b45d"} Mar 20 08:39:12.266642 master-0 kubenswrapper[7465]: I0320 08:39:12.266451 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956a8ee3-2618-4f04-8d1c-97c3a4595c94-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "956a8ee3-2618-4f04-8d1c-97c3a4595c94" (UID: "956a8ee3-2618-4f04-8d1c-97c3a4595c94"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:39:12.273506 master-0 kubenswrapper[7465]: I0320 08:39:12.273396 7465 generic.go:334] "Generic (PLEG): container finished" podID="06159643-e9b8-417f-ba79-ee1e1ca5c951" containerID="af9431084ad652fa67903c07dacd4b5fe50459c9cb6c9858baf0da64d22c51f3" exitCode=0 Mar 20 08:39:12.273822 master-0 kubenswrapper[7465]: I0320 08:39:12.273534 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mgcb9" Mar 20 08:39:12.273963 master-0 kubenswrapper[7465]: I0320 08:39:12.273867 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mgcb9" event={"ID":"06159643-e9b8-417f-ba79-ee1e1ca5c951","Type":"ContainerDied","Data":"af9431084ad652fa67903c07dacd4b5fe50459c9cb6c9858baf0da64d22c51f3"} Mar 20 08:39:12.274074 master-0 kubenswrapper[7465]: I0320 08:39:12.273974 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mgcb9" event={"ID":"06159643-e9b8-417f-ba79-ee1e1ca5c951","Type":"ContainerDied","Data":"b48c4aacb301fc26474516047fb9f07667987577a2ed5332421858d27dce7d77"} Mar 20 08:39:12.277750 master-0 kubenswrapper[7465]: I0320 08:39:12.277693 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc955" event={"ID":"c0a17669-a122-44aa-bdda-581bf1fc4649","Type":"ContainerStarted","Data":"4d95b114dff245825e6087b6ba414ae9712434ff235bee5e21733cbc9dc925e4"} Mar 20 08:39:12.279321 master-0 kubenswrapper[7465]: I0320 08:39:12.279222 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" event={"ID":"f91d1788-027d-432b-be33-ca952a95046a","Type":"ContainerStarted","Data":"49245723e92395f35c0b36240f7d6fbf94cef777cdd886e69b77ece8cf42ea5f"} Mar 20 08:39:12.288257 master-0 kubenswrapper[7465]: I0320 08:39:12.285336 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" event={"ID":"ae39c09b-7aef-4615-8ced-0dcad39f23a5","Type":"ContainerStarted","Data":"0e60f2693cbc96c33931a792326fb808ba028038939cac58b0b52b50bec85ee7"} Mar 20 08:39:12.288257 master-0 kubenswrapper[7465]: I0320 08:39:12.285406 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" 
event={"ID":"ae39c09b-7aef-4615-8ced-0dcad39f23a5","Type":"ContainerStarted","Data":"62f67b84eaa6aca590ae16bc4212ddb118ad7d5cdbb373eca099fc3bf11c95b9"}
Mar 20 08:39:12.288257 master-0 kubenswrapper[7465]: I0320 08:39:12.285637 7465 scope.go:117] "RemoveContainer" containerID="162d6251581b5398c7cdd12c089739f6900bac67c42e07fd4143959fd82c1770"
Mar 20 08:39:12.291166 master-0 kubenswrapper[7465]: I0320 08:39:12.291140 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/setup/0.log"
Mar 20 08:39:12.316845 master-0 kubenswrapper[7465]: I0320 08:39:12.316602 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" event={"ID":"d88ba8e1-ee42-423f-9839-e71cb0265c6c","Type":"ContainerStarted","Data":"402362a050fc8c12c159a90a9bcc448b79348e6b94cef2fffbaa9a0b475c7274"}
Mar 20 08:39:12.318150 master-0 kubenswrapper[7465]: I0320 08:39:12.318130 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" event={"ID":"d88ba8e1-ee42-423f-9839-e71cb0265c6c","Type":"ContainerStarted","Data":"8849c0e374773ff413e6a07005d70c646b3dbfad2bb39cd593ab7f09dab9e689"}
Mar 20 08:39:12.319969 master-0 kubenswrapper[7465]: I0320 08:39:12.319950 7465 scope.go:117] "RemoveContainer" containerID="33768c74647cf1a7ac9859d965b50b5b37abd6757807573438c40398b17fc2b5"
Mar 20 08:39:12.337158 master-0 kubenswrapper[7465]: I0320 08:39:12.337021 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hqqrk"
Mar 20 08:39:12.347529 master-0 kubenswrapper[7465]: I0320 08:39:12.347508 7465 scope.go:117] "RemoveContainer" containerID="6b2daff08bafe9a34203a7f9995e51650ea9112e321da5c7bb3cda5cf1b124e2"
Mar 20 08:39:12.349561 master-0 kubenswrapper[7465]: E0320 08:39:12.349509 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b2daff08bafe9a34203a7f9995e51650ea9112e321da5c7bb3cda5cf1b124e2\": container with ID starting with 6b2daff08bafe9a34203a7f9995e51650ea9112e321da5c7bb3cda5cf1b124e2 not found: ID does not exist" containerID="6b2daff08bafe9a34203a7f9995e51650ea9112e321da5c7bb3cda5cf1b124e2"
Mar 20 08:39:12.349665 master-0 kubenswrapper[7465]: I0320 08:39:12.349581 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b2daff08bafe9a34203a7f9995e51650ea9112e321da5c7bb3cda5cf1b124e2"} err="failed to get container status \"6b2daff08bafe9a34203a7f9995e51650ea9112e321da5c7bb3cda5cf1b124e2\": rpc error: code = NotFound desc = could not find container \"6b2daff08bafe9a34203a7f9995e51650ea9112e321da5c7bb3cda5cf1b124e2\": container with ID starting with 6b2daff08bafe9a34203a7f9995e51650ea9112e321da5c7bb3cda5cf1b124e2 not found: ID does not exist"
Mar 20 08:39:12.349665 master-0 kubenswrapper[7465]: I0320 08:39:12.349627 7465 scope.go:117] "RemoveContainer" containerID="162d6251581b5398c7cdd12c089739f6900bac67c42e07fd4143959fd82c1770"
Mar 20 08:39:12.350239 master-0 kubenswrapper[7465]: E0320 08:39:12.350168 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"162d6251581b5398c7cdd12c089739f6900bac67c42e07fd4143959fd82c1770\": container with ID starting with 162d6251581b5398c7cdd12c089739f6900bac67c42e07fd4143959fd82c1770 not found: ID does not exist" containerID="162d6251581b5398c7cdd12c089739f6900bac67c42e07fd4143959fd82c1770"
Mar 20 08:39:12.350345 master-0 kubenswrapper[7465]: I0320 08:39:12.350228 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"162d6251581b5398c7cdd12c089739f6900bac67c42e07fd4143959fd82c1770"} err="failed to get container status \"162d6251581b5398c7cdd12c089739f6900bac67c42e07fd4143959fd82c1770\": rpc error: code = NotFound desc = could not find container \"162d6251581b5398c7cdd12c089739f6900bac67c42e07fd4143959fd82c1770\": container with ID starting with 162d6251581b5398c7cdd12c089739f6900bac67c42e07fd4143959fd82c1770 not found: ID does not exist"
Mar 20 08:39:12.350345 master-0 kubenswrapper[7465]: I0320 08:39:12.350254 7465 scope.go:117] "RemoveContainer" containerID="33768c74647cf1a7ac9859d965b50b5b37abd6757807573438c40398b17fc2b5"
Mar 20 08:39:12.350896 master-0 kubenswrapper[7465]: E0320 08:39:12.350841 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33768c74647cf1a7ac9859d965b50b5b37abd6757807573438c40398b17fc2b5\": container with ID starting with 33768c74647cf1a7ac9859d965b50b5b37abd6757807573438c40398b17fc2b5 not found: ID does not exist" containerID="33768c74647cf1a7ac9859d965b50b5b37abd6757807573438c40398b17fc2b5"
Mar 20 08:39:12.350962 master-0 kubenswrapper[7465]: I0320 08:39:12.350906 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33768c74647cf1a7ac9859d965b50b5b37abd6757807573438c40398b17fc2b5"} err="failed to get container status \"33768c74647cf1a7ac9859d965b50b5b37abd6757807573438c40398b17fc2b5\": rpc error: code = NotFound desc = could not find container \"33768c74647cf1a7ac9859d965b50b5b37abd6757807573438c40398b17fc2b5\": container with ID starting with 33768c74647cf1a7ac9859d965b50b5b37abd6757807573438c40398b17fc2b5 not found: ID does not exist"
Mar 20 08:39:12.350962 master-0 kubenswrapper[7465]: I0320 08:39:12.350948 7465 scope.go:117] "RemoveContainer" containerID="af9431084ad652fa67903c07dacd4b5fe50459c9cb6c9858baf0da64d22c51f3"
Mar 20 08:39:12.351173 master-0 kubenswrapper[7465]: I0320 08:39:12.351130 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06159643-e9b8-417f-ba79-ee1e1ca5c951-catalog-content\") pod \"06159643-e9b8-417f-ba79-ee1e1ca5c951\" (UID: \"06159643-e9b8-417f-ba79-ee1e1ca5c951\") "
Mar 20 08:39:12.351288 master-0 kubenswrapper[7465]: I0320 08:39:12.351263 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06159643-e9b8-417f-ba79-ee1e1ca5c951-utilities\") pod \"06159643-e9b8-417f-ba79-ee1e1ca5c951\" (UID: \"06159643-e9b8-417f-ba79-ee1e1ca5c951\") "
Mar 20 08:39:12.351335 master-0 kubenswrapper[7465]: I0320 08:39:12.351324 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s9bz\" (UniqueName: \"kubernetes.io/projected/06159643-e9b8-417f-ba79-ee1e1ca5c951-kube-api-access-8s9bz\") pod \"06159643-e9b8-417f-ba79-ee1e1ca5c951\" (UID: \"06159643-e9b8-417f-ba79-ee1e1ca5c951\") "
Mar 20 08:39:12.353239 master-0 kubenswrapper[7465]: I0320 08:39:12.352475 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06159643-e9b8-417f-ba79-ee1e1ca5c951-utilities" (OuterVolumeSpecName: "utilities") pod "06159643-e9b8-417f-ba79-ee1e1ca5c951" (UID: "06159643-e9b8-417f-ba79-ee1e1ca5c951"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:39:12.353239 master-0 kubenswrapper[7465]: I0320 08:39:12.353213 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c593e31d-82b5-4d42-992e-6b050ccf3019-catalog-content\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn"
Mar 20 08:39:12.353386 master-0 kubenswrapper[7465]: I0320 08:39:12.353302 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c593e31d-82b5-4d42-992e-6b050ccf3019-utilities\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn"
Mar 20 08:39:12.353633 master-0 kubenswrapper[7465]: I0320 08:39:12.353590 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxmkh\" (UniqueName: \"kubernetes.io/projected/c593e31d-82b5-4d42-992e-6b050ccf3019-kube-api-access-gxmkh\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn"
Mar 20 08:39:12.353871 master-0 kubenswrapper[7465]: I0320 08:39:12.353809 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c593e31d-82b5-4d42-992e-6b050ccf3019-utilities\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn"
Mar 20 08:39:12.354052 master-0 kubenswrapper[7465]: I0320 08:39:12.354023 7465 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956a8ee3-2618-4f04-8d1c-97c3a4595c94-catalog-content\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:12.354112 master-0 kubenswrapper[7465]: I0320 08:39:12.354086 7465 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06159643-e9b8-417f-ba79-ee1e1ca5c951-utilities\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:12.354162 master-0 kubenswrapper[7465]: I0320 08:39:12.354118 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgmtk\" (UniqueName: \"kubernetes.io/projected/956a8ee3-2618-4f04-8d1c-97c3a4595c94-kube-api-access-xgmtk\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:12.354162 master-0 kubenswrapper[7465]: I0320 08:39:12.354140 7465 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956a8ee3-2618-4f04-8d1c-97c3a4595c94-utilities\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:12.354729 master-0 kubenswrapper[7465]: I0320 08:39:12.354701 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c593e31d-82b5-4d42-992e-6b050ccf3019-catalog-content\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn"
Mar 20 08:39:12.355332 master-0 kubenswrapper[7465]: I0320 08:39:12.355284 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06159643-e9b8-417f-ba79-ee1e1ca5c951-kube-api-access-8s9bz" (OuterVolumeSpecName: "kube-api-access-8s9bz") pod "06159643-e9b8-417f-ba79-ee1e1ca5c951" (UID: "06159643-e9b8-417f-ba79-ee1e1ca5c951"). InnerVolumeSpecName "kube-api-access-8s9bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:39:12.378502 master-0 kubenswrapper[7465]: I0320 08:39:12.377724 7465 scope.go:117] "RemoveContainer" containerID="e7321b68cd658b736e2e6b297892a77f8a40c4d3d188ee1bd535ab60a2be4cf0"
Mar 20 08:39:12.380479 master-0 kubenswrapper[7465]: I0320 08:39:12.380400 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" podStartSLOduration=2.380369922 podStartE2EDuration="2.380369922s" podCreationTimestamp="2026-03-20 08:39:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:39:12.378748024 +0000 UTC m=+178.022063524" watchObservedRunningTime="2026-03-20 08:39:12.380369922 +0000 UTC m=+178.023685412"
Mar 20 08:39:12.392075 master-0 kubenswrapper[7465]: I0320 08:39:12.391638 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxmkh\" (UniqueName: \"kubernetes.io/projected/c593e31d-82b5-4d42-992e-6b050ccf3019-kube-api-access-gxmkh\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn"
Mar 20 08:39:12.410049 master-0 kubenswrapper[7465]: I0320 08:39:12.409404 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" podStartSLOduration=2.409372468 podStartE2EDuration="2.409372468s" podCreationTimestamp="2026-03-20 08:39:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:39:12.403722933 +0000 UTC m=+178.047038433" watchObservedRunningTime="2026-03-20 08:39:12.409372468 +0000 UTC m=+178.052687958"
Mar 20 08:39:12.434221 master-0 kubenswrapper[7465]: I0320 08:39:12.434155 7465 scope.go:117] "RemoveContainer" containerID="e009b9f493edc688ad18ed0c51c3e2a8f79f6c2dff58a7c93226076a653427d7"
Mar 20 08:39:12.457637 master-0 kubenswrapper[7465]: I0320 08:39:12.456929 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s9bz\" (UniqueName: \"kubernetes.io/projected/06159643-e9b8-417f-ba79-ee1e1ca5c951-kube-api-access-8s9bz\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:12.483737 master-0 kubenswrapper[7465]: I0320 08:39:12.483682 7465 scope.go:117] "RemoveContainer" containerID="af9431084ad652fa67903c07dacd4b5fe50459c9cb6c9858baf0da64d22c51f3"
Mar 20 08:39:12.487599 master-0 kubenswrapper[7465]: E0320 08:39:12.486683 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af9431084ad652fa67903c07dacd4b5fe50459c9cb6c9858baf0da64d22c51f3\": container with ID starting with af9431084ad652fa67903c07dacd4b5fe50459c9cb6c9858baf0da64d22c51f3 not found: ID does not exist" containerID="af9431084ad652fa67903c07dacd4b5fe50459c9cb6c9858baf0da64d22c51f3"
Mar 20 08:39:12.487599 master-0 kubenswrapper[7465]: I0320 08:39:12.486782 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9431084ad652fa67903c07dacd4b5fe50459c9cb6c9858baf0da64d22c51f3"} err="failed to get container status \"af9431084ad652fa67903c07dacd4b5fe50459c9cb6c9858baf0da64d22c51f3\": rpc error: code = NotFound desc = could not find container \"af9431084ad652fa67903c07dacd4b5fe50459c9cb6c9858baf0da64d22c51f3\": container with ID starting with af9431084ad652fa67903c07dacd4b5fe50459c9cb6c9858baf0da64d22c51f3 not found: ID does not exist"
Mar 20 08:39:12.487599 master-0 kubenswrapper[7465]: I0320 08:39:12.486869 7465 scope.go:117] "RemoveContainer" containerID="e7321b68cd658b736e2e6b297892a77f8a40c4d3d188ee1bd535ab60a2be4cf0"
Mar 20 08:39:12.489058 master-0 kubenswrapper[7465]: E0320 08:39:12.488939 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7321b68cd658b736e2e6b297892a77f8a40c4d3d188ee1bd535ab60a2be4cf0\": container with ID starting with e7321b68cd658b736e2e6b297892a77f8a40c4d3d188ee1bd535ab60a2be4cf0 not found: ID does not exist" containerID="e7321b68cd658b736e2e6b297892a77f8a40c4d3d188ee1bd535ab60a2be4cf0"
Mar 20 08:39:12.489058 master-0 kubenswrapper[7465]: I0320 08:39:12.488968 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7321b68cd658b736e2e6b297892a77f8a40c4d3d188ee1bd535ab60a2be4cf0"} err="failed to get container status \"e7321b68cd658b736e2e6b297892a77f8a40c4d3d188ee1bd535ab60a2be4cf0\": rpc error: code = NotFound desc = could not find container \"e7321b68cd658b736e2e6b297892a77f8a40c4d3d188ee1bd535ab60a2be4cf0\": container with ID starting with e7321b68cd658b736e2e6b297892a77f8a40c4d3d188ee1bd535ab60a2be4cf0 not found: ID does not exist"
Mar 20 08:39:12.489058 master-0 kubenswrapper[7465]: I0320 08:39:12.488986 7465 scope.go:117] "RemoveContainer" containerID="e009b9f493edc688ad18ed0c51c3e2a8f79f6c2dff58a7c93226076a653427d7"
Mar 20 08:39:12.489644 master-0 kubenswrapper[7465]: E0320 08:39:12.489592 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e009b9f493edc688ad18ed0c51c3e2a8f79f6c2dff58a7c93226076a653427d7\": container with ID starting with e009b9f493edc688ad18ed0c51c3e2a8f79f6c2dff58a7c93226076a653427d7 not found: ID does not exist" containerID="e009b9f493edc688ad18ed0c51c3e2a8f79f6c2dff58a7c93226076a653427d7"
Mar 20 08:39:12.489644 master-0 kubenswrapper[7465]: I0320 08:39:12.489631 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e009b9f493edc688ad18ed0c51c3e2a8f79f6c2dff58a7c93226076a653427d7"} err="failed to get container status \"e009b9f493edc688ad18ed0c51c3e2a8f79f6c2dff58a7c93226076a653427d7\": rpc error: code = NotFound desc = could not find container \"e009b9f493edc688ad18ed0c51c3e2a8f79f6c2dff58a7c93226076a653427d7\": container with ID starting with e009b9f493edc688ad18ed0c51c3e2a8f79f6c2dff58a7c93226076a653427d7 not found: ID does not exist"
Mar 20 08:39:12.494386 master-0 kubenswrapper[7465]: I0320 08:39:12.494267 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/3.log"
Mar 20 08:39:12.538494 master-0 kubenswrapper[7465]: I0320 08:39:12.538211 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jstrn"
Mar 20 08:39:12.553388 master-0 kubenswrapper[7465]: I0320 08:39:12.553159 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06159643-e9b8-417f-ba79-ee1e1ca5c951-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06159643-e9b8-417f-ba79-ee1e1ca5c951" (UID: "06159643-e9b8-417f-ba79-ee1e1ca5c951"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 08:39:12.558926 master-0 kubenswrapper[7465]: I0320 08:39:12.558890 7465 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06159643-e9b8-417f-ba79-ee1e1ca5c951-catalog-content\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:12.587415 master-0 kubenswrapper[7465]: I0320 08:39:12.587351 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-srjqw"]
Mar 20 08:39:12.593166 master-0 kubenswrapper[7465]: I0320 08:39:12.593114 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-srjqw"]
Mar 20 08:39:12.614435 master-0 kubenswrapper[7465]: I0320 08:39:12.614281 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mgcb9"]
Mar 20 08:39:12.621668 master-0 kubenswrapper[7465]: I0320 08:39:12.621606 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mgcb9"]
Mar 20 08:39:12.689893 master-0 kubenswrapper[7465]: I0320 08:39:12.689795 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-controller-b4f87c5b9-pj7rj_de6078d7-2aad-46fe-b17a-b6b38e4eaa41/machine-config-controller/0.log"
Mar 20 08:39:12.807363 master-0 kubenswrapper[7465]: I0320 08:39:12.807025 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hqqrk"]
Mar 20 08:39:12.811294 master-0 kubenswrapper[7465]: W0320 08:39:12.811253 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod654b5b1c_2764_415c_bb13_aa06899f4076.slice/crio-4308310cb66871b8d038228451f6cb347ae9fe6f0a69349b4baf59dc1b20775d WatchSource:0}: Error finding container 4308310cb66871b8d038228451f6cb347ae9fe6f0a69349b4baf59dc1b20775d: Status 404 returned error can't find the container with id 4308310cb66871b8d038228451f6cb347ae9fe6f0a69349b4baf59dc1b20775d
Mar 20 08:39:12.828468 master-0 kubenswrapper[7465]: I0320 08:39:12.828401 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jstrn"]
Mar 20 08:39:12.841209 master-0 kubenswrapper[7465]: W0320 08:39:12.841146 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc593e31d_82b5_4d42_992e_6b050ccf3019.slice/crio-ef4e9117db9997ed5777e4eb0ac97e214bfa4a8d0f2ec4d63af464bd290bf782 WatchSource:0}: Error finding container ef4e9117db9997ed5777e4eb0ac97e214bfa4a8d0f2ec4d63af464bd290bf782: Status 404 returned error can't find the container with id ef4e9117db9997ed5777e4eb0ac97e214bfa4a8d0f2ec4d63af464bd290bf782
Mar 20 08:39:12.887888 master-0 kubenswrapper[7465]: I0320 08:39:12.887857 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-controller-b4f87c5b9-pj7rj_de6078d7-2aad-46fe-b17a-b6b38e4eaa41/kube-rbac-proxy/0.log"
Mar 20 08:39:12.980272 master-0 kubenswrapper[7465]: I0320 08:39:12.979827 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:39:12.980272 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:39:12.980272 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:39:12.980272 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:39:12.980272 master-0 kubenswrapper[7465]: I0320 08:39:12.979925 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:39:13.088737 master-0 kubenswrapper[7465]: I0320 08:39:13.088667 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-daemon-9t8x6_f5782718-9118-4682-a287-7998cd0304b3/machine-config-daemon/0.log"
Mar 20 08:39:13.286406 master-0 kubenswrapper[7465]: I0320 08:39:13.286262 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-daemon-9t8x6_f5782718-9118-4682-a287-7998cd0304b3/kube-rbac-proxy/0.log"
Mar 20 08:39:13.326980 master-0 kubenswrapper[7465]: I0320 08:39:13.326897 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jstrn" event={"ID":"c593e31d-82b5-4d42-992e-6b050ccf3019","Type":"ContainerStarted","Data":"633b6e62526e4e0cbd07fd4d4b0af4afaceded4e0aa25ebac529d888061b8a40"}
Mar 20 08:39:13.326980 master-0 kubenswrapper[7465]: I0320 08:39:13.326963 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jstrn" event={"ID":"c593e31d-82b5-4d42-992e-6b050ccf3019","Type":"ContainerStarted","Data":"ef4e9117db9997ed5777e4eb0ac97e214bfa4a8d0f2ec4d63af464bd290bf782"}
Mar 20 08:39:13.332534 master-0 kubenswrapper[7465]: I0320 08:39:13.331868 7465 generic.go:334] "Generic (PLEG): container finished" podID="b639e578-628e-404d-b759-8b6e84e771d9" containerID="70cbec433b4a5afb013d99a248cda222d66b2abdceddd1d72d46fce02f57b45d" exitCode=0
Mar 20 08:39:13.332534 master-0 kubenswrapper[7465]: I0320 08:39:13.331925 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtqgc" event={"ID":"b639e578-628e-404d-b759-8b6e84e771d9","Type":"ContainerDied","Data":"70cbec433b4a5afb013d99a248cda222d66b2abdceddd1d72d46fce02f57b45d"}
Mar 20 08:39:13.335512 master-0 kubenswrapper[7465]: I0320 08:39:13.335456 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqqrk" event={"ID":"654b5b1c-2764-415c-bb13-aa06899f4076","Type":"ContainerStarted","Data":"ba2c385399d075f7d3be23f2cb6f802e608ef356862884d90b8c839e8667b5b3"}
Mar 20 08:39:13.335512 master-0 kubenswrapper[7465]: I0320 08:39:13.335502 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqqrk" event={"ID":"654b5b1c-2764-415c-bb13-aa06899f4076","Type":"ContainerStarted","Data":"4308310cb66871b8d038228451f6cb347ae9fe6f0a69349b4baf59dc1b20775d"}
Mar 20 08:39:13.338075 master-0 kubenswrapper[7465]: I0320 08:39:13.338036 7465 generic.go:334] "Generic (PLEG): container finished" podID="c0a17669-a122-44aa-bdda-581bf1fc4649" containerID="4d95b114dff245825e6087b6ba414ae9712434ff235bee5e21733cbc9dc925e4" exitCode=0
Mar 20 08:39:13.338928 master-0 kubenswrapper[7465]: I0320 08:39:13.338901 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc955" event={"ID":"c0a17669-a122-44aa-bdda-581bf1fc4649","Type":"ContainerDied","Data":"4d95b114dff245825e6087b6ba414ae9712434ff235bee5e21733cbc9dc925e4"}
Mar 20 08:39:13.496740 master-0 kubenswrapper[7465]: I0320 08:39:13.495736 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-operator-84d549f6d5-gm4qr_42df77ec-94aa-48ba-bb35-7b1f1e8b8e97/machine-config-operator/0.log"
Mar 20 08:39:13.688026 master-0 kubenswrapper[7465]: I0320 08:39:13.687905 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-operator-84d549f6d5-gm4qr_42df77ec-94aa-48ba-bb35-7b1f1e8b8e97/kube-rbac-proxy/0.log"
Mar 20 08:39:13.885056 master-0 kubenswrapper[7465]: I0320 08:39:13.884982 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_machine-config-server-gj4pm_4ddac301-a604-4f07-8849-5928befd336e/machine-config-server/0.log"
Mar 20 08:39:13.978107 master-0 kubenswrapper[7465]: I0320 08:39:13.978025 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:39:13.978107 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:39:13.978107 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:39:13.978107 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:39:13.978363 master-0 kubenswrapper[7465]: I0320 08:39:13.978139 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:39:14.090528 master-0 kubenswrapper[7465]: I0320 08:39:14.090405 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-th2vj_325f0a83-d56d-4b62-977b-088a7d5f0e00/openshift-apiserver-operator/0.log"
Mar 20 08:39:14.289133 master-0 kubenswrapper[7465]: I0320 08:39:14.289060 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-th2vj_325f0a83-d56d-4b62-977b-088a7d5f0e00/openshift-apiserver-operator/1.log"
Mar 20 08:39:14.349116 master-0 kubenswrapper[7465]: I0320 08:39:14.348955 7465 generic.go:334] "Generic (PLEG): container finished" podID="654b5b1c-2764-415c-bb13-aa06899f4076" containerID="ba2c385399d075f7d3be23f2cb6f802e608ef356862884d90b8c839e8667b5b3" exitCode=0
Mar 20 08:39:14.349116 master-0 kubenswrapper[7465]: I0320 08:39:14.349051 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqqrk" event={"ID":"654b5b1c-2764-415c-bb13-aa06899f4076","Type":"ContainerDied","Data":"ba2c385399d075f7d3be23f2cb6f802e608ef356862884d90b8c839e8667b5b3"}
Mar 20 08:39:14.354356 master-0 kubenswrapper[7465]: I0320 08:39:14.353176 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc955" event={"ID":"c0a17669-a122-44aa-bdda-581bf1fc4649","Type":"ContainerStarted","Data":"efcd91e8fee4e2c0e31de5da275313b21efe9ca0b897e5b0a39fdcdb9033ff18"}
Mar 20 08:39:14.359269 master-0 kubenswrapper[7465]: I0320 08:39:14.359216 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" event={"ID":"f91d1788-027d-432b-be33-ca952a95046a","Type":"ContainerStarted","Data":"a66c13b8d3c8a27dd4cd87a525d5a24a89e0e07f3750199c7db475093b70bb91"}
Mar 20 08:39:14.359269 master-0 kubenswrapper[7465]: I0320 08:39:14.359257 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" event={"ID":"f91d1788-027d-432b-be33-ca952a95046a","Type":"ContainerStarted","Data":"b94c08b7e5587b5832ecd9153622e9d6c7645f7646fa587d7cb88f5fb9199df4"}
Mar 20 08:39:14.361426 master-0 kubenswrapper[7465]: I0320 08:39:14.361358 7465 generic.go:334] "Generic (PLEG): container finished" podID="c593e31d-82b5-4d42-992e-6b050ccf3019" containerID="633b6e62526e4e0cbd07fd4d4b0af4afaceded4e0aa25ebac529d888061b8a40" exitCode=0
Mar 20 08:39:14.361535 master-0 kubenswrapper[7465]: I0320 08:39:14.361442 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jstrn" event={"ID":"c593e31d-82b5-4d42-992e-6b050ccf3019","Type":"ContainerDied","Data":"633b6e62526e4e0cbd07fd4d4b0af4afaceded4e0aa25ebac529d888061b8a40"}
Mar 20 08:39:14.365897 master-0 kubenswrapper[7465]: I0320 08:39:14.365828 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtqgc" event={"ID":"b639e578-628e-404d-b759-8b6e84e771d9","Type":"ContainerStarted","Data":"b8bdf077983bcc6ca23493b1788cffc2d1c4bb1c6018cd76d67efa10f3e3c4d0"}
Mar 20 08:39:14.405127 master-0 kubenswrapper[7465]: I0320 08:39:14.405025 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cc955" podStartSLOduration=2.858805711 podStartE2EDuration="5.404997726s" podCreationTimestamp="2026-03-20 08:39:09 +0000 UTC" firstStartedPulling="2026-03-20 08:39:11.229507727 +0000 UTC m=+176.872823217" lastFinishedPulling="2026-03-20 08:39:13.775699732 +0000 UTC m=+179.419015232" observedRunningTime="2026-03-20 08:39:14.401287808 +0000 UTC m=+180.044603308" watchObservedRunningTime="2026-03-20 08:39:14.404997726 +0000 UTC m=+180.048313206"
Mar 20 08:39:14.435846 master-0 kubenswrapper[7465]: I0320 08:39:14.435341 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dtqgc" podStartSLOduration=2.750411069 podStartE2EDuration="5.435308091s" podCreationTimestamp="2026-03-20 08:39:09 +0000 UTC" firstStartedPulling="2026-03-20 08:39:11.22276255 +0000 UTC m=+176.866078040" lastFinishedPulling="2026-03-20 08:39:13.907659572 +0000 UTC m=+179.550975062" observedRunningTime="2026-03-20 08:39:14.428302036 +0000 UTC m=+180.071617536" watchObservedRunningTime="2026-03-20 08:39:14.435308091 +0000 UTC m=+180.078623581"
Mar 20 08:39:14.476004 master-0 kubenswrapper[7465]: I0320 08:39:14.475904 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" podStartSLOduration=3.3847765819999998 podStartE2EDuration="5.475877035s" podCreationTimestamp="2026-03-20 08:39:09 +0000 UTC" firstStartedPulling="2026-03-20 08:39:11.321398739 +0000 UTC m=+176.964714229" lastFinishedPulling="2026-03-20 08:39:13.412499192 +0000 UTC m=+179.055814682" observedRunningTime="2026-03-20 08:39:14.473740653 +0000 UTC m=+180.117056153" watchObservedRunningTime="2026-03-20 08:39:14.475877035 +0000 UTC m=+180.119192535"
Mar 20 08:39:14.488414 master-0 kubenswrapper[7465]: I0320 08:39:14.488362 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-779f85678d-lrzfz_46de2acc-9f5d-4ecf-befe-a480f86466f5/fix-audit-permissions/0.log"
Mar 20 08:39:14.547960 master-0 kubenswrapper[7465]: I0320 08:39:14.547893 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06159643-e9b8-417f-ba79-ee1e1ca5c951" path="/var/lib/kubelet/pods/06159643-e9b8-417f-ba79-ee1e1ca5c951/volumes"
Mar 20 08:39:14.552199 master-0 kubenswrapper[7465]: I0320 08:39:14.549139 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956a8ee3-2618-4f04-8d1c-97c3a4595c94" path="/var/lib/kubelet/pods/956a8ee3-2618-4f04-8d1c-97c3a4595c94/volumes"
Mar 20 08:39:14.686798 master-0 kubenswrapper[7465]: I0320 08:39:14.686738 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-779f85678d-lrzfz_46de2acc-9f5d-4ecf-befe-a480f86466f5/openshift-apiserver/0.log"
Mar 20 08:39:14.887213 master-0 kubenswrapper[7465]: I0320 08:39:14.887030 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-779f85678d-lrzfz_46de2acc-9f5d-4ecf-befe-a480f86466f5/openshift-apiserver-check-endpoints/0.log"
Mar 20 08:39:15.018873 master-0 kubenswrapper[7465]: I0320 08:39:15.018802 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:39:15.018873 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:39:15.018873 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:39:15.018873 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:39:15.019299 master-0 kubenswrapper[7465]: I0320 08:39:15.018893 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:39:15.085956 master-0 kubenswrapper[7465]: I0320 08:39:15.085923 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-brhw4_f046860d-2d54-4746-8ba2-f8e90fa55e38/etcd-operator/0.log"
Mar 20 08:39:15.291757 master-0 kubenswrapper[7465]: I0320 08:39:15.291697 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-brhw4_f046860d-2d54-4746-8ba2-f8e90fa55e38/etcd-operator/1.log"
Mar 20 08:39:15.378358 master-0 kubenswrapper[7465]: I0320 08:39:15.378248 7465 generic.go:334] "Generic (PLEG): container finished" podID="654b5b1c-2764-415c-bb13-aa06899f4076" containerID="40882309ca6cfeb4b89a668f255895a49cb18211d2d4c38846d98d1ae8591f1f" exitCode=0
Mar 20 08:39:15.378848 master-0 kubenswrapper[7465]: I0320 08:39:15.378779 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqqrk" event={"ID":"654b5b1c-2764-415c-bb13-aa06899f4076","Type":"ContainerDied","Data":"40882309ca6cfeb4b89a668f255895a49cb18211d2d4c38846d98d1ae8591f1f"}
Mar 20 08:39:15.491316 master-0 kubenswrapper[7465]: I0320 08:39:15.491137 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-p7pt6_68252533-bd64-4fc5-838a-cc350cbe77f0/openshift-controller-manager-operator/0.log"
Mar 20 08:39:15.686684 master-0 kubenswrapper[7465]: I0320 08:39:15.686618 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-p7pt6_68252533-bd64-4fc5-838a-cc350cbe77f0/openshift-controller-manager-operator/1.log"
Mar 20 08:39:15.890602 master-0 kubenswrapper[7465]: I0320 08:39:15.890531 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-fc56bb77c-qd4sn_a9a9ecf2-c476-4962-8333-21f242dbcb89/controller-manager/0.log"
Mar 20 08:39:15.978448 master-0 kubenswrapper[7465]: I0320 08:39:15.978347 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:39:15.978448 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:39:15.978448 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:39:15.978448 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:39:15.978850 master-0 kubenswrapper[7465]: I0320 08:39:15.978465 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:39:16.091174 master-0 kubenswrapper[7465]: I0320 08:39:16.091095 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-56f686584b-fdcx5_a638c468-010c-4da3-ad62-26f5f2bbdbb9/route-controller-manager/0.log"
Mar 20 08:39:16.296614 master-0 kubenswrapper[7465]: I0320 08:39:16.296566 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-68f85b4d6c-fzm28_df428d5a-c722-4536-8e7f-cdd85c560481/catalog-operator/0.log"
Mar 20 08:39:16.355163 master-0 kubenswrapper[7465]: I0320 08:39:16.353576 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv"]
Mar 20 08:39:16.355163 master-0 kubenswrapper[7465]: E0320 08:39:16.353875 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06159643-e9b8-417f-ba79-ee1e1ca5c951" containerName="extract-utilities"
Mar 20 08:39:16.355163 master-0 kubenswrapper[7465]: I0320 08:39:16.353888 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="06159643-e9b8-417f-ba79-ee1e1ca5c951" containerName="extract-utilities"
Mar 20 08:39:16.355163 master-0 kubenswrapper[7465]: E0320 08:39:16.353906 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06159643-e9b8-417f-ba79-ee1e1ca5c951" containerName="extract-content"
Mar 20 08:39:16.355163 master-0 kubenswrapper[7465]: I0320 08:39:16.353913 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="06159643-e9b8-417f-ba79-ee1e1ca5c951" containerName="extract-content"
Mar 20 08:39:16.355163 master-0 kubenswrapper[7465]: E0320 08:39:16.353929 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06159643-e9b8-417f-ba79-ee1e1ca5c951" containerName="registry-server"
Mar 20 08:39:16.355163 master-0 kubenswrapper[7465]: I0320 08:39:16.353935 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="06159643-e9b8-417f-ba79-ee1e1ca5c951" containerName="registry-server"
Mar 20 08:39:16.355163 master-0 kubenswrapper[7465]: I0320 08:39:16.354049 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="06159643-e9b8-417f-ba79-ee1e1ca5c951" containerName="registry-server"
Mar 20 08:39:16.355163 master-0 kubenswrapper[7465]: I0320 08:39:16.354941 7465 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:39:16.359810 master-0 kubenswrapper[7465]: I0320 08:39:16.357817 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 20 08:39:16.359810 master-0 kubenswrapper[7465]: I0320 08:39:16.358177 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-n6dht" Mar 20 08:39:16.371068 master-0 kubenswrapper[7465]: I0320 08:39:16.370901 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 20 08:39:16.375101 master-0 kubenswrapper[7465]: I0320 08:39:16.375047 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv"] Mar 20 08:39:16.390858 master-0 kubenswrapper[7465]: I0320 08:39:16.390752 7465 generic.go:334] "Generic (PLEG): container finished" podID="c593e31d-82b5-4d42-992e-6b050ccf3019" containerID="06e5d2b0055041a2ae0e49aa4151d374fab625e94cc004550ff8ef85c3bfe80e" exitCode=0 Mar 20 08:39:16.391113 master-0 kubenswrapper[7465]: I0320 08:39:16.390878 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jstrn" event={"ID":"c593e31d-82b5-4d42-992e-6b050ccf3019","Type":"ContainerDied","Data":"06e5d2b0055041a2ae0e49aa4151d374fab625e94cc004550ff8ef85c3bfe80e"} Mar 20 08:39:16.395325 master-0 kubenswrapper[7465]: I0320 08:39:16.395216 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqqrk" event={"ID":"654b5b1c-2764-415c-bb13-aa06899f4076","Type":"ContainerStarted","Data":"3067a13f78d11d596f1c77026b3bb1da6fa4f2d79a95bf33c69377015a27bf8d"} Mar 20 08:39:16.445953 master-0 kubenswrapper[7465]: I0320 08:39:16.443438 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1db4d695-5a6a-4fbe-b610-3777bfebed79-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:39:16.445953 master-0 kubenswrapper[7465]: I0320 08:39:16.443569 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whmmk\" (UniqueName: \"kubernetes.io/projected/1db4d695-5a6a-4fbe-b610-3777bfebed79-kube-api-access-whmmk\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:39:16.445953 master-0 kubenswrapper[7465]: I0320 08:39:16.443656 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:39:16.445953 master-0 kubenswrapper[7465]: I0320 08:39:16.443689 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:39:16.449200 master-0 kubenswrapper[7465]: I0320 08:39:16.447993 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"] Mar 20 08:39:16.449588 master-0 
kubenswrapper[7465]: I0320 08:39:16.449553 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.453211 master-0 kubenswrapper[7465]: I0320 08:39:16.452767 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lb4t5"] Mar 20 08:39:16.457370 master-0 kubenswrapper[7465]: I0320 08:39:16.455398 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 20 08:39:16.457370 master-0 kubenswrapper[7465]: I0320 08:39:16.455655 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 20 08:39:16.457370 master-0 kubenswrapper[7465]: I0320 08:39:16.455927 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 20 08:39:16.457740 master-0 kubenswrapper[7465]: I0320 08:39:16.457671 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-js69c" Mar 20 08:39:16.462045 master-0 kubenswrapper[7465]: I0320 08:39:16.461999 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"] Mar 20 08:39:16.462267 master-0 kubenswrapper[7465]: I0320 08:39:16.462241 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.467002 master-0 kubenswrapper[7465]: I0320 08:39:16.466961 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 20 08:39:16.467266 master-0 kubenswrapper[7465]: I0320 08:39:16.467245 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-62tl6" Mar 20 08:39:16.471982 master-0 kubenswrapper[7465]: I0320 08:39:16.468396 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 20 08:39:16.492978 master-0 kubenswrapper[7465]: I0320 08:39:16.492882 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hqqrk" podStartSLOduration=3.857567433 podStartE2EDuration="5.492850085s" podCreationTimestamp="2026-03-20 08:39:11 +0000 UTC" firstStartedPulling="2026-03-20 08:39:14.351881796 +0000 UTC m=+179.995197296" lastFinishedPulling="2026-03-20 08:39:15.987164458 +0000 UTC m=+181.630479948" observedRunningTime="2026-03-20 08:39:16.485340456 +0000 UTC m=+182.128655946" watchObservedRunningTime="2026-03-20 08:39:16.492850085 +0000 UTC m=+182.136165585" Mar 20 08:39:16.531010 master-0 kubenswrapper[7465]: I0320 08:39:16.529555 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-5c9796789-tjm9l_7b489385-2c96-4a97-8b31-362162de020e/olm-operator/0.log" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544126 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1468ec0-2aa4-461c-a62f-e9f067be490f-metrics-client-ca\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 
08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544216 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-sys\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544252 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544276 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prcgg\" (UniqueName: \"kubernetes.io/projected/a25248c0-8de7-4624-b785-f053665fcb23-kube-api-access-prcgg\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544310 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1db4d695-5a6a-4fbe-b610-3777bfebed79-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544347 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whmmk\" (UniqueName: 
\"kubernetes.io/projected/1db4d695-5a6a-4fbe-b610-3777bfebed79-kube-api-access-whmmk\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544380 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544400 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-root\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544422 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-wtmp\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544444 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqmv5\" (UniqueName: \"kubernetes.io/projected/f1468ec0-2aa4-461c-a62f-e9f067be490f-kube-api-access-bqmv5\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.544555 master-0 
kubenswrapper[7465]: I0320 08:39:16.544466 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544492 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a25248c0-8de7-4624-b785-f053665fcb23-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544520 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-textfile\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544545 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544577 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:39:16.544555 master-0 kubenswrapper[7465]: I0320 08:39:16.544599 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-tls\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.545294 master-0 kubenswrapper[7465]: I0320 08:39:16.544626 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.545294 master-0 kubenswrapper[7465]: I0320 08:39:16.544666 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.545890 master-0 kubenswrapper[7465]: I0320 08:39:16.545778 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1db4d695-5a6a-4fbe-b610-3777bfebed79-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: 
\"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:39:16.553256 master-0 kubenswrapper[7465]: I0320 08:39:16.550082 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:39:16.553878 master-0 kubenswrapper[7465]: I0320 08:39:16.553836 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:39:16.569027 master-0 kubenswrapper[7465]: I0320 08:39:16.568472 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whmmk\" (UniqueName: \"kubernetes.io/projected/1db4d695-5a6a-4fbe-b610-3777bfebed79-kube-api-access-whmmk\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:39:16.645842 master-0 kubenswrapper[7465]: I0320 08:39:16.645758 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.645842 master-0 kubenswrapper[7465]: I0320 
08:39:16.645836 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-root\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.645842 master-0 kubenswrapper[7465]: I0320 08:39:16.645859 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-wtmp\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.646179 master-0 kubenswrapper[7465]: I0320 08:39:16.645883 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqmv5\" (UniqueName: \"kubernetes.io/projected/f1468ec0-2aa4-461c-a62f-e9f067be490f-kube-api-access-bqmv5\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.646179 master-0 kubenswrapper[7465]: I0320 08:39:16.645910 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.646179 master-0 kubenswrapper[7465]: I0320 08:39:16.645935 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a25248c0-8de7-4624-b785-f053665fcb23-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " 
pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.646179 master-0 kubenswrapper[7465]: I0320 08:39:16.645962 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-textfile\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.646179 master-0 kubenswrapper[7465]: I0320 08:39:16.645984 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-tls\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.646179 master-0 kubenswrapper[7465]: I0320 08:39:16.645999 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.646179 master-0 kubenswrapper[7465]: I0320 08:39:16.646030 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.646179 master-0 kubenswrapper[7465]: I0320 08:39:16.646075 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f1468ec0-2aa4-461c-a62f-e9f067be490f-metrics-client-ca\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.646179 master-0 kubenswrapper[7465]: I0320 08:39:16.646101 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-sys\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.646179 master-0 kubenswrapper[7465]: I0320 08:39:16.646124 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.646179 master-0 kubenswrapper[7465]: I0320 08:39:16.646149 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prcgg\" (UniqueName: \"kubernetes.io/projected/a25248c0-8de7-4624-b785-f053665fcb23-kube-api-access-prcgg\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.646989 master-0 kubenswrapper[7465]: I0320 08:39:16.646940 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a25248c0-8de7-4624-b785-f053665fcb23-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.647684 master-0 kubenswrapper[7465]: I0320 
08:39:16.647634 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:39:16.647744 master-0 kubenswrapper[7465]: I0320 08:39:16.647669 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1468ec0-2aa4-461c-a62f-e9f067be490f-metrics-client-ca\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.647860 master-0 kubenswrapper[7465]: I0320 08:39:16.647832 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-root\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.648024 master-0 kubenswrapper[7465]: I0320 08:39:16.647999 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-wtmp\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.648346 master-0 kubenswrapper[7465]: I0320 08:39:16.648317 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-sys\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:39:16.648626 master-0 kubenswrapper[7465]: 
I0320 08:39:16.648598 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"
Mar 20 08:39:16.648626 master-0 kubenswrapper[7465]: I0320 08:39:16.648617 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-textfile\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5"
Mar 20 08:39:16.650146 master-0 kubenswrapper[7465]: I0320 08:39:16.650113 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"
Mar 20 08:39:16.652304 master-0 kubenswrapper[7465]: I0320 08:39:16.652267 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5"
Mar 20 08:39:16.653176 master-0 kubenswrapper[7465]: I0320 08:39:16.653087 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"
Mar 20 08:39:16.653969 master-0 kubenswrapper[7465]: I0320 08:39:16.653907 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-tls\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5"
Mar 20 08:39:16.672667 master-0 kubenswrapper[7465]: I0320 08:39:16.672597 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv"
Mar 20 08:39:16.674210 master-0 kubenswrapper[7465]: I0320 08:39:16.674137 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqmv5\" (UniqueName: \"kubernetes.io/projected/f1468ec0-2aa4-461c-a62f-e9f067be490f-kube-api-access-bqmv5\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5"
Mar 20 08:39:16.677552 master-0 kubenswrapper[7465]: I0320 08:39:16.677527 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prcgg\" (UniqueName: \"kubernetes.io/projected/a25248c0-8de7-4624-b785-f053665fcb23-kube-api-access-prcgg\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"
Mar 20 08:39:16.686652 master-0 kubenswrapper[7465]: I0320 08:39:16.686614 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-2pg77_bbc0b783-28d5-4554-b49d-c66082546f44/kube-rbac-proxy/0.log"
Mar 20 08:39:16.787535 master-0 kubenswrapper[7465]: I0320 08:39:16.779390 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"
Mar 20 08:39:16.824286 master-0 kubenswrapper[7465]: I0320 08:39:16.824236 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lb4t5"
Mar 20 08:39:16.851348 master-0 kubenswrapper[7465]: W0320 08:39:16.851278 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1468ec0_2aa4_461c_a62f_e9f067be490f.slice/crio-89ef9ca9615c3facb086051e9be1597bf7ab0b883ad4dc7992533c4b3196f4e1 WatchSource:0}: Error finding container 89ef9ca9615c3facb086051e9be1597bf7ab0b883ad4dc7992533c4b3196f4e1: Status 404 returned error can't find the container with id 89ef9ca9615c3facb086051e9be1597bf7ab0b883ad4dc7992533c4b3196f4e1
Mar 20 08:39:16.891094 master-0 kubenswrapper[7465]: I0320 08:39:16.891054 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-2pg77_bbc0b783-28d5-4554-b49d-c66082546f44/package-server-manager/0.log"
Mar 20 08:39:16.978006 master-0 kubenswrapper[7465]: I0320 08:39:16.977816 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:39:16.978006 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:39:16.978006 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:39:16.978006 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:39:16.978006 master-0 kubenswrapper[7465]: I0320 08:39:16.977958 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:39:17.089105 master-0 kubenswrapper[7465]: I0320 08:39:17.088748 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_packageserver-6c85f64bb9-fmpsg_b543f82e-683d-47c1-af73-4dcede4cf4df/packageserver/0.log"
Mar 20 08:39:17.106276 master-0 kubenswrapper[7465]: I0320 08:39:17.106221 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv"]
Mar 20 08:39:17.259999 master-0 kubenswrapper[7465]: I0320 08:39:17.259943 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"]
Mar 20 08:39:17.267249 master-0 kubenswrapper[7465]: W0320 08:39:17.267193 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda25248c0_8de7_4624_b785_f053665fcb23.slice/crio-ddcba86a9171baf183956cb4886e6b2242be15b32412346569edbcdcef731dfb WatchSource:0}: Error finding container ddcba86a9171baf183956cb4886e6b2242be15b32412346569edbcdcef731dfb: Status 404 returned error can't find the container with id ddcba86a9171baf183956cb4886e6b2242be15b32412346569edbcdcef731dfb
Mar 20 08:39:17.406743 master-0 kubenswrapper[7465]: I0320 08:39:17.405848 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jstrn" event={"ID":"c593e31d-82b5-4d42-992e-6b050ccf3019","Type":"ContainerStarted","Data":"f7a9dfc612086fe204a444301a032e9febcd36b7ff057eed4b49245b1a8cb51b"}
Mar 20 08:39:17.410260 master-0 kubenswrapper[7465]: I0320 08:39:17.409511 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" event={"ID":"a25248c0-8de7-4624-b785-f053665fcb23","Type":"ContainerStarted","Data":"ddcba86a9171baf183956cb4886e6b2242be15b32412346569edbcdcef731dfb"}
Mar 20 08:39:17.412152 master-0 kubenswrapper[7465]: I0320 08:39:17.412108 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lb4t5" event={"ID":"f1468ec0-2aa4-461c-a62f-e9f067be490f","Type":"ContainerStarted","Data":"89ef9ca9615c3facb086051e9be1597bf7ab0b883ad4dc7992533c4b3196f4e1"}
Mar 20 08:39:17.414667 master-0 kubenswrapper[7465]: I0320 08:39:17.414617 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" event={"ID":"1db4d695-5a6a-4fbe-b610-3777bfebed79","Type":"ContainerStarted","Data":"3e60c5cfa98299dd39cc95de18ed36bba9874f58f0e45ca3537df89924dd66d3"}
Mar 20 08:39:17.414667 master-0 kubenswrapper[7465]: I0320 08:39:17.414651 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" event={"ID":"1db4d695-5a6a-4fbe-b610-3777bfebed79","Type":"ContainerStarted","Data":"a07ae992a49295676f3184ce503f903e0b4447cd57b0d7e0c91d07d9a0f3bc30"}
Mar 20 08:39:17.446082 master-0 kubenswrapper[7465]: I0320 08:39:17.445975 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jstrn" podStartSLOduration=2.924778764 podStartE2EDuration="5.445943488s" podCreationTimestamp="2026-03-20 08:39:12 +0000 UTC" firstStartedPulling="2026-03-20 08:39:14.362909458 +0000 UTC m=+180.006224958" lastFinishedPulling="2026-03-20 08:39:16.884074202 +0000 UTC m=+182.527389682" observedRunningTime="2026-03-20 08:39:17.439533031 +0000 UTC m=+183.082848531" watchObservedRunningTime="2026-03-20 08:39:17.445943488 +0000 UTC m=+183.089258978"
Mar 20 08:39:17.978111 master-0 kubenswrapper[7465]: I0320 08:39:17.978035 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:39:17.978111 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:39:17.978111 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:39:17.978111 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:39:17.978111 master-0 kubenswrapper[7465]: I0320 08:39:17.978116 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:39:18.435686 master-0 kubenswrapper[7465]: I0320 08:39:18.435603 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" event={"ID":"1db4d695-5a6a-4fbe-b610-3777bfebed79","Type":"ContainerStarted","Data":"856f29ac023f36f98944c12ae603aac4a4e79b44143d8336c20eaae8f55415c9"}
Mar 20 08:39:18.978420 master-0 kubenswrapper[7465]: I0320 08:39:18.978125 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:39:18.978420 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:39:18.978420 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:39:18.978420 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:39:18.978420 master-0 kubenswrapper[7465]: I0320 08:39:18.978418 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:39:19.448813 master-0 kubenswrapper[7465]: I0320 08:39:19.448744 7465 generic.go:334] "Generic (PLEG): container finished" podID="f1468ec0-2aa4-461c-a62f-e9f067be490f" containerID="883d2d12dc7b471a5dda61efc08657fb43e4a9f74d94e048d8c741bca0b177ad" exitCode=0
Mar 20 08:39:19.449520 master-0 kubenswrapper[7465]: I0320 08:39:19.448810 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lb4t5" event={"ID":"f1468ec0-2aa4-461c-a62f-e9f067be490f","Type":"ContainerDied","Data":"883d2d12dc7b471a5dda61efc08657fb43e4a9f74d94e048d8c741bca0b177ad"}
Mar 20 08:39:19.568615 master-0 kubenswrapper[7465]: I0320 08:39:19.568546 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:19.569260 master-0 kubenswrapper[7465]: I0320 08:39:19.569219 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:19.609815 master-0 kubenswrapper[7465]: I0320 08:39:19.609767 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:19.882682 master-0 kubenswrapper[7465]: I0320 08:39:19.881721 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:19.882778 master-0 kubenswrapper[7465]: I0320 08:39:19.882661 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:19.947404 master-0 kubenswrapper[7465]: I0320 08:39:19.946970 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:19.978500 master-0 kubenswrapper[7465]: I0320 08:39:19.978456 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:39:19.978500 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:39:19.978500 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:39:19.978500 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:39:19.978668 master-0 kubenswrapper[7465]: I0320 08:39:19.978523 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:39:20.019859 master-0 kubenswrapper[7465]: I0320 08:39:20.019728 7465 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 20 08:39:20.019859 master-0 kubenswrapper[7465]: I0320 08:39:20.019837 7465 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 20 08:39:20.020288 master-0 kubenswrapper[7465]: E0320 08:39:20.020260 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller"
Mar 20 08:39:20.020288 master-0 kubenswrapper[7465]: I0320 08:39:20.020286 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller"
Mar 20 08:39:20.020288 master-0 kubenswrapper[7465]: E0320 08:39:20.020302 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager"
Mar 20 08:39:20.020462 master-0 kubenswrapper[7465]: I0320 08:39:20.020312 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager"
Mar 20 08:39:20.020462 master-0 kubenswrapper[7465]: E0320 08:39:20.020349 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager"
Mar 20 08:39:20.020462 master-0 kubenswrapper[7465]: I0320 08:39:20.020359 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager"
Mar 20 08:39:20.020582 master-0 kubenswrapper[7465]: I0320 08:39:20.020517 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager"
Mar 20 08:39:20.020582 master-0 kubenswrapper[7465]: I0320 08:39:20.020536 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager"
Mar 20 08:39:20.020582 master-0 kubenswrapper[7465]: I0320 08:39:20.020556 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager"
Mar 20 08:39:20.020582 master-0 kubenswrapper[7465]: I0320 08:39:20.020573 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller"
Mar 20 08:39:20.020766 master-0 kubenswrapper[7465]: E0320 08:39:20.020742 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager"
Mar 20 08:39:20.020766 master-0 kubenswrapper[7465]: I0320 08:39:20.020761 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager"
Mar 20 08:39:20.020873 master-0 kubenswrapper[7465]: I0320 08:39:20.020824 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" containerID="cri-o://9adcc83ca09a3e8a61346c1bb76c593566cc39bfca1852854fa89f14749366d6" gracePeriod=30
Mar 20 08:39:20.021053 master-0 kubenswrapper[7465]: I0320 08:39:20.021022 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" containerID="cri-o://8cdf7ffa9625537bd484b3cd72f3ca62a1fbd66303b800564461ec0e3e2735c7" gracePeriod=30
Mar 20 08:39:20.023345 master-0 kubenswrapper[7465]: I0320 08:39:20.022316 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:39:20.095154 master-0 kubenswrapper[7465]: I0320 08:39:20.095090 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 20 08:39:20.171404 master-0 kubenswrapper[7465]: I0320 08:39:20.171361 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:39:20.196618 master-0 kubenswrapper[7465]: I0320 08:39:20.196535 7465 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="6a4dad0a-55bd-4b34-bab2-ec0b5f326c08"
Mar 20 08:39:20.225825 master-0 kubenswrapper[7465]: I0320 08:39:20.225772 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2028761b8522f874dcebf13c4683d033\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:39:20.225971 master-0 kubenswrapper[7465]: I0320 08:39:20.225853 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2028761b8522f874dcebf13c4683d033\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:39:20.327356 master-0 kubenswrapper[7465]: I0320 08:39:20.327278 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") "
Mar 20 08:39:20.327633 master-0 kubenswrapper[7465]: I0320 08:39:20.327449 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:39:20.327633 master-0 kubenswrapper[7465]: I0320 08:39:20.327475 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") "
Mar 20 08:39:20.327633 master-0 kubenswrapper[7465]: I0320 08:39:20.327563 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config" (OuterVolumeSpecName: "config") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:39:20.327795 master-0 kubenswrapper[7465]: I0320 08:39:20.327645 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") "
Mar 20 08:39:20.327795 master-0 kubenswrapper[7465]: I0320 08:39:20.327697 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") "
Mar 20 08:39:20.327795 master-0 kubenswrapper[7465]: I0320 08:39:20.327754 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:39:20.327795 master-0 kubenswrapper[7465]: I0320 08:39:20.327793 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs" (OuterVolumeSpecName: "logs") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:39:20.327942 master-0 kubenswrapper[7465]: I0320 08:39:20.327761 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") "
Mar 20 08:39:20.327942 master-0 kubenswrapper[7465]: I0320 08:39:20.327779 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets" (OuterVolumeSpecName: "secrets") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:39:20.328055 master-0 kubenswrapper[7465]: I0320 08:39:20.328012 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2028761b8522f874dcebf13c4683d033\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:39:20.328104 master-0 kubenswrapper[7465]: I0320 08:39:20.328069 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2028761b8522f874dcebf13c4683d033\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:39:20.328171 master-0 kubenswrapper[7465]: I0320 08:39:20.328145 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2028761b8522f874dcebf13c4683d033\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:39:20.328332 master-0 kubenswrapper[7465]: I0320 08:39:20.328267 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2028761b8522f874dcebf13c4683d033\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:39:20.328332 master-0 kubenswrapper[7465]: I0320 08:39:20.328320 7465 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:20.328419 master-0 kubenswrapper[7465]: I0320 08:39:20.328360 7465 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:20.328419 master-0 kubenswrapper[7465]: I0320 08:39:20.328385 7465 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:20.328419 master-0 kubenswrapper[7465]: I0320 08:39:20.328405 7465 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:20.328530 master-0 kubenswrapper[7465]: I0320 08:39:20.328423 7465 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:20.389453 master-0 kubenswrapper[7465]: I0320 08:39:20.389384 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:39:20.458970 master-0 kubenswrapper[7465]: I0320 08:39:20.458907 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" event={"ID":"a25248c0-8de7-4624-b785-f053665fcb23","Type":"ContainerStarted","Data":"6e8c691b31da6df37623c08d39c5a3d5d1885fb91c071985b479d2eb81e7db7d"}
Mar 20 08:39:20.458970 master-0 kubenswrapper[7465]: I0320 08:39:20.458970 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" event={"ID":"a25248c0-8de7-4624-b785-f053665fcb23","Type":"ContainerStarted","Data":"663a1ee240ddbbb57df122e471c26a4e956062c21904538d83d0ecc72e0d36d2"}
Mar 20 08:39:20.458970 master-0 kubenswrapper[7465]: I0320 08:39:20.458984 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" event={"ID":"a25248c0-8de7-4624-b785-f053665fcb23","Type":"ContainerStarted","Data":"de2b21b57e0c31844cd5e9cb7d4c3aefcf1e8f73d137e744bfa1142beaa27fcb"}
Mar 20 08:39:20.464794 master-0 kubenswrapper[7465]: I0320 08:39:20.464754 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lb4t5" event={"ID":"f1468ec0-2aa4-461c-a62f-e9f067be490f","Type":"ContainerStarted","Data":"b0c63e9d7c1f9be7381bf4be717b03c8b5ca9ba05360c41198144679850f6e32"}
Mar 20 08:39:20.464918 master-0 kubenswrapper[7465]: I0320 08:39:20.464799 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lb4t5" event={"ID":"f1468ec0-2aa4-461c-a62f-e9f067be490f","Type":"ContainerStarted","Data":"8a0af0baf6c97cf8c67073408806404b1f7a015e994a90e5da1cb7cb116ae5cd"}
Mar 20 08:39:20.468856 master-0 kubenswrapper[7465]: I0320 08:39:20.468787 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" event={"ID":"1db4d695-5a6a-4fbe-b610-3777bfebed79","Type":"ContainerStarted","Data":"17b9005f6779cb1212a0b15002baf44d2f058535e99a27fb893b2250411c5f89"}
Mar 20 08:39:20.471395 master-0 kubenswrapper[7465]: I0320 08:39:20.471360 7465 generic.go:334] "Generic (PLEG): container finished" podID="8b1c7a56-5d00-468a-bb8d-dbaf8e854951" containerID="5b4a47b78349fa5185bcf45526d28c821dd34bc78966a86b575a5f0037835565" exitCode=0
Mar 20 08:39:20.471453 master-0 kubenswrapper[7465]: I0320 08:39:20.471418 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"8b1c7a56-5d00-468a-bb8d-dbaf8e854951","Type":"ContainerDied","Data":"5b4a47b78349fa5185bcf45526d28c821dd34bc78966a86b575a5f0037835565"}
Mar 20 08:39:20.475540 master-0 kubenswrapper[7465]: I0320 08:39:20.475494 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"760e18ae3700547257879a5def4bd8a3e6845f819368981331d76400fe97af9e"}
Mar 20 08:39:20.478401 master-0 kubenswrapper[7465]: I0320 08:39:20.478365 7465 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="8cdf7ffa9625537bd484b3cd72f3ca62a1fbd66303b800564461ec0e3e2735c7" exitCode=0
Mar 20 08:39:20.478401 master-0 kubenswrapper[7465]: I0320 08:39:20.478392 7465 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="9adcc83ca09a3e8a61346c1bb76c593566cc39bfca1852854fa89f14749366d6" exitCode=0
Mar 20 08:39:20.478550 master-0 kubenswrapper[7465]: I0320 08:39:20.478427 7465 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24e53548727bf62b2d0124119a6e8fe69dde1c3031b4a91a063f8dd58773fece"
Mar 20 08:39:20.478550 master-0 kubenswrapper[7465]: I0320 08:39:20.478423 7465 scope.go:117] "RemoveContainer" containerID="17402333a3400904f60b3f059728c9faa13bc6bb53c95e63d5e8325a42104e2f"
Mar 20 08:39:20.478550 master-0 kubenswrapper[7465]: I0320 08:39:20.478471 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 20 08:39:20.489726 master-0 kubenswrapper[7465]: I0320 08:39:20.489632 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" podStartSLOduration=2.198050467 podStartE2EDuration="4.48960541s" podCreationTimestamp="2026-03-20 08:39:16 +0000 UTC" firstStartedPulling="2026-03-20 08:39:17.270715945 +0000 UTC m=+182.914031425" lastFinishedPulling="2026-03-20 08:39:19.562270878 +0000 UTC m=+185.205586368" observedRunningTime="2026-03-20 08:39:20.485699456 +0000 UTC m=+186.129014966" watchObservedRunningTime="2026-03-20 08:39:20.48960541 +0000 UTC m=+186.132920900"
Mar 20 08:39:20.535937 master-0 kubenswrapper[7465]: I0320 08:39:20.535860 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:39:20.546033 master-0 kubenswrapper[7465]: I0320 08:39:20.545979 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f265536aba6292ead501bc9b49f327" path="/var/lib/kubelet/pods/46f265536aba6292ead501bc9b49f327/volumes"
Mar 20 08:39:20.546428 master-0 kubenswrapper[7465]: I0320 08:39:20.546404 7465 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID=""
Mar 20 08:39:20.740396 master-0 kubenswrapper[7465]: I0320 08:39:20.739958 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" podStartSLOduration=2.5912525520000003 podStartE2EDuration="4.739878384s" podCreationTimestamp="2026-03-20 08:39:16 +0000 UTC" firstStartedPulling="2026-03-20 08:39:17.427718017 +0000 UTC m=+183.071033507" lastFinishedPulling="2026-03-20 08:39:19.576343849 +0000 UTC m=+185.219659339" observedRunningTime="2026-03-20 08:39:20.737388811 +0000 UTC m=+186.380704321" watchObservedRunningTime="2026-03-20 08:39:20.739878384 +0000 UTC m=+186.383193874"
Mar 20 08:39:20.941568 master-0 kubenswrapper[7465]: I0320 08:39:20.941497 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:39:20.941568 master-0 kubenswrapper[7465]: I0320 08:39:20.941558 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 20 08:39:20.941568 master-0 kubenswrapper[7465]: I0320 08:39:20.941573 7465 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="6a4dad0a-55bd-4b34-bab2-ec0b5f326c08"
Mar 20 08:39:20.943703 master-0 kubenswrapper[7465]: I0320 08:39:20.943643 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 20 08:39:20.943778 master-0 kubenswrapper[7465]: I0320 08:39:20.943717 7465 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="6a4dad0a-55bd-4b34-bab2-ec0b5f326c08"
Mar 20 08:39:20.946155 master-0 kubenswrapper[7465]: I0320 08:39:20.946076 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lb4t5" podStartSLOduration=3.306766412 podStartE2EDuration="4.946060121s" podCreationTimestamp="2026-03-20 08:39:16 +0000 UTC" firstStartedPulling="2026-03-20 08:39:16.855247521 +0000 UTC m=+182.498563011" lastFinishedPulling="2026-03-20 08:39:18.49454123 +0000 UTC m=+184.137856720" observedRunningTime="2026-03-20 08:39:20.941498218 +0000 UTC m=+186.584813728" watchObservedRunningTime="2026-03-20 08:39:20.946060121 +0000 UTC m=+186.589375611"
Mar 20 08:39:20.978967 master-0 kubenswrapper[7465]: I0320 08:39:20.978896 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:39:20.978967 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:39:20.978967 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:39:20.978967 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:39:20.979359 master-0 kubenswrapper[7465]: I0320 08:39:20.978980 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:39:21.497134 master-0 kubenswrapper[7465]: I0320 08:39:21.497055 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"cfd1a328fad8a1fbf228c2c6250b1d586490a1a703c23535a3eb7dcf1bbe5867"}
Mar 20 08:39:21.497134 master-0 kubenswrapper[7465]: I0320 08:39:21.497130 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"df731fba572a5e906cf5649aa4cc14b16e25717d1ba7fd8e7dc02e7ea0aa85be"}
Mar 20 08:39:21.497753 master-0 kubenswrapper[7465]: I0320 08:39:21.497146 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"fe1c54a0e0873f0780c7faf59fe96cab185ee928520426321b3ddc92f25a0204"}
Mar 20 08:39:21.802679 master-0 kubenswrapper[7465]: I0320 08:39:21.802331 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 20 08:39:21.958912 master-0 kubenswrapper[7465]: I0320 08:39:21.958772 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-kube-api-access\") pod \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\" (UID: \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\") "
Mar 20 08:39:21.959247 master-0 kubenswrapper[7465]: I0320 08:39:21.959231 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-var-lock\") pod \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\" (UID: \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\") "
Mar 20 08:39:21.959390 master-0 kubenswrapper[7465]: I0320 08:39:21.959378 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-kubelet-dir\") pod \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\" (UID: \"8b1c7a56-5d00-468a-bb8d-dbaf8e854951\") "
Mar 20 08:39:21.959561 master-0 kubenswrapper[7465]: I0320 08:39:21.959384 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-var-lock" (OuterVolumeSpecName: "var-lock") pod "8b1c7a56-5d00-468a-bb8d-dbaf8e854951" (UID: "8b1c7a56-5d00-468a-bb8d-dbaf8e854951"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:39:21.959611 master-0 kubenswrapper[7465]: I0320 08:39:21.959447 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8b1c7a56-5d00-468a-bb8d-dbaf8e854951" (UID: "8b1c7a56-5d00-468a-bb8d-dbaf8e854951"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:39:21.959901 master-0 kubenswrapper[7465]: I0320 08:39:21.959880 7465 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:21.959980 master-0 kubenswrapper[7465]: I0320 08:39:21.959967 7465 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:39:21.962102 master-0 kubenswrapper[7465]: I0320 08:39:21.962027 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8b1c7a56-5d00-468a-bb8d-dbaf8e854951" (UID: "8b1c7a56-5d00-468a-bb8d-dbaf8e854951"). InnerVolumeSpecName "kube-api-access".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:39:21.978778 master-0 kubenswrapper[7465]: I0320 08:39:21.978718 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:21.978778 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:21.978778 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:21.978778 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:21.979001 master-0 kubenswrapper[7465]: I0320 08:39:21.978800 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:22.062132 master-0 kubenswrapper[7465]: I0320 08:39:22.062037 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b1c7a56-5d00-468a-bb8d-dbaf8e854951-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:22.340456 master-0 kubenswrapper[7465]: I0320 08:39:22.340357 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:22.340456 master-0 kubenswrapper[7465]: I0320 08:39:22.340455 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:22.386599 master-0 kubenswrapper[7465]: I0320 08:39:22.386532 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:22.504749 master-0 kubenswrapper[7465]: I0320 08:39:22.504579 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 20 08:39:22.504749 master-0 kubenswrapper[7465]: I0320 08:39:22.504584 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"8b1c7a56-5d00-468a-bb8d-dbaf8e854951","Type":"ContainerDied","Data":"ff84ae96bfd96fbd48f779854c556cddaa04ed8b8b78d40e920af9da64968681"} Mar 20 08:39:22.504749 master-0 kubenswrapper[7465]: I0320 08:39:22.504762 7465 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff84ae96bfd96fbd48f779854c556cddaa04ed8b8b78d40e920af9da64968681" Mar 20 08:39:22.507689 master-0 kubenswrapper[7465]: I0320 08:39:22.507636 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"665625ddaf4d7d5a13e6f9aa415e12a52677c52b3254cb6bcb690bbf3d2cdd27"} Mar 20 08:39:22.548825 master-0 kubenswrapper[7465]: I0320 08:39:22.548768 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:39:22.549051 master-0 kubenswrapper[7465]: I0320 08:39:22.548848 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:39:22.549051 master-0 kubenswrapper[7465]: I0320 08:39:22.548865 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:39:22.753227 master-0 kubenswrapper[7465]: I0320 08:39:22.753029 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.753007142 podStartE2EDuration="2.753007142s" podCreationTimestamp="2026-03-20 08:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:39:22.750807268 +0000 UTC m=+188.394122758" watchObservedRunningTime="2026-03-20 08:39:22.753007142 +0000 UTC m=+188.396322632" Mar 20 08:39:22.980283 master-0 kubenswrapper[7465]: I0320 08:39:22.980216 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:22.980283 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:22.980283 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:22.980283 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:22.980622 master-0 kubenswrapper[7465]: I0320 08:39:22.980312 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:23.590740 master-0 kubenswrapper[7465]: I0320 08:39:23.590543 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jstrn" podUID="c593e31d-82b5-4d42-992e-6b050ccf3019" containerName="registry-server" probeResult="failure" output=< Mar 20 08:39:23.590740 master-0 kubenswrapper[7465]: timeout: failed to connect service ":50051" within 1s Mar 20 08:39:23.590740 master-0 kubenswrapper[7465]: > Mar 20 08:39:23.977455 master-0 kubenswrapper[7465]: I0320 08:39:23.977274 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:23.977455 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:23.977455 
master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:23.977455 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:23.977455 master-0 kubenswrapper[7465]: I0320 08:39:23.977358 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:24.978092 master-0 kubenswrapper[7465]: I0320 08:39:24.977902 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:24.978092 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:24.978092 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:24.978092 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:24.978092 master-0 kubenswrapper[7465]: I0320 08:39:24.978042 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:25.979055 master-0 kubenswrapper[7465]: I0320 08:39:25.978569 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:25.979055 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:25.979055 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:25.979055 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:25.979055 master-0 kubenswrapper[7465]: I0320 08:39:25.978686 7465 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:26.978944 master-0 kubenswrapper[7465]: I0320 08:39:26.978859 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:26.978944 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:26.978944 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:26.978944 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:26.980821 master-0 kubenswrapper[7465]: I0320 08:39:26.980757 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:27.226177 master-0 kubenswrapper[7465]: I0320 08:39:27.226117 7465 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 20 08:39:27.226772 master-0 kubenswrapper[7465]: I0320 08:39:27.226741 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" containerID="cri-o://865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8" gracePeriod=30 Mar 20 08:39:27.235331 master-0 kubenswrapper[7465]: I0320 08:39:27.235231 7465 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 20 08:39:27.235734 master-0 kubenswrapper[7465]: E0320 08:39:27.235706 7465 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 20 08:39:27.235734 master-0 kubenswrapper[7465]: I0320 08:39:27.235734 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 20 08:39:27.235811 master-0 kubenswrapper[7465]: E0320 08:39:27.235756 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b1c7a56-5d00-468a-bb8d-dbaf8e854951" containerName="installer" Mar 20 08:39:27.235811 master-0 kubenswrapper[7465]: I0320 08:39:27.235767 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1c7a56-5d00-468a-bb8d-dbaf8e854951" containerName="installer" Mar 20 08:39:27.235811 master-0 kubenswrapper[7465]: E0320 08:39:27.235777 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 20 08:39:27.235811 master-0 kubenswrapper[7465]: I0320 08:39:27.235785 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 20 08:39:27.236011 master-0 kubenswrapper[7465]: I0320 08:39:27.235990 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b1c7a56-5d00-468a-bb8d-dbaf8e854951" containerName="installer" Mar 20 08:39:27.236052 master-0 kubenswrapper[7465]: I0320 08:39:27.236012 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 20 08:39:27.236052 master-0 kubenswrapper[7465]: I0320 08:39:27.236023 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" Mar 20 08:39:27.237636 master-0 kubenswrapper[7465]: I0320 08:39:27.237603 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:39:27.373308 master-0 kubenswrapper[7465]: I0320 08:39:27.373180 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/e3c8b9da1cd5cef8ca0690a6bbf5a601-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"e3c8b9da1cd5cef8ca0690a6bbf5a601\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:39:27.373624 master-0 kubenswrapper[7465]: I0320 08:39:27.373457 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/e3c8b9da1cd5cef8ca0690a6bbf5a601-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"e3c8b9da1cd5cef8ca0690a6bbf5a601\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:39:27.475513 master-0 kubenswrapper[7465]: I0320 08:39:27.475345 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/e3c8b9da1cd5cef8ca0690a6bbf5a601-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"e3c8b9da1cd5cef8ca0690a6bbf5a601\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:39:27.475920 master-0 kubenswrapper[7465]: I0320 08:39:27.475541 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/e3c8b9da1cd5cef8ca0690a6bbf5a601-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"e3c8b9da1cd5cef8ca0690a6bbf5a601\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:39:27.475920 master-0 kubenswrapper[7465]: I0320 08:39:27.475639 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/e3c8b9da1cd5cef8ca0690a6bbf5a601-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"e3c8b9da1cd5cef8ca0690a6bbf5a601\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:39:27.475920 master-0 kubenswrapper[7465]: I0320 08:39:27.475542 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/e3c8b9da1cd5cef8ca0690a6bbf5a601-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"e3c8b9da1cd5cef8ca0690a6bbf5a601\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:39:27.977518 master-0 kubenswrapper[7465]: I0320 08:39:27.977437 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:27.977518 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:27.977518 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:27.977518 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:27.977518 master-0 kubenswrapper[7465]: I0320 08:39:27.977514 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:28.131044 master-0 kubenswrapper[7465]: I0320 08:39:28.130964 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:39:28.142120 master-0 kubenswrapper[7465]: I0320 08:39:28.142018 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 20 08:39:28.160577 master-0 kubenswrapper[7465]: W0320 08:39:28.160516 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3c8b9da1cd5cef8ca0690a6bbf5a601.slice/crio-854c2e8507c2801d7ffa3834f81534fd70cc4fe64f92e8f44b54916d68349b2e WatchSource:0}: Error finding container 854c2e8507c2801d7ffa3834f81534fd70cc4fe64f92e8f44b54916d68349b2e: Status 404 returned error can't find the container with id 854c2e8507c2801d7ffa3834f81534fd70cc4fe64f92e8f44b54916d68349b2e Mar 20 08:39:28.479429 master-0 kubenswrapper[7465]: I0320 08:39:28.479397 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 20 08:39:28.552576 master-0 kubenswrapper[7465]: I0320 08:39:28.552429 7465 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Mar 20 08:39:28.568420 master-0 kubenswrapper[7465]: I0320 08:39:28.568348 7465 generic.go:334] "Generic (PLEG): container finished" podID="fac672fa-7660-449e-a0d1-244dc6282d76" containerID="aecbf33029725426faa2806ba773a548665753d84d9ec4f0ac83ae36cdffa3ce" exitCode=0 Mar 20 08:39:28.572140 master-0 kubenswrapper[7465]: I0320 08:39:28.571893 7465 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8" exitCode=0 Mar 20 08:39:28.572140 master-0 kubenswrapper[7465]: I0320 08:39:28.572019 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 20 08:39:28.573438 master-0 kubenswrapper[7465]: I0320 08:39:28.573387 7465 status_manager.go:875] "Failed to update status for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c56eead-effb-4886-902a-d6118236b54b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:39:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:39:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-scheduler]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:39:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-scheduler]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T08:37:50Z\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T08:33:58Z\\\"}},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:39:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:37:51Z\\\"}}}]}}\" for pod \"kube-system\"/\"bootstrap-kube-scheduler-master-0\": pods \"bootstrap-kube-scheduler-master-0\" not found" Mar 20 08:39:28.574540 master-0 kubenswrapper[7465]: I0320 08:39:28.574473 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"e3c8b9da1cd5cef8ca0690a6bbf5a601","Type":"ContainerStarted","Data":"854c2e8507c2801d7ffa3834f81534fd70cc4fe64f92e8f44b54916d68349b2e"} Mar 20 08:39:28.574540 master-0 kubenswrapper[7465]: I0320 08:39:28.574530 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" 
event={"ID":"fac672fa-7660-449e-a0d1-244dc6282d76","Type":"ContainerDied","Data":"aecbf33029725426faa2806ba773a548665753d84d9ec4f0ac83ae36cdffa3ce"} Mar 20 08:39:28.574672 master-0 kubenswrapper[7465]: I0320 08:39:28.574566 7465 scope.go:117] "RemoveContainer" containerID="865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8" Mar 20 08:39:28.575252 master-0 kubenswrapper[7465]: I0320 08:39:28.575230 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 20 08:39:28.575252 master-0 kubenswrapper[7465]: I0320 08:39:28.575249 7465 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="9c56eead-effb-4886-902a-d6118236b54b" Mar 20 08:39:28.579310 master-0 kubenswrapper[7465]: I0320 08:39:28.579243 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 20 08:39:28.579310 master-0 kubenswrapper[7465]: I0320 08:39:28.579300 7465 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="9c56eead-effb-4886-902a-d6118236b54b" Mar 20 08:39:28.600756 master-0 kubenswrapper[7465]: I0320 08:39:28.600693 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"c83737980b9ee109184b1d78e942cf36\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " Mar 20 08:39:28.600756 master-0 kubenswrapper[7465]: I0320 08:39:28.600753 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"c83737980b9ee109184b1d78e942cf36\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " Mar 20 08:39:28.601122 master-0 kubenswrapper[7465]: I0320 08:39:28.600831 7465 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets" (OuterVolumeSpecName: "secrets") pod "c83737980b9ee109184b1d78e942cf36" (UID: "c83737980b9ee109184b1d78e942cf36"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:39:28.601122 master-0 kubenswrapper[7465]: I0320 08:39:28.600909 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs" (OuterVolumeSpecName: "logs") pod "c83737980b9ee109184b1d78e942cf36" (UID: "c83737980b9ee109184b1d78e942cf36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:39:28.601655 master-0 kubenswrapper[7465]: I0320 08:39:28.601621 7465 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:28.601655 master-0 kubenswrapper[7465]: I0320 08:39:28.601645 7465 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:28.612316 master-0 kubenswrapper[7465]: I0320 08:39:28.612271 7465 scope.go:117] "RemoveContainer" containerID="6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb" Mar 20 08:39:28.679320 master-0 kubenswrapper[7465]: I0320 08:39:28.679234 7465 scope.go:117] "RemoveContainer" containerID="865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8" Mar 20 08:39:28.680160 master-0 kubenswrapper[7465]: E0320 08:39:28.680086 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8\": container with ID starting with 
865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8 not found: ID does not exist" containerID="865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8" Mar 20 08:39:28.680237 master-0 kubenswrapper[7465]: I0320 08:39:28.680172 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8"} err="failed to get container status \"865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8\": rpc error: code = NotFound desc = could not find container \"865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8\": container with ID starting with 865ba621afdb76437d315fccb4bc81df59adddaddae54ec22d99130fa629dce8 not found: ID does not exist" Mar 20 08:39:28.680237 master-0 kubenswrapper[7465]: I0320 08:39:28.680229 7465 scope.go:117] "RemoveContainer" containerID="6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb" Mar 20 08:39:28.680798 master-0 kubenswrapper[7465]: E0320 08:39:28.680752 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb\": container with ID starting with 6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb not found: ID does not exist" containerID="6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb" Mar 20 08:39:28.680867 master-0 kubenswrapper[7465]: I0320 08:39:28.680803 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb"} err="failed to get container status \"6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb\": rpc error: code = NotFound desc = could not find container \"6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb\": container with ID starting with 
6369057ac22c507cf87d187a9b3dae20c7b88e730a831d4ea4937e43b3fed7fb not found: ID does not exist" Mar 20 08:39:28.978574 master-0 kubenswrapper[7465]: I0320 08:39:28.978481 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:28.978574 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:28.978574 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:28.978574 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:28.979155 master-0 kubenswrapper[7465]: I0320 08:39:28.978592 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:29.582483 master-0 kubenswrapper[7465]: I0320 08:39:29.582427 7465 generic.go:334] "Generic (PLEG): container finished" podID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerID="b43c5f4dbc5493b32b9934371a1875a8e1d7c69940c30587bfa291adee73b603" exitCode=0 Mar 20 08:39:29.585640 master-0 kubenswrapper[7465]: I0320 08:39:29.582511 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"e3c8b9da1cd5cef8ca0690a6bbf5a601","Type":"ContainerDied","Data":"b43c5f4dbc5493b32b9934371a1875a8e1d7c69940c30587bfa291adee73b603"} Mar 20 08:39:29.604959 master-0 kubenswrapper[7465]: I0320 08:39:29.603630 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 20 08:39:29.606276 master-0 kubenswrapper[7465]: I0320 08:39:29.605761 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 20 08:39:29.609532 master-0 kubenswrapper[7465]: I0320 08:39:29.608903 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-k4ktd" Mar 20 08:39:29.617399 master-0 kubenswrapper[7465]: I0320 08:39:29.617010 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 08:39:29.618402 master-0 kubenswrapper[7465]: I0320 08:39:29.618327 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/691bb06b-5a2c-4740-8120-ed99a5555c86-kube-api-access\") pod \"installer-2-master-0\" (UID: \"691bb06b-5a2c-4740-8120-ed99a5555c86\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 20 08:39:29.618522 master-0 kubenswrapper[7465]: I0320 08:39:29.618488 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/691bb06b-5a2c-4740-8120-ed99a5555c86-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"691bb06b-5a2c-4740-8120-ed99a5555c86\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 20 08:39:29.618600 master-0 kubenswrapper[7465]: I0320 08:39:29.618586 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/691bb06b-5a2c-4740-8120-ed99a5555c86-var-lock\") pod \"installer-2-master-0\" (UID: \"691bb06b-5a2c-4740-8120-ed99a5555c86\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 20 08:39:29.629601 master-0 kubenswrapper[7465]: I0320 08:39:29.629330 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 20 08:39:29.719506 master-0 kubenswrapper[7465]: I0320 08:39:29.719429 7465 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/691bb06b-5a2c-4740-8120-ed99a5555c86-kube-api-access\") pod \"installer-2-master-0\" (UID: \"691bb06b-5a2c-4740-8120-ed99a5555c86\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 20 08:39:29.719723 master-0 kubenswrapper[7465]: I0320 08:39:29.719561 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/691bb06b-5a2c-4740-8120-ed99a5555c86-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"691bb06b-5a2c-4740-8120-ed99a5555c86\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 20 08:39:29.719723 master-0 kubenswrapper[7465]: I0320 08:39:29.719606 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/691bb06b-5a2c-4740-8120-ed99a5555c86-var-lock\") pod \"installer-2-master-0\" (UID: \"691bb06b-5a2c-4740-8120-ed99a5555c86\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 20 08:39:29.719723 master-0 kubenswrapper[7465]: I0320 08:39:29.719689 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/691bb06b-5a2c-4740-8120-ed99a5555c86-var-lock\") pod \"installer-2-master-0\" (UID: \"691bb06b-5a2c-4740-8120-ed99a5555c86\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 20 08:39:29.720126 master-0 kubenswrapper[7465]: I0320 08:39:29.720088 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/691bb06b-5a2c-4740-8120-ed99a5555c86-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"691bb06b-5a2c-4740-8120-ed99a5555c86\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 20 08:39:29.742570 master-0 kubenswrapper[7465]: I0320 08:39:29.742508 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/691bb06b-5a2c-4740-8120-ed99a5555c86-kube-api-access\") pod \"installer-2-master-0\" (UID: \"691bb06b-5a2c-4740-8120-ed99a5555c86\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 20 08:39:29.944091 master-0 kubenswrapper[7465]: I0320 08:39:29.944029 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:39:29.977346 master-0 kubenswrapper[7465]: I0320 08:39:29.977279 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 20 08:39:29.977730 master-0 kubenswrapper[7465]: I0320 08:39:29.977436 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:29.977730 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:29.977730 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:29.977730 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:29.977730 master-0 kubenswrapper[7465]: I0320 08:39:29.977471 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:30.129213 master-0 kubenswrapper[7465]: I0320 08:39:30.129109 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fac672fa-7660-449e-a0d1-244dc6282d76-var-lock\") pod \"fac672fa-7660-449e-a0d1-244dc6282d76\" (UID: \"fac672fa-7660-449e-a0d1-244dc6282d76\") " Mar 20 08:39:30.129367 master-0 kubenswrapper[7465]: I0320 08:39:30.129242 7465 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fac672fa-7660-449e-a0d1-244dc6282d76-kubelet-dir\") pod \"fac672fa-7660-449e-a0d1-244dc6282d76\" (UID: \"fac672fa-7660-449e-a0d1-244dc6282d76\") " Mar 20 08:39:30.129502 master-0 kubenswrapper[7465]: I0320 08:39:30.129427 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fac672fa-7660-449e-a0d1-244dc6282d76-kube-api-access\") pod \"fac672fa-7660-449e-a0d1-244dc6282d76\" (UID: \"fac672fa-7660-449e-a0d1-244dc6282d76\") " Mar 20 08:39:30.129502 master-0 kubenswrapper[7465]: I0320 08:39:30.129462 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac672fa-7660-449e-a0d1-244dc6282d76-var-lock" (OuterVolumeSpecName: "var-lock") pod "fac672fa-7660-449e-a0d1-244dc6282d76" (UID: "fac672fa-7660-449e-a0d1-244dc6282d76"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:39:30.129616 master-0 kubenswrapper[7465]: I0320 08:39:30.129534 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fac672fa-7660-449e-a0d1-244dc6282d76-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fac672fa-7660-449e-a0d1-244dc6282d76" (UID: "fac672fa-7660-449e-a0d1-244dc6282d76"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:39:30.130200 master-0 kubenswrapper[7465]: I0320 08:39:30.129946 7465 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fac672fa-7660-449e-a0d1-244dc6282d76-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:30.130200 master-0 kubenswrapper[7465]: I0320 08:39:30.129985 7465 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fac672fa-7660-449e-a0d1-244dc6282d76-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:30.134380 master-0 kubenswrapper[7465]: I0320 08:39:30.134066 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac672fa-7660-449e-a0d1-244dc6282d76-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fac672fa-7660-449e-a0d1-244dc6282d76" (UID: "fac672fa-7660-449e-a0d1-244dc6282d76"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:39:30.233713 master-0 kubenswrapper[7465]: I0320 08:39:30.233313 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fac672fa-7660-449e-a0d1-244dc6282d76-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 20 08:39:30.392679 master-0 kubenswrapper[7465]: I0320 08:39:30.392606 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:39:30.393548 master-0 kubenswrapper[7465]: I0320 08:39:30.393503 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:39:30.393641 master-0 kubenswrapper[7465]: I0320 08:39:30.393553 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:39:30.393641 master-0 kubenswrapper[7465]: I0320 08:39:30.393605 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:39:30.398298 master-0 kubenswrapper[7465]: I0320 08:39:30.398224 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:39:30.401538 master-0 kubenswrapper[7465]: I0320 08:39:30.401479 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:39:30.423726 master-0 kubenswrapper[7465]: I0320 08:39:30.423654 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 20 08:39:30.433698 master-0 kubenswrapper[7465]: W0320 08:39:30.433602 7465 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod691bb06b_5a2c_4740_8120_ed99a5555c86.slice/crio-66c547d67e9a85e6414707cd68db7045d29e9e0bee76db900e4800136f7a7875 WatchSource:0}: Error finding container 66c547d67e9a85e6414707cd68db7045d29e9e0bee76db900e4800136f7a7875: Status 404 returned error can't find the container with id 66c547d67e9a85e6414707cd68db7045d29e9e0bee76db900e4800136f7a7875 Mar 20 08:39:30.550764 master-0 kubenswrapper[7465]: I0320 08:39:30.550660 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83737980b9ee109184b1d78e942cf36" path="/var/lib/kubelet/pods/c83737980b9ee109184b1d78e942cf36/volumes" Mar 20 08:39:30.604162 master-0 kubenswrapper[7465]: I0320 08:39:30.604007 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"fac672fa-7660-449e-a0d1-244dc6282d76","Type":"ContainerDied","Data":"49b715d08715612464f503c3f66bf7c99b13a5e872383e023e27eea30084adb2"} Mar 20 08:39:30.604162 master-0 kubenswrapper[7465]: I0320 08:39:30.604089 7465 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49b715d08715612464f503c3f66bf7c99b13a5e872383e023e27eea30084adb2" Mar 20 08:39:30.604162 master-0 kubenswrapper[7465]: I0320 08:39:30.604045 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:39:30.607364 master-0 kubenswrapper[7465]: I0320 08:39:30.607337 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"e3c8b9da1cd5cef8ca0690a6bbf5a601","Type":"ContainerStarted","Data":"73a7f9993a52ad274232661b06d25f3b18e0675faed4b301aeb4072dcc7cfa79"} Mar 20 08:39:30.607430 master-0 kubenswrapper[7465]: I0320 08:39:30.607365 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"e3c8b9da1cd5cef8ca0690a6bbf5a601","Type":"ContainerStarted","Data":"230d37232882904f1764e96ed6057bf568baed29ff892b892e215ce87e945710"} Mar 20 08:39:30.607430 master-0 kubenswrapper[7465]: I0320 08:39:30.607374 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"e3c8b9da1cd5cef8ca0690a6bbf5a601","Type":"ContainerStarted","Data":"6e31068727643e077f1c9461b5883b919e163a79d9088735e4c5d39688c47867"} Mar 20 08:39:30.607905 master-0 kubenswrapper[7465]: I0320 08:39:30.607794 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:39:30.610414 master-0 kubenswrapper[7465]: I0320 08:39:30.610131 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"691bb06b-5a2c-4740-8120-ed99a5555c86","Type":"ContainerStarted","Data":"66c547d67e9a85e6414707cd68db7045d29e9e0bee76db900e4800136f7a7875"} Mar 20 08:39:30.614665 master-0 kubenswrapper[7465]: I0320 08:39:30.614631 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:39:30.632399 master-0 kubenswrapper[7465]: I0320 08:39:30.632233 7465 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=3.632203999 podStartE2EDuration="3.632203999s" podCreationTimestamp="2026-03-20 08:39:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:39:30.628534151 +0000 UTC m=+196.271849661" watchObservedRunningTime="2026-03-20 08:39:30.632203999 +0000 UTC m=+196.275519509" Mar 20 08:39:30.978167 master-0 kubenswrapper[7465]: I0320 08:39:30.977768 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:30.978167 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:30.978167 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:30.978167 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:30.978167 master-0 kubenswrapper[7465]: I0320 08:39:30.977857 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:31.621066 master-0 kubenswrapper[7465]: I0320 08:39:31.620992 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"691bb06b-5a2c-4740-8120-ed99a5555c86","Type":"ContainerStarted","Data":"0c7be4808113e3bc2cf4d774c20218ea8cdd721e1ebcaa053a65aca874068e52"} Mar 20 08:39:31.628031 master-0 kubenswrapper[7465]: I0320 08:39:31.627972 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:39:31.651106 master-0 kubenswrapper[7465]: I0320 08:39:31.651001 7465 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=2.650974799 podStartE2EDuration="2.650974799s" podCreationTimestamp="2026-03-20 08:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:39:31.645965883 +0000 UTC m=+197.289281373" watchObservedRunningTime="2026-03-20 08:39:31.650974799 +0000 UTC m=+197.294290279" Mar 20 08:39:31.978828 master-0 kubenswrapper[7465]: I0320 08:39:31.978645 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:31.978828 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:31.978828 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:31.978828 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:31.978828 master-0 kubenswrapper[7465]: I0320 08:39:31.978773 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:32.595254 master-0 kubenswrapper[7465]: I0320 08:39:32.595154 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:39:32.657788 master-0 kubenswrapper[7465]: I0320 08:39:32.657267 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:39:32.978354 master-0 kubenswrapper[7465]: I0320 08:39:32.978096 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:32.978354 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:32.978354 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:32.978354 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:32.978354 master-0 kubenswrapper[7465]: I0320 08:39:32.978272 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:33.978129 master-0 kubenswrapper[7465]: I0320 08:39:33.978055 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:33.978129 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:33.978129 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:33.978129 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:33.978878 master-0 kubenswrapper[7465]: I0320 08:39:33.978162 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:34.977208 master-0 kubenswrapper[7465]: I0320 08:39:34.977042 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:34.977208 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 
20 08:39:34.977208 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:34.977208 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:34.977208 master-0 kubenswrapper[7465]: I0320 08:39:34.977127 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:35.993430 master-0 kubenswrapper[7465]: I0320 08:39:35.993312 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:35.993430 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:35.993430 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:35.993430 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:35.993430 master-0 kubenswrapper[7465]: I0320 08:39:35.993407 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:36.978859 master-0 kubenswrapper[7465]: I0320 08:39:36.978763 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:36.978859 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:36.978859 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:36.978859 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:36.979339 master-0 kubenswrapper[7465]: I0320 08:39:36.978882 
7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:37.977607 master-0 kubenswrapper[7465]: I0320 08:39:37.977504 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:37.977607 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:37.977607 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:39:37.977607 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:37.978421 master-0 kubenswrapper[7465]: I0320 08:39:37.977641 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:38.961231 master-0 kubenswrapper[7465]: I0320 08:39:38.960376 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-64c67d44c4-s7vfs"] Mar 20 08:39:38.961231 master-0 kubenswrapper[7465]: E0320 08:39:38.960760 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac672fa-7660-449e-a0d1-244dc6282d76" containerName="installer" Mar 20 08:39:38.961231 master-0 kubenswrapper[7465]: I0320 08:39:38.960775 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac672fa-7660-449e-a0d1-244dc6282d76" containerName="installer" Mar 20 08:39:38.961231 master-0 kubenswrapper[7465]: I0320 08:39:38.960921 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac672fa-7660-449e-a0d1-244dc6282d76" containerName="installer" Mar 20 08:39:38.963913 master-0 kubenswrapper[7465]: 
I0320 08:39:38.963875 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:38.966847 master-0 kubenswrapper[7465]: I0320 08:39:38.966800 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 20 08:39:38.966977 master-0 kubenswrapper[7465]: I0320 08:39:38.966945 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 20 08:39:38.967122 master-0 kubenswrapper[7465]: I0320 08:39:38.967096 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-2trhv" Mar 20 08:39:38.967160 master-0 kubenswrapper[7465]: I0320 08:39:38.967092 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 20 08:39:38.967289 master-0 kubenswrapper[7465]: I0320 08:39:38.967271 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 20 08:39:38.974455 master-0 kubenswrapper[7465]: I0320 08:39:38.970554 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-40c28rqu4fltf" Mar 20 08:39:38.979118 master-0 kubenswrapper[7465]: I0320 08:39:38.978889 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-64c67d44c4-s7vfs"] Mar 20 08:39:38.979924 master-0 kubenswrapper[7465]: I0320 08:39:38.979268 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:39:38.979924 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:39:38.979924 master-0 kubenswrapper[7465]: [+]process-running 
ok Mar 20 08:39:38.979924 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:39:38.979924 master-0 kubenswrapper[7465]: I0320 08:39:38.979358 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:39:38.979924 master-0 kubenswrapper[7465]: I0320 08:39:38.979435 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:39:38.980172 master-0 kubenswrapper[7465]: I0320 08:39:38.980087 7465 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"239c73fe7acb21fae05105f39e9db825bd1c976521db41e975ce454317c12c28"} pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" containerMessage="Container router failed startup probe, will be restarted" Mar 20 08:39:38.980172 master-0 kubenswrapper[7465]: I0320 08:39:38.980130 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" containerID="cri-o://239c73fe7acb21fae05105f39e9db825bd1c976521db41e975ce454317c12c28" gracePeriod=3600 Mar 20 08:39:39.076230 master-0 kubenswrapper[7465]: I0320 08:39:39.075561 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6p8v\" (UniqueName: \"kubernetes.io/projected/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-kube-api-access-c6p8v\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.076230 master-0 kubenswrapper[7465]: I0320 08:39:39.075634 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-server-tls\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.076230 master-0 kubenswrapper[7465]: I0320 08:39:39.075674 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-client-certs\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.076230 master-0 kubenswrapper[7465]: I0320 08:39:39.075722 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.076230 master-0 kubenswrapper[7465]: I0320 08:39:39.075758 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-metrics-server-audit-profiles\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.076230 master-0 kubenswrapper[7465]: I0320 08:39:39.075782 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-audit-log\") pod 
\"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.076230 master-0 kubenswrapper[7465]: I0320 08:39:39.075827 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-client-ca-bundle\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.177824 master-0 kubenswrapper[7465]: I0320 08:39:39.177730 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-client-certs\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.178125 master-0 kubenswrapper[7465]: I0320 08:39:39.177856 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.178125 master-0 kubenswrapper[7465]: I0320 08:39:39.177898 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-metrics-server-audit-profiles\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.178125 master-0 kubenswrapper[7465]: I0320 
08:39:39.177935 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-audit-log\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.178125 master-0 kubenswrapper[7465]: I0320 08:39:39.178009 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-client-ca-bundle\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.178125 master-0 kubenswrapper[7465]: I0320 08:39:39.178051 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6p8v\" (UniqueName: \"kubernetes.io/projected/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-kube-api-access-c6p8v\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.178125 master-0 kubenswrapper[7465]: I0320 08:39:39.178082 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-server-tls\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.181211 master-0 kubenswrapper[7465]: I0320 08:39:39.179396 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-audit-log\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " 
pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.181211 master-0 kubenswrapper[7465]: I0320 08:39:39.179484 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.181211 master-0 kubenswrapper[7465]: I0320 08:39:39.180885 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-metrics-server-audit-profiles\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.185574 master-0 kubenswrapper[7465]: I0320 08:39:39.182115 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-client-certs\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.187127 master-0 kubenswrapper[7465]: I0320 08:39:39.187074 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-server-tls\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.188216 master-0 kubenswrapper[7465]: I0320 08:39:39.187762 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-client-ca-bundle\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.225346 master-0 kubenswrapper[7465]: I0320 08:39:39.225057 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6p8v\" (UniqueName: \"kubernetes.io/projected/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-kube-api-access-c6p8v\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.301210 master-0 kubenswrapper[7465]: I0320 08:39:39.301056 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:39.852235 master-0 kubenswrapper[7465]: I0320 08:39:39.852136 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-64c67d44c4-s7vfs"] Mar 20 08:39:39.857560 master-0 kubenswrapper[7465]: W0320 08:39:39.857491 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda69e8d3a_a0b1_4688_8631_d9f265aa4c69.slice/crio-9786d6c6139bd8ce0f2d0b4a8dd76068f202fd2531fb86bd7b58ce95ed53f64f WatchSource:0}: Error finding container 9786d6c6139bd8ce0f2d0b4a8dd76068f202fd2531fb86bd7b58ce95ed53f64f: Status 404 returned error can't find the container with id 9786d6c6139bd8ce0f2d0b4a8dd76068f202fd2531fb86bd7b58ce95ed53f64f Mar 20 08:39:40.693990 master-0 kubenswrapper[7465]: I0320 08:39:40.693914 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" event={"ID":"a69e8d3a-a0b1-4688-8631-d9f265aa4c69","Type":"ContainerStarted","Data":"9786d6c6139bd8ce0f2d0b4a8dd76068f202fd2531fb86bd7b58ce95ed53f64f"} Mar 20 08:39:41.703293 master-0 kubenswrapper[7465]: 
I0320 08:39:41.703122 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" event={"ID":"a69e8d3a-a0b1-4688-8631-d9f265aa4c69","Type":"ContainerStarted","Data":"3fbc2ee15fb564cff30cd6bea616239c5ddcc5c59f79a881b3626f244e98915b"} Mar 20 08:39:41.725588 master-0 kubenswrapper[7465]: I0320 08:39:41.725504 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" podStartSLOduration=2.280507313 podStartE2EDuration="3.7254795s" podCreationTimestamp="2026-03-20 08:39:38 +0000 UTC" firstStartedPulling="2026-03-20 08:39:39.860778594 +0000 UTC m=+205.504094084" lastFinishedPulling="2026-03-20 08:39:41.305750771 +0000 UTC m=+206.949066271" observedRunningTime="2026-03-20 08:39:41.721809653 +0000 UTC m=+207.365125163" watchObservedRunningTime="2026-03-20 08:39:41.7254795 +0000 UTC m=+207.368794990" Mar 20 08:39:47.795918 master-0 kubenswrapper[7465]: I0320 08:39:47.795825 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 20 08:39:47.797350 master-0 kubenswrapper[7465]: I0320 08:39:47.797310 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 20 08:39:47.801573 master-0 kubenswrapper[7465]: I0320 08:39:47.801523 7465 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-2crwp" Mar 20 08:39:47.801667 master-0 kubenswrapper[7465]: I0320 08:39:47.801556 7465 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 20 08:39:47.809806 master-0 kubenswrapper[7465]: I0320 08:39:47.809744 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 20 08:39:47.947301 master-0 kubenswrapper[7465]: I0320 08:39:47.947218 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e219558-98b7-4528-88cf-97b87cd1eb6c-var-lock\") pod \"installer-3-master-0\" (UID: \"7e219558-98b7-4528-88cf-97b87cd1eb6c\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 20 08:39:47.947301 master-0 kubenswrapper[7465]: I0320 08:39:47.947277 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e219558-98b7-4528-88cf-97b87cd1eb6c-kube-api-access\") pod \"installer-3-master-0\" (UID: \"7e219558-98b7-4528-88cf-97b87cd1eb6c\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 20 08:39:47.947635 master-0 kubenswrapper[7465]: I0320 08:39:47.947330 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e219558-98b7-4528-88cf-97b87cd1eb6c-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"7e219558-98b7-4528-88cf-97b87cd1eb6c\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 20 08:39:48.049237 master-0 kubenswrapper[7465]: I0320 08:39:48.049071 7465 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e219558-98b7-4528-88cf-97b87cd1eb6c-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"7e219558-98b7-4528-88cf-97b87cd1eb6c\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 20 08:39:48.049486 master-0 kubenswrapper[7465]: I0320 08:39:48.049259 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e219558-98b7-4528-88cf-97b87cd1eb6c-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"7e219558-98b7-4528-88cf-97b87cd1eb6c\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 20 08:39:48.049486 master-0 kubenswrapper[7465]: I0320 08:39:48.049287 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e219558-98b7-4528-88cf-97b87cd1eb6c-var-lock\") pod \"installer-3-master-0\" (UID: \"7e219558-98b7-4528-88cf-97b87cd1eb6c\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 20 08:39:48.049486 master-0 kubenswrapper[7465]: I0320 08:39:48.049321 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e219558-98b7-4528-88cf-97b87cd1eb6c-kube-api-access\") pod \"installer-3-master-0\" (UID: \"7e219558-98b7-4528-88cf-97b87cd1eb6c\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 20 08:39:48.049486 master-0 kubenswrapper[7465]: I0320 08:39:48.049358 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e219558-98b7-4528-88cf-97b87cd1eb6c-var-lock\") pod \"installer-3-master-0\" (UID: \"7e219558-98b7-4528-88cf-97b87cd1eb6c\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 20 08:39:48.065396 master-0 kubenswrapper[7465]: I0320 08:39:48.065359 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e219558-98b7-4528-88cf-97b87cd1eb6c-kube-api-access\") pod \"installer-3-master-0\" (UID: \"7e219558-98b7-4528-88cf-97b87cd1eb6c\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 20 08:39:48.120219 master-0 kubenswrapper[7465]: I0320 08:39:48.120147 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 20 08:39:48.544486 master-0 kubenswrapper[7465]: W0320 08:39:48.544427 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7e219558_98b7_4528_88cf_97b87cd1eb6c.slice/crio-2d6d575a8acffb3103f6760ea1452c58c9caadc565e202f8ee52c999c161d223 WatchSource:0}: Error finding container 2d6d575a8acffb3103f6760ea1452c58c9caadc565e202f8ee52c999c161d223: Status 404 returned error can't find the container with id 2d6d575a8acffb3103f6760ea1452c58c9caadc565e202f8ee52c999c161d223 Mar 20 08:39:48.552837 master-0 kubenswrapper[7465]: I0320 08:39:48.552789 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 20 08:39:49.230956 master-0 kubenswrapper[7465]: I0320 08:39:49.230707 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"7e219558-98b7-4528-88cf-97b87cd1eb6c","Type":"ContainerStarted","Data":"7ea4527aa6e7513c4d82b891ac586f8993c11e7ba38c1d1a048fc3535809e191"} Mar 20 08:39:49.230956 master-0 kubenswrapper[7465]: I0320 08:39:49.230773 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"7e219558-98b7-4528-88cf-97b87cd1eb6c","Type":"ContainerStarted","Data":"2d6d575a8acffb3103f6760ea1452c58c9caadc565e202f8ee52c999c161d223"} Mar 20 08:39:49.256450 master-0 kubenswrapper[7465]: I0320 08:39:49.256342 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=2.256315291 podStartE2EDuration="2.256315291s" podCreationTimestamp="2026-03-20 08:39:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:39:49.249133871 +0000 UTC m=+214.892449361" watchObservedRunningTime="2026-03-20 08:39:49.256315291 +0000 UTC m=+214.899630791" Mar 20 08:39:52.794528 master-0 kubenswrapper[7465]: I0320 08:39:52.794437 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 20 08:39:52.795117 master-0 kubenswrapper[7465]: I0320 08:39:52.794781 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-2-master-0" podUID="691bb06b-5a2c-4740-8120-ed99a5555c86" containerName="installer" containerID="cri-o://0c7be4808113e3bc2cf4d774c20218ea8cdd721e1ebcaa053a65aca874068e52" gracePeriod=30 Mar 20 08:39:56.598851 master-0 kubenswrapper[7465]: I0320 08:39:56.598696 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 20 08:39:56.600405 master-0 kubenswrapper[7465]: I0320 08:39:56.600361 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:39:56.619915 master-0 kubenswrapper[7465]: I0320 08:39:56.619391 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 20 08:39:56.715053 master-0 kubenswrapper[7465]: I0320 08:39:56.714973 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-var-lock\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:39:56.715341 master-0 kubenswrapper[7465]: I0320 08:39:56.715145 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:39:56.715341 master-0 kubenswrapper[7465]: I0320 08:39:56.715316 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:39:56.817719 master-0 kubenswrapper[7465]: I0320 08:39:56.817629 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:39:56.818057 master-0 kubenswrapper[7465]: I0320 08:39:56.817771 7465 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:39:56.818057 master-0 kubenswrapper[7465]: I0320 08:39:56.817828 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-var-lock\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:39:56.818057 master-0 kubenswrapper[7465]: I0320 08:39:56.817955 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-var-lock\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:39:56.818494 master-0 kubenswrapper[7465]: I0320 08:39:56.818431 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:39:56.842476 master-0 kubenswrapper[7465]: I0320 08:39:56.842398 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:39:56.928141 master-0 kubenswrapper[7465]: I0320 08:39:56.927902 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:39:57.416329 master-0 kubenswrapper[7465]: I0320 08:39:57.416278 7465 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 20 08:39:58.299952 master-0 kubenswrapper[7465]: I0320 08:39:58.299599 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"d245e5b2-a30d-45c8-9b79-6e8096765c14","Type":"ContainerStarted","Data":"215282b36f0afcf690dcc7252b249b0c54821b459a2fe5a0ff25640fd36b6290"} Mar 20 08:39:58.299952 master-0 kubenswrapper[7465]: I0320 08:39:58.299667 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"d245e5b2-a30d-45c8-9b79-6e8096765c14","Type":"ContainerStarted","Data":"20814875b6be1da3d4e673bd4cab493f3904bdc8689013ab1822b3b670087e6a"} Mar 20 08:39:58.325733 master-0 kubenswrapper[7465]: I0320 08:39:58.325621 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=2.325588425 podStartE2EDuration="2.325588425s" podCreationTimestamp="2026-03-20 08:39:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:39:58.322869836 +0000 UTC m=+223.966185336" watchObservedRunningTime="2026-03-20 08:39:58.325588425 +0000 UTC m=+223.968903925" Mar 20 08:39:59.342206 master-0 kubenswrapper[7465]: I0320 08:39:59.301174 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:39:59.342928 master-0 kubenswrapper[7465]: I0320 08:39:59.342489 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:40:01.946828 master-0 kubenswrapper[7465]: I0320 
08:40:01.946759 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_691bb06b-5a2c-4740-8120-ed99a5555c86/installer/0.log" Mar 20 08:40:01.947457 master-0 kubenswrapper[7465]: I0320 08:40:01.946852 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 20 08:40:02.137630 master-0 kubenswrapper[7465]: I0320 08:40:02.137554 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/691bb06b-5a2c-4740-8120-ed99a5555c86-kube-api-access\") pod \"691bb06b-5a2c-4740-8120-ed99a5555c86\" (UID: \"691bb06b-5a2c-4740-8120-ed99a5555c86\") " Mar 20 08:40:02.137899 master-0 kubenswrapper[7465]: I0320 08:40:02.137774 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/691bb06b-5a2c-4740-8120-ed99a5555c86-kubelet-dir\") pod \"691bb06b-5a2c-4740-8120-ed99a5555c86\" (UID: \"691bb06b-5a2c-4740-8120-ed99a5555c86\") " Mar 20 08:40:02.137986 master-0 kubenswrapper[7465]: I0320 08:40:02.137924 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/691bb06b-5a2c-4740-8120-ed99a5555c86-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "691bb06b-5a2c-4740-8120-ed99a5555c86" (UID: "691bb06b-5a2c-4740-8120-ed99a5555c86"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:40:02.137986 master-0 kubenswrapper[7465]: I0320 08:40:02.137949 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/691bb06b-5a2c-4740-8120-ed99a5555c86-var-lock\") pod \"691bb06b-5a2c-4740-8120-ed99a5555c86\" (UID: \"691bb06b-5a2c-4740-8120-ed99a5555c86\") " Mar 20 08:40:02.138056 master-0 kubenswrapper[7465]: I0320 08:40:02.138009 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/691bb06b-5a2c-4740-8120-ed99a5555c86-var-lock" (OuterVolumeSpecName: "var-lock") pod "691bb06b-5a2c-4740-8120-ed99a5555c86" (UID: "691bb06b-5a2c-4740-8120-ed99a5555c86"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:40:02.138898 master-0 kubenswrapper[7465]: I0320 08:40:02.138841 7465 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/691bb06b-5a2c-4740-8120-ed99a5555c86-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 20 08:40:02.138898 master-0 kubenswrapper[7465]: I0320 08:40:02.138870 7465 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/691bb06b-5a2c-4740-8120-ed99a5555c86-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:40:02.141058 master-0 kubenswrapper[7465]: I0320 08:40:02.141018 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/691bb06b-5a2c-4740-8120-ed99a5555c86-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "691bb06b-5a2c-4740-8120-ed99a5555c86" (UID: "691bb06b-5a2c-4740-8120-ed99a5555c86"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:40:02.240502 master-0 kubenswrapper[7465]: I0320 08:40:02.240428 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/691bb06b-5a2c-4740-8120-ed99a5555c86-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 20 08:40:02.326911 master-0 kubenswrapper[7465]: I0320 08:40:02.326842 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_691bb06b-5a2c-4740-8120-ed99a5555c86/installer/0.log" Mar 20 08:40:02.327277 master-0 kubenswrapper[7465]: I0320 08:40:02.326933 7465 generic.go:334] "Generic (PLEG): container finished" podID="691bb06b-5a2c-4740-8120-ed99a5555c86" containerID="0c7be4808113e3bc2cf4d774c20218ea8cdd721e1ebcaa053a65aca874068e52" exitCode=1 Mar 20 08:40:02.327277 master-0 kubenswrapper[7465]: I0320 08:40:02.326997 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"691bb06b-5a2c-4740-8120-ed99a5555c86","Type":"ContainerDied","Data":"0c7be4808113e3bc2cf4d774c20218ea8cdd721e1ebcaa053a65aca874068e52"} Mar 20 08:40:02.327277 master-0 kubenswrapper[7465]: I0320 08:40:02.327040 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"691bb06b-5a2c-4740-8120-ed99a5555c86","Type":"ContainerDied","Data":"66c547d67e9a85e6414707cd68db7045d29e9e0bee76db900e4800136f7a7875"} Mar 20 08:40:02.327277 master-0 kubenswrapper[7465]: I0320 08:40:02.327053 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 20 08:40:02.327277 master-0 kubenswrapper[7465]: I0320 08:40:02.327082 7465 scope.go:117] "RemoveContainer" containerID="0c7be4808113e3bc2cf4d774c20218ea8cdd721e1ebcaa053a65aca874068e52" Mar 20 08:40:02.345522 master-0 kubenswrapper[7465]: I0320 08:40:02.345475 7465 scope.go:117] "RemoveContainer" containerID="0c7be4808113e3bc2cf4d774c20218ea8cdd721e1ebcaa053a65aca874068e52" Mar 20 08:40:02.346581 master-0 kubenswrapper[7465]: E0320 08:40:02.346541 7465 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c7be4808113e3bc2cf4d774c20218ea8cdd721e1ebcaa053a65aca874068e52\": container with ID starting with 0c7be4808113e3bc2cf4d774c20218ea8cdd721e1ebcaa053a65aca874068e52 not found: ID does not exist" containerID="0c7be4808113e3bc2cf4d774c20218ea8cdd721e1ebcaa053a65aca874068e52" Mar 20 08:40:02.346668 master-0 kubenswrapper[7465]: I0320 08:40:02.346611 7465 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c7be4808113e3bc2cf4d774c20218ea8cdd721e1ebcaa053a65aca874068e52"} err="failed to get container status \"0c7be4808113e3bc2cf4d774c20218ea8cdd721e1ebcaa053a65aca874068e52\": rpc error: code = NotFound desc = could not find container \"0c7be4808113e3bc2cf4d774c20218ea8cdd721e1ebcaa053a65aca874068e52\": container with ID starting with 0c7be4808113e3bc2cf4d774c20218ea8cdd721e1ebcaa053a65aca874068e52 not found: ID does not exist" Mar 20 08:40:02.381010 master-0 kubenswrapper[7465]: I0320 08:40:02.380704 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 20 08:40:02.386479 master-0 kubenswrapper[7465]: I0320 08:40:02.385873 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 20 08:40:02.551560 master-0 kubenswrapper[7465]: I0320 08:40:02.551506 7465 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="691bb06b-5a2c-4740-8120-ed99a5555c86" path="/var/lib/kubelet/pods/691bb06b-5a2c-4740-8120-ed99a5555c86/volumes" Mar 20 08:40:18.144042 master-0 kubenswrapper[7465]: I0320 08:40:18.143958 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:40:19.309323 master-0 kubenswrapper[7465]: I0320 08:40:19.309148 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:40:19.316923 master-0 kubenswrapper[7465]: I0320 08:40:19.316874 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:40:19.604975 master-0 kubenswrapper[7465]: I0320 08:40:19.604805 7465 scope.go:117] "RemoveContainer" containerID="9adcc83ca09a3e8a61346c1bb76c593566cc39bfca1852854fa89f14749366d6" Mar 20 08:40:20.096709 master-0 kubenswrapper[7465]: I0320 08:40:20.096633 7465 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 20 08:40:20.097312 master-0 kubenswrapper[7465]: I0320 08:40:20.097010 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler" containerID="cri-o://6e31068727643e077f1c9461b5883b919e163a79d9088735e4c5d39688c47867" gracePeriod=30 Mar 20 08:40:20.097312 master-0 kubenswrapper[7465]: I0320 08:40:20.097205 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-cert-syncer" containerID="cri-o://230d37232882904f1764e96ed6057bf568baed29ff892b892e215ce87e945710" gracePeriod=30 Mar 20 
08:40:20.097312 master-0 kubenswrapper[7465]: I0320 08:40:20.097125 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-recovery-controller" containerID="cri-o://73a7f9993a52ad274232661b06d25f3b18e0675faed4b301aeb4072dcc7cfa79" gracePeriod=30 Mar 20 08:40:20.099476 master-0 kubenswrapper[7465]: I0320 08:40:20.099412 7465 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 20 08:40:20.099952 master-0 kubenswrapper[7465]: E0320 08:40:20.099913 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler" Mar 20 08:40:20.099952 master-0 kubenswrapper[7465]: I0320 08:40:20.099946 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler" Mar 20 08:40:20.100129 master-0 kubenswrapper[7465]: E0320 08:40:20.099980 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691bb06b-5a2c-4740-8120-ed99a5555c86" containerName="installer" Mar 20 08:40:20.100129 master-0 kubenswrapper[7465]: I0320 08:40:20.099994 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="691bb06b-5a2c-4740-8120-ed99a5555c86" containerName="installer" Mar 20 08:40:20.100129 master-0 kubenswrapper[7465]: E0320 08:40:20.100022 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-cert-syncer" Mar 20 08:40:20.100129 master-0 kubenswrapper[7465]: I0320 08:40:20.100034 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-cert-syncer" Mar 20 08:40:20.100129 master-0 kubenswrapper[7465]: E0320 08:40:20.100118 7465 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-recovery-controller" Mar 20 08:40:20.100129 master-0 kubenswrapper[7465]: I0320 08:40:20.100131 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-recovery-controller" Mar 20 08:40:20.100503 master-0 kubenswrapper[7465]: E0320 08:40:20.100155 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="wait-for-host-port" Mar 20 08:40:20.100503 master-0 kubenswrapper[7465]: I0320 08:40:20.100169 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="wait-for-host-port" Mar 20 08:40:20.100503 master-0 kubenswrapper[7465]: I0320 08:40:20.100401 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler" Mar 20 08:40:20.100503 master-0 kubenswrapper[7465]: I0320 08:40:20.100423 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-cert-syncer" Mar 20 08:40:20.100503 master-0 kubenswrapper[7465]: I0320 08:40:20.100443 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-recovery-controller" Mar 20 08:40:20.100503 master-0 kubenswrapper[7465]: I0320 08:40:20.100461 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="691bb06b-5a2c-4740-8120-ed99a5555c86" containerName="installer" Mar 20 08:40:20.259206 master-0 kubenswrapper[7465]: I0320 08:40:20.259104 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:20.259206 master-0 kubenswrapper[7465]: I0320 08:40:20.259200 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:20.282251 master-0 kubenswrapper[7465]: I0320 08:40:20.282179 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_e3c8b9da1cd5cef8ca0690a6bbf5a601/kube-scheduler-cert-syncer/0.log"
Mar 20 08:40:20.283733 master-0 kubenswrapper[7465]: I0320 08:40:20.283686 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:20.287253 master-0 kubenswrapper[7465]: I0320 08:40:20.287171 7465 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" podUID="8dd3d3608fe9c86b0f65904ec2353df4"
Mar 20 08:40:20.361795 master-0 kubenswrapper[7465]: I0320 08:40:20.361617 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:20.361795 master-0 kubenswrapper[7465]: I0320 08:40:20.361740 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:20.361795 master-0 kubenswrapper[7465]: I0320 08:40:20.361768 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:20.362571 master-0 kubenswrapper[7465]: I0320 08:40:20.361818 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:20.463441 master-0 kubenswrapper[7465]: I0320 08:40:20.463366 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/e3c8b9da1cd5cef8ca0690a6bbf5a601-resource-dir\") pod \"e3c8b9da1cd5cef8ca0690a6bbf5a601\" (UID: \"e3c8b9da1cd5cef8ca0690a6bbf5a601\") "
Mar 20 08:40:20.463732 master-0 kubenswrapper[7465]: I0320 08:40:20.463498 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3c8b9da1cd5cef8ca0690a6bbf5a601-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "e3c8b9da1cd5cef8ca0690a6bbf5a601" (UID: "e3c8b9da1cd5cef8ca0690a6bbf5a601"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:40:20.463732 master-0 kubenswrapper[7465]: I0320 08:40:20.463561 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/e3c8b9da1cd5cef8ca0690a6bbf5a601-cert-dir\") pod \"e3c8b9da1cd5cef8ca0690a6bbf5a601\" (UID: \"e3c8b9da1cd5cef8ca0690a6bbf5a601\") "
Mar 20 08:40:20.463808 master-0 kubenswrapper[7465]: I0320 08:40:20.463735 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3c8b9da1cd5cef8ca0690a6bbf5a601-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "e3c8b9da1cd5cef8ca0690a6bbf5a601" (UID: "e3c8b9da1cd5cef8ca0690a6bbf5a601"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:40:20.463917 master-0 kubenswrapper[7465]: I0320 08:40:20.463891 7465 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/e3c8b9da1cd5cef8ca0690a6bbf5a601-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:40:20.463917 master-0 kubenswrapper[7465]: I0320 08:40:20.463912 7465 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/e3c8b9da1cd5cef8ca0690a6bbf5a601-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:40:20.467152 master-0 kubenswrapper[7465]: I0320 08:40:20.467106 7465 generic.go:334] "Generic (PLEG): container finished" podID="7e219558-98b7-4528-88cf-97b87cd1eb6c" containerID="7ea4527aa6e7513c4d82b891ac586f8993c11e7ba38c1d1a048fc3535809e191" exitCode=0
Mar 20 08:40:20.467263 master-0 kubenswrapper[7465]: I0320 08:40:20.467202 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"7e219558-98b7-4528-88cf-97b87cd1eb6c","Type":"ContainerDied","Data":"7ea4527aa6e7513c4d82b891ac586f8993c11e7ba38c1d1a048fc3535809e191"}
Mar 20 08:40:20.470485 master-0 kubenswrapper[7465]: I0320 08:40:20.470436 7465 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_e3c8b9da1cd5cef8ca0690a6bbf5a601/kube-scheduler-cert-syncer/0.log"
Mar 20 08:40:20.471716 master-0 kubenswrapper[7465]: I0320 08:40:20.471674 7465 generic.go:334] "Generic (PLEG): container finished" podID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerID="73a7f9993a52ad274232661b06d25f3b18e0675faed4b301aeb4072dcc7cfa79" exitCode=0
Mar 20 08:40:20.471716 master-0 kubenswrapper[7465]: I0320 08:40:20.471704 7465 generic.go:334] "Generic (PLEG): container finished" podID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerID="230d37232882904f1764e96ed6057bf568baed29ff892b892e215ce87e945710" exitCode=2
Mar 20 08:40:20.471716 master-0 kubenswrapper[7465]: I0320 08:40:20.471714 7465 generic.go:334] "Generic (PLEG): container finished" podID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerID="6e31068727643e077f1c9461b5883b919e163a79d9088735e4c5d39688c47867" exitCode=0
Mar 20 08:40:20.471873 master-0 kubenswrapper[7465]: I0320 08:40:20.471745 7465 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="854c2e8507c2801d7ffa3834f81534fd70cc4fe64f92e8f44b54916d68349b2e"
Mar 20 08:40:20.471873 master-0 kubenswrapper[7465]: I0320 08:40:20.471772 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:20.493166 master-0 kubenswrapper[7465]: I0320 08:40:20.492995 7465 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" podUID="8dd3d3608fe9c86b0f65904ec2353df4"
Mar 20 08:40:20.551063 master-0 kubenswrapper[7465]: I0320 08:40:20.550969 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" path="/var/lib/kubelet/pods/e3c8b9da1cd5cef8ca0690a6bbf5a601/volumes"
Mar 20 08:40:21.807266 master-0 kubenswrapper[7465]: I0320 08:40:21.807202 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 20 08:40:21.990108 master-0 kubenswrapper[7465]: I0320 08:40:21.990054 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e219558-98b7-4528-88cf-97b87cd1eb6c-var-lock\") pod \"7e219558-98b7-4528-88cf-97b87cd1eb6c\" (UID: \"7e219558-98b7-4528-88cf-97b87cd1eb6c\") "
Mar 20 08:40:21.990516 master-0 kubenswrapper[7465]: I0320 08:40:21.990495 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e219558-98b7-4528-88cf-97b87cd1eb6c-kubelet-dir\") pod \"7e219558-98b7-4528-88cf-97b87cd1eb6c\" (UID: \"7e219558-98b7-4528-88cf-97b87cd1eb6c\") "
Mar 20 08:40:21.990692 master-0 kubenswrapper[7465]: I0320 08:40:21.990282 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e219558-98b7-4528-88cf-97b87cd1eb6c-var-lock" (OuterVolumeSpecName: "var-lock") pod "7e219558-98b7-4528-88cf-97b87cd1eb6c" (UID: "7e219558-98b7-4528-88cf-97b87cd1eb6c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:40:21.990741 master-0 kubenswrapper[7465]: I0320 08:40:21.990569 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e219558-98b7-4528-88cf-97b87cd1eb6c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7e219558-98b7-4528-88cf-97b87cd1eb6c" (UID: "7e219558-98b7-4528-88cf-97b87cd1eb6c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:40:21.990863 master-0 kubenswrapper[7465]: I0320 08:40:21.990844 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e219558-98b7-4528-88cf-97b87cd1eb6c-kube-api-access\") pod \"7e219558-98b7-4528-88cf-97b87cd1eb6c\" (UID: \"7e219558-98b7-4528-88cf-97b87cd1eb6c\") "
Mar 20 08:40:21.991105 master-0 kubenswrapper[7465]: I0320 08:40:21.991088 7465 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7e219558-98b7-4528-88cf-97b87cd1eb6c-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 20 08:40:21.991178 master-0 kubenswrapper[7465]: I0320 08:40:21.991165 7465 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e219558-98b7-4528-88cf-97b87cd1eb6c-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:40:21.994292 master-0 kubenswrapper[7465]: I0320 08:40:21.994124 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e219558-98b7-4528-88cf-97b87cd1eb6c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7e219558-98b7-4528-88cf-97b87cd1eb6c" (UID: "7e219558-98b7-4528-88cf-97b87cd1eb6c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:40:22.092489 master-0 kubenswrapper[7465]: I0320 08:40:22.092301 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7e219558-98b7-4528-88cf-97b87cd1eb6c-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 20 08:40:22.486663 master-0 kubenswrapper[7465]: I0320 08:40:22.486588 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"7e219558-98b7-4528-88cf-97b87cd1eb6c","Type":"ContainerDied","Data":"2d6d575a8acffb3103f6760ea1452c58c9caadc565e202f8ee52c999c161d223"}
Mar 20 08:40:22.486663 master-0 kubenswrapper[7465]: I0320 08:40:22.486656 7465 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d6d575a8acffb3103f6760ea1452c58c9caadc565e202f8ee52c999c161d223"
Mar 20 08:40:22.486952 master-0 kubenswrapper[7465]: I0320 08:40:22.486720 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 20 08:40:25.513081 master-0 kubenswrapper[7465]: I0320 08:40:25.513005 7465 generic.go:334] "Generic (PLEG): container finished" podID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerID="239c73fe7acb21fae05105f39e9db825bd1c976521db41e975ce454317c12c28" exitCode=0
Mar 20 08:40:25.513081 master-0 kubenswrapper[7465]: I0320 08:40:25.513083 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" event={"ID":"91b2899e-8d24-41a0-bec8-d11c67b8f955","Type":"ContainerDied","Data":"239c73fe7acb21fae05105f39e9db825bd1c976521db41e975ce454317c12c28"}
Mar 20 08:40:25.514144 master-0 kubenswrapper[7465]: I0320 08:40:25.513149 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" event={"ID":"91b2899e-8d24-41a0-bec8-d11c67b8f955","Type":"ContainerStarted","Data":"62850383ce84470064c579fd7119b27a28f493cfc80dbd0ce02b112368cba3fd"}
Mar 20 08:40:25.975863 master-0 kubenswrapper[7465]: I0320 08:40:25.975777 7465 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz"
Mar 20 08:40:25.979374 master-0 kubenswrapper[7465]: I0320 08:40:25.979323 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:25.979374 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:25.979374 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:25.979374 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:25.979598 master-0 kubenswrapper[7465]: I0320 08:40:25.979408 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:26.978975 master-0 kubenswrapper[7465]: I0320 08:40:26.978906 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:26.978975 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:26.978975 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:26.978975 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:26.979653 master-0 kubenswrapper[7465]: I0320 08:40:26.979014 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:27.975735 master-0 kubenswrapper[7465]: I0320 08:40:27.975629 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz"
Mar 20 08:40:27.978537 master-0 kubenswrapper[7465]: I0320 08:40:27.978478 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:27.978537 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:27.978537 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:27.978537 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:27.978804 master-0 kubenswrapper[7465]: I0320 08:40:27.978536 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:28.978010 master-0 kubenswrapper[7465]: I0320 08:40:28.977951 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:28.978010 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:28.978010 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:28.978010 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:28.978660 master-0 kubenswrapper[7465]: I0320 08:40:28.978039 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:29.979525 master-0 kubenswrapper[7465]: I0320 08:40:29.979420 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:29.979525 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:29.979525 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:29.979525 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:29.980292 master-0 kubenswrapper[7465]: I0320 08:40:29.979554 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:30.978527 master-0 kubenswrapper[7465]: I0320 08:40:30.978431 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:30.978527 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:30.978527 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:30.978527 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:30.978967 master-0 kubenswrapper[7465]: I0320 08:40:30.978556 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:31.978800 master-0 kubenswrapper[7465]: I0320 08:40:31.978672 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:31.978800 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:31.978800 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:31.978800 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:31.978800 master-0 kubenswrapper[7465]: I0320 08:40:31.978796 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:32.978346 master-0 kubenswrapper[7465]: I0320 08:40:32.978204 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:32.978346 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:32.978346 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:32.978346 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:32.978346 master-0 kubenswrapper[7465]: I0320 08:40:32.978333 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:33.978450 master-0 kubenswrapper[7465]: I0320 08:40:33.978393 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:33.978450 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:33.978450 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:33.978450 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:33.978905 master-0 kubenswrapper[7465]: I0320 08:40:33.978873 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:34.541588 master-0 kubenswrapper[7465]: I0320 08:40:34.541514 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:34.582126 master-0 kubenswrapper[7465]: I0320 08:40:34.582052 7465 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1a2d1ae-a8fa-4078-a0c8-a4c18e87ff2e"
Mar 20 08:40:34.582126 master-0 kubenswrapper[7465]: I0320 08:40:34.582110 7465 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1a2d1ae-a8fa-4078-a0c8-a4c18e87ff2e"
Mar 20 08:40:34.600744 master-0 kubenswrapper[7465]: I0320 08:40:34.600664 7465 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:34.605148 master-0 kubenswrapper[7465]: I0320 08:40:34.604236 7465 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 20 08:40:34.608622 master-0 kubenswrapper[7465]: I0320 08:40:34.606760 7465 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 20 08:40:34.622912 master-0 kubenswrapper[7465]: I0320 08:40:34.622846 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:34.628002 master-0 kubenswrapper[7465]: I0320 08:40:34.627957 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 20 08:40:34.673936 master-0 kubenswrapper[7465]: W0320 08:40:34.667455 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dd3d3608fe9c86b0f65904ec2353df4.slice/crio-89e373acd0a6c2bca2cd563de1cef238723b46b416d83b3242f9ebb18d644754 WatchSource:0}: Error finding container 89e373acd0a6c2bca2cd563de1cef238723b46b416d83b3242f9ebb18d644754: Status 404 returned error can't find the container with id 89e373acd0a6c2bca2cd563de1cef238723b46b416d83b3242f9ebb18d644754
Mar 20 08:40:34.978158 master-0 kubenswrapper[7465]: I0320 08:40:34.978093 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:34.978158 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:34.978158 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:34.978158 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:34.978413 master-0 kubenswrapper[7465]: I0320 08:40:34.978178 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:35.610388 master-0 kubenswrapper[7465]: I0320 08:40:35.610269 7465 generic.go:334] "Generic (PLEG): container finished" podID="8dd3d3608fe9c86b0f65904ec2353df4" containerID="2bb061585e269ad50f22944212861cbe8de65df048c827dffb60a910fb8f58b1" exitCode=0
Mar 20 08:40:35.610388 master-0 kubenswrapper[7465]: I0320 08:40:35.610348 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerDied","Data":"2bb061585e269ad50f22944212861cbe8de65df048c827dffb60a910fb8f58b1"}
Mar 20 08:40:35.610388 master-0 kubenswrapper[7465]: I0320 08:40:35.610388 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"89e373acd0a6c2bca2cd563de1cef238723b46b416d83b3242f9ebb18d644754"}
Mar 20 08:40:35.977808 master-0 kubenswrapper[7465]: I0320 08:40:35.977734 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:35.977808 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:35.977808 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:35.977808 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:35.978268 master-0 kubenswrapper[7465]: I0320 08:40:35.977837 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:36.622145 master-0 kubenswrapper[7465]: I0320 08:40:36.622059 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"3c189090fb625d43e4a0aad0248864aa122ec54e7ab96d232c95dbcba79fcc95"}
Mar 20 08:40:36.622145 master-0 kubenswrapper[7465]: I0320 08:40:36.622145 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"aa54b546db103668e772defaa4b64e4e2c01b2bc8d91706ab1484cd99b14f9d9"}
Mar 20 08:40:36.622145 master-0 kubenswrapper[7465]: I0320 08:40:36.622161 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"5aef46f74824b7bd8319047c24bacbb9cbf6ac782ef810c2be78f3961d31d75e"}
Mar 20 08:40:36.653300 master-0 kubenswrapper[7465]: I0320 08:40:36.650434 7465 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.650400784 podStartE2EDuration="2.650400784s" podCreationTimestamp="2026-03-20 08:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:40:36.647906271 +0000 UTC m=+262.291221831" watchObservedRunningTime="2026-03-20 08:40:36.650400784 +0000 UTC m=+262.293716284"
Mar 20 08:40:36.978107 master-0 kubenswrapper[7465]: I0320 08:40:36.977949 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:36.978107 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:36.978107 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:36.978107 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:36.978107 master-0 kubenswrapper[7465]: I0320 08:40:36.978023 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:37.631367 master-0 kubenswrapper[7465]: I0320 08:40:37.631300 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:37.978656 master-0 kubenswrapper[7465]: I0320 08:40:37.978440 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:37.978656 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:37.978656 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:37.978656 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:37.979263 master-0 kubenswrapper[7465]: I0320 08:40:37.978654 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:38.978321 master-0 kubenswrapper[7465]: I0320 08:40:38.978239 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:38.978321 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:38.978321 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:38.978321 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:38.979351 master-0 kubenswrapper[7465]: I0320 08:40:38.978345 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:39.978259 master-0 kubenswrapper[7465]: I0320 08:40:39.978145 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:39.978259 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:39.978259 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:39.978259 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:39.978259 master-0 kubenswrapper[7465]: I0320 08:40:39.978249 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:40.978366 master-0 kubenswrapper[7465]: I0320 08:40:40.978270 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:40.978366 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:40.978366 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:40.978366 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:40.979527 master-0 kubenswrapper[7465]: I0320 08:40:40.978405 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:41.979400 master-0 kubenswrapper[7465]: I0320 08:40:41.979290 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:41.979400 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:41.979400 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:41.979400 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:41.980545 master-0 kubenswrapper[7465]: I0320 08:40:41.979401 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:42.979249 master-0 kubenswrapper[7465]: I0320 08:40:42.979138 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:42.979249 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:42.979249 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:42.979249 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:42.981067 master-0 kubenswrapper[7465]: I0320 08:40:42.979311 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:43.978263 master-0 kubenswrapper[7465]: I0320 08:40:43.978128 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:43.978263 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:43.978263 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:43.978263 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:43.978263 master-0 kubenswrapper[7465]: I0320 08:40:43.978234 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:44.977938 master-0 kubenswrapper[7465]: I0320 08:40:44.977722 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 08:40:44.977938 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld
Mar 20 08:40:44.977938 master-0 kubenswrapper[7465]: [+]process-running ok
Mar 20 08:40:44.977938 master-0 kubenswrapper[7465]: healthz check failed
Mar 20 08:40:44.977938 master-0 kubenswrapper[7465]: I0320 08:40:44.977873 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 08:40:45.584332 master-0 kubenswrapper[7465]: I0320 08:40:45.583511 7465 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 20 08:40:45.584332 master-0 kubenswrapper[7465]: I0320 08:40:45.584013 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" containerID="cri-o://c7ff704cef5e82a8995a139ddd4e2496d1fd9c707ed823bbd9e67f8d259c2ea7" gracePeriod=15
Mar 20 08:40:45.584332 master-0 kubenswrapper[7465]: I0320 08:40:45.584230 7465 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://21ccef18afe96346c593d227394cf1225a9a87bf9c404fb2038be61860ddf492" gracePeriod=15
Mar 20 08:40:45.586740 master-0 kubenswrapper[7465]: I0320 08:40:45.586688 7465 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 20 08:40:45.586994 master-0 kubenswrapper[7465]: E0320 08:40:45.586953 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup"
Mar 20 08:40:45.586994 master-0 kubenswrapper[7465]: I0320 08:40:45.586971 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup"
Mar 20 08:40:45.586994 master-0 kubenswrapper[7465]: E0320 08:40:45.586984 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e219558-98b7-4528-88cf-97b87cd1eb6c" containerName="installer"
Mar 20 08:40:45.586994 master-0 kubenswrapper[7465]: I0320 08:40:45.586994 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e219558-98b7-4528-88cf-97b87cd1eb6c" containerName="installer"
Mar 20 08:40:45.587433 master-0 kubenswrapper[7465]: E0320 08:40:45.587014 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 20 08:40:45.587433 master-0 kubenswrapper[7465]: I0320 08:40:45.587023 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 20 08:40:45.587433 master-0 kubenswrapper[7465]: E0320 08:40:45.587037 7465 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver"
Mar 20 08:40:45.587433 master-0 kubenswrapper[7465]: I0320 08:40:45.587044 7465 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver"
Mar 20 08:40:45.587433 master-0 kubenswrapper[7465]: I0320 08:40:45.587216 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup"
Mar 20 08:40:45.587433 master-0 kubenswrapper[7465]: I0320 08:40:45.587237 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver"
Mar 20 08:40:45.587433 master-0 kubenswrapper[7465]: I0320 08:40:45.587252 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e219558-98b7-4528-88cf-97b87cd1eb6c" containerName="installer"
Mar 20 08:40:45.587433 master-0 kubenswrapper[7465]: I0320 08:40:45.587273 7465 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 20 08:40:45.589747 master-0 kubenswrapper[7465]: I0320 08:40:45.589664 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:40:45.590402 master-0 kubenswrapper[7465]: I0320 08:40:45.590324 7465 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 20 08:40:45.591668 master-0 kubenswrapper[7465]: I0320 08:40:45.591620 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.783326 master-0 kubenswrapper[7465]: I0320 08:40:45.783204 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.783660 master-0 kubenswrapper[7465]: I0320 08:40:45.783353 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.783660 master-0 kubenswrapper[7465]: I0320 08:40:45.783449 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.783660 master-0 kubenswrapper[7465]: I0320 08:40:45.783518 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:45.783660 master-0 kubenswrapper[7465]: I0320 08:40:45.783599 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:45.783838 master-0 kubenswrapper[7465]: I0320 08:40:45.783676 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.783838 master-0 kubenswrapper[7465]: I0320 08:40:45.783730 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.783838 master-0 kubenswrapper[7465]: I0320 08:40:45.783802 7465 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:45.885562 master-0 kubenswrapper[7465]: I0320 08:40:45.885325 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.885562 master-0 
kubenswrapper[7465]: I0320 08:40:45.885468 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.885562 master-0 kubenswrapper[7465]: I0320 08:40:45.885554 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.886031 master-0 kubenswrapper[7465]: I0320 08:40:45.885640 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.886031 master-0 kubenswrapper[7465]: I0320 08:40:45.885689 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:45.886031 master-0 kubenswrapper[7465]: I0320 08:40:45.885818 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 
08:40:45.886176 master-0 kubenswrapper[7465]: I0320 08:40:45.886093 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:45.886176 master-0 kubenswrapper[7465]: I0320 08:40:45.886145 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.886318 master-0 kubenswrapper[7465]: I0320 08:40:45.886273 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:45.886318 master-0 kubenswrapper[7465]: I0320 08:40:45.886292 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.886466 master-0 kubenswrapper[7465]: I0320 08:40:45.886334 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.886466 master-0 kubenswrapper[7465]: I0320 08:40:45.886373 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:45.886466 master-0 kubenswrapper[7465]: I0320 08:40:45.886439 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.886608 master-0 kubenswrapper[7465]: I0320 08:40:45.886528 7465 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.886608 master-0 kubenswrapper[7465]: I0320 08:40:45.886542 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:45.886736 master-0 kubenswrapper[7465]: I0320 08:40:45.886697 7465 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: 
\"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: I0320 08:40:45.910674 7465 patch_prober.go:28] interesting pod/bootstrap-kube-apiserver-master-0 container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]log ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]api-openshift-apiserver-available ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]api-openshift-oauth-apiserver-available ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]informer-sync ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/priority-and-fairness-filter ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/start-apiextensions-informers ok Mar 20 08:40:45.910798 master-0 
kubenswrapper[7465]: [+]poststarthook/start-apiextensions-controllers ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/crd-informer-synced ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/start-system-namespaces-controller ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/rbac/bootstrap-roles ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/bootstrap-controller ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/start-kube-aggregator-informers ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/apiservice-registration-controller ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: 
[+]poststarthook/apiservice-discovery-controller ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]autoregister-completion ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/apiservice-openapi-controller ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: [-]shutdown failed: reason withheld Mar 20 08:40:45.910798 master-0 kubenswrapper[7465]: readyz check failed Mar 20 08:40:45.912887 master-0 kubenswrapper[7465]: I0320 08:40:45.910816 7465 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:40:45.978850 master-0 kubenswrapper[7465]: I0320 08:40:45.978746 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:40:45.978850 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:40:45.978850 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:40:45.978850 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:40:45.979962 master-0 kubenswrapper[7465]: I0320 08:40:45.978969 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:40:46.383985 master-0 kubenswrapper[7465]: I0320 08:40:46.383239 7465 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:46.384716 master-0 kubenswrapper[7465]: I0320 08:40:46.384566 7465 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:46.394309 master-0 kubenswrapper[7465]: I0320 08:40:46.393874 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 20 08:40:46.411043 master-0 kubenswrapper[7465]: I0320 08:40:46.408181 7465 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 20 08:40:46.464521 master-0 kubenswrapper[7465]: W0320 08:40:46.464447 7465 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45ea2ef1cf2bc9d1d994d6538ae0a64.slice/crio-fb1900c9365a08f03e6fa39a54bd69429d8cedb421e2b0d1e4f977c7a6faf417 WatchSource:0}: Error finding container fb1900c9365a08f03e6fa39a54bd69429d8cedb421e2b0d1e4f977c7a6faf417: Status 404 returned error can't find the container with id fb1900c9365a08f03e6fa39a54bd69429d8cedb421e2b0d1e4f977c7a6faf417 Mar 20 08:40:46.472113 master-0 kubenswrapper[7465]: E0320 08:40:46.471913 7465 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.189e800156d855b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:b45ea2ef1cf2bc9d1d994d6538ae0a64,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:40:46.47089503 +0000 UTC m=+272.114210520,LastTimestamp:2026-03-20 08:40:46.47089503 +0000 UTC m=+272.114210520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:40:46.710725 master-0 kubenswrapper[7465]: I0320 08:40:46.710646 7465 generic.go:334] "Generic (PLEG): container finished" podID="d245e5b2-a30d-45c8-9b79-6e8096765c14" containerID="215282b36f0afcf690dcc7252b249b0c54821b459a2fe5a0ff25640fd36b6290" exitCode=0 Mar 20 08:40:46.710973 master-0 kubenswrapper[7465]: I0320 08:40:46.710724 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"d245e5b2-a30d-45c8-9b79-6e8096765c14","Type":"ContainerDied","Data":"215282b36f0afcf690dcc7252b249b0c54821b459a2fe5a0ff25640fd36b6290"} Mar 20 08:40:46.714120 master-0 kubenswrapper[7465]: I0320 08:40:46.714046 7465 status_manager.go:851] "Failed to get status for pod" podUID="d245e5b2-a30d-45c8-9b79-6e8096765c14" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:40:46.715484 master-0 kubenswrapper[7465]: I0320 08:40:46.714894 7465 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:40:46.716980 master-0 kubenswrapper[7465]: I0320 08:40:46.716688 7465 
status_manager.go:851] "Failed to get status for pod" podUID="8e7a82869988463543d3d8dd1f0b5fe3" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:40:46.716980 master-0 kubenswrapper[7465]: I0320 08:40:46.716819 7465 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="21ccef18afe96346c593d227394cf1225a9a87bf9c404fb2038be61860ddf492" exitCode=0 Mar 20 08:40:46.720557 master-0 kubenswrapper[7465]: I0320 08:40:46.720407 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"fb1900c9365a08f03e6fa39a54bd69429d8cedb421e2b0d1e4f977c7a6faf417"} Mar 20 08:40:46.722929 master-0 kubenswrapper[7465]: I0320 08:40:46.721973 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"9da40744da0c1f755b7ca8d13405871816427a42b29bf11d678dd70f488e5c6a"} Mar 20 08:40:46.977771 master-0 kubenswrapper[7465]: I0320 08:40:46.977621 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:40:46.977771 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:40:46.977771 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:40:46.977771 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:40:46.977771 master-0 kubenswrapper[7465]: I0320 08:40:46.977707 7465 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:40:47.734934 master-0 kubenswrapper[7465]: I0320 08:40:47.734873 7465 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="d11c59462bae131a903a2325334d63a1ab5222611dbbb14b008e128c7a2a34a9" exitCode=0 Mar 20 08:40:47.735510 master-0 kubenswrapper[7465]: I0320 08:40:47.734954 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerDied","Data":"d11c59462bae131a903a2325334d63a1ab5222611dbbb14b008e128c7a2a34a9"} Mar 20 08:40:47.736605 master-0 kubenswrapper[7465]: I0320 08:40:47.736258 7465 status_manager.go:851] "Failed to get status for pod" podUID="8e7a82869988463543d3d8dd1f0b5fe3" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:40:47.737152 master-0 kubenswrapper[7465]: I0320 08:40:47.737053 7465 status_manager.go:851] "Failed to get status for pod" podUID="d245e5b2-a30d-45c8-9b79-6e8096765c14" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:40:47.738145 master-0 kubenswrapper[7465]: I0320 08:40:47.738125 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" 
event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"2f574795ee9d934844b92324a83362cb7abdf8cc28431e8355456d552139443f"} Mar 20 08:40:47.739610 master-0 kubenswrapper[7465]: I0320 08:40:47.739570 7465 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:40:47.740644 master-0 kubenswrapper[7465]: I0320 08:40:47.740602 7465 status_manager.go:851] "Failed to get status for pod" podUID="d245e5b2-a30d-45c8-9b79-6e8096765c14" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:40:47.741464 master-0 kubenswrapper[7465]: I0320 08:40:47.741423 7465 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:40:47.742225 master-0 kubenswrapper[7465]: I0320 08:40:47.742163 7465 status_manager.go:851] "Failed to get status for pod" podUID="8e7a82869988463543d3d8dd1f0b5fe3" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:40:47.978683 master-0 kubenswrapper[7465]: I0320 08:40:47.978599 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:40:47.978683 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:40:47.978683 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:40:47.978683 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:40:47.978969 master-0 kubenswrapper[7465]: I0320 08:40:47.978721 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:40:48.127555 master-0 kubenswrapper[7465]: I0320 08:40:48.127426 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:40:48.128954 master-0 kubenswrapper[7465]: I0320 08:40:48.128804 7465 status_manager.go:851] "Failed to get status for pod" podUID="d245e5b2-a30d-45c8-9b79-6e8096765c14" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:40:48.129868 master-0 kubenswrapper[7465]: I0320 08:40:48.129788 7465 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:40:48.130616 master-0 kubenswrapper[7465]: I0320 08:40:48.130575 7465 status_manager.go:851] "Failed to get status for pod" podUID="8e7a82869988463543d3d8dd1f0b5fe3" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:40:48.225513 master-0 kubenswrapper[7465]: I0320 08:40:48.225154 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-kubelet-dir\") pod \"d245e5b2-a30d-45c8-9b79-6e8096765c14\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " Mar 20 08:40:48.225513 master-0 kubenswrapper[7465]: I0320 08:40:48.225342 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d245e5b2-a30d-45c8-9b79-6e8096765c14" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:40:48.225513 master-0 kubenswrapper[7465]: I0320 08:40:48.225456 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-var-lock\") pod \"d245e5b2-a30d-45c8-9b79-6e8096765c14\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " Mar 20 08:40:48.225513 master-0 kubenswrapper[7465]: I0320 08:40:48.225493 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"d245e5b2-a30d-45c8-9b79-6e8096765c14\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " Mar 20 08:40:48.225740 master-0 kubenswrapper[7465]: I0320 08:40:48.225597 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-var-lock" (OuterVolumeSpecName: "var-lock") pod "d245e5b2-a30d-45c8-9b79-6e8096765c14" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:40:48.225924 master-0 kubenswrapper[7465]: I0320 08:40:48.225882 7465 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 20 08:40:48.225984 master-0 kubenswrapper[7465]: I0320 08:40:48.225910 7465 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:40:48.229151 master-0 kubenswrapper[7465]: I0320 08:40:48.229088 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d245e5b2-a30d-45c8-9b79-6e8096765c14" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:40:48.328057 master-0 kubenswrapper[7465]: I0320 08:40:48.327975 7465 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 20 08:40:48.758461 master-0 kubenswrapper[7465]: I0320 08:40:48.755251 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"d245e5b2-a30d-45c8-9b79-6e8096765c14","Type":"ContainerDied","Data":"20814875b6be1da3d4e673bd4cab493f3904bdc8689013ab1822b3b670087e6a"} Mar 20 08:40:48.758461 master-0 kubenswrapper[7465]: I0320 08:40:48.755318 7465 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20814875b6be1da3d4e673bd4cab493f3904bdc8689013ab1822b3b670087e6a" Mar 20 08:40:48.758461 master-0 kubenswrapper[7465]: I0320 08:40:48.755348 7465 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:40:48.759522 master-0 kubenswrapper[7465]: I0320 08:40:48.758547 7465 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="c7ff704cef5e82a8995a139ddd4e2496d1fd9c707ed823bbd9e67f8d259c2ea7" exitCode=0 Mar 20 08:40:48.764249 master-0 kubenswrapper[7465]: I0320 08:40:48.763105 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"f3b6bb748ad127ad783c317a9ec2edeaa6139af683fa7cda66b138823bc535c8"} Mar 20 08:40:48.764249 master-0 kubenswrapper[7465]: I0320 08:40:48.763221 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"d0111435fee5f861beed3e54093288b80fe796da20879b74d333fc90fd130f88"} Mar 20 08:40:48.979040 master-0 kubenswrapper[7465]: I0320 08:40:48.978136 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:40:48.979040 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:40:48.979040 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:40:48.979040 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:40:48.979040 master-0 kubenswrapper[7465]: I0320 08:40:48.978253 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:40:49.161473 master-0 kubenswrapper[7465]: I0320 08:40:49.161327 7465 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:40:49.270952 master-0 kubenswrapper[7465]: I0320 08:40:49.270693 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 20 08:40:49.270952 master-0 kubenswrapper[7465]: I0320 08:40:49.270739 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 20 08:40:49.270952 master-0 kubenswrapper[7465]: I0320 08:40:49.270797 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 20 08:40:49.270952 master-0 kubenswrapper[7465]: I0320 08:40:49.270812 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 20 08:40:49.270952 master-0 kubenswrapper[7465]: I0320 08:40:49.270840 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config" (OuterVolumeSpecName: "config") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:40:49.270952 master-0 kubenswrapper[7465]: I0320 08:40:49.270867 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 20 08:40:49.270952 master-0 kubenswrapper[7465]: I0320 08:40:49.270878 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets" (OuterVolumeSpecName: "secrets") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:40:49.270952 master-0 kubenswrapper[7465]: I0320 08:40:49.270892 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:40:49.270952 master-0 kubenswrapper[7465]: I0320 08:40:49.270888 7465 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 20 08:40:49.271807 master-0 kubenswrapper[7465]: I0320 08:40:49.270906 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs" (OuterVolumeSpecName: "logs") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:40:49.271807 master-0 kubenswrapper[7465]: I0320 08:40:49.270889 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:40:49.271807 master-0 kubenswrapper[7465]: I0320 08:40:49.270918 7465 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:40:49.271807 master-0 kubenswrapper[7465]: I0320 08:40:49.271325 7465 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 20 08:40:49.271807 master-0 kubenswrapper[7465]: I0320 08:40:49.271348 7465 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") on node \"master-0\" DevicePath \"\"" Mar 20 08:40:49.271807 master-0 kubenswrapper[7465]: I0320 08:40:49.271362 7465 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:40:49.271807 master-0 kubenswrapper[7465]: I0320 08:40:49.271374 7465 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") on 
node \"master-0\" DevicePath \"\"" Mar 20 08:40:49.271807 master-0 kubenswrapper[7465]: I0320 08:40:49.271388 7465 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") on node \"master-0\" DevicePath \"\"" Mar 20 08:40:49.271807 master-0 kubenswrapper[7465]: I0320 08:40:49.271401 7465 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 20 08:40:49.822990 master-0 kubenswrapper[7465]: I0320 08:40:49.822930 7465 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 20 08:40:49.823692 master-0 kubenswrapper[7465]: I0320 08:40:49.823658 7465 scope.go:117] "RemoveContainer" containerID="21ccef18afe96346c593d227394cf1225a9a87bf9c404fb2038be61860ddf492" Mar 20 08:40:49.862206 master-0 kubenswrapper[7465]: I0320 08:40:49.858594 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"6ab22b3f2d37bf5775a5ff7b8bee43ed84a6b6e971e88c5aa5c7bea50f737d96"} Mar 20 08:40:49.862206 master-0 kubenswrapper[7465]: I0320 08:40:49.858661 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"0f63c13521e5ca3a0dcdf96b62fa9e56532f32f43af31de53a30911e593c568a"} Mar 20 08:40:49.862206 master-0 kubenswrapper[7465]: I0320 08:40:49.858674 7465 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"3030fbaae53d50269bd5e2ef3f7474bc1b880c795eb0e6c078b59b5741beb732"} Mar 
20 08:40:49.862206 master-0 kubenswrapper[7465]: I0320 08:40:49.859406 7465 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:49.937213 master-0 kubenswrapper[7465]: I0320 08:40:49.927664 7465 scope.go:117] "RemoveContainer" containerID="c7ff704cef5e82a8995a139ddd4e2496d1fd9c707ed823bbd9e67f8d259c2ea7" Mar 20 08:40:50.127408 master-0 kubenswrapper[7465]: I0320 08:40:50.125282 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:40:50.127408 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:40:50.127408 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:40:50.127408 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:40:50.127408 master-0 kubenswrapper[7465]: I0320 08:40:50.125359 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:40:50.160262 master-0 kubenswrapper[7465]: I0320 08:40:50.158544 7465 scope.go:117] "RemoveContainer" containerID="81fdbea135dce13afe4433f7d61b259980b46bfdce14d456ee42556d90e1cda4" Mar 20 08:40:50.551097 master-0 kubenswrapper[7465]: I0320 08:40:50.551035 7465 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49fac1b46a11e49501805e891baae4a9" path="/var/lib/kubelet/pods/49fac1b46a11e49501805e891baae4a9/volumes" Mar 20 08:40:50.551528 master-0 kubenswrapper[7465]: I0320 08:40:50.551501 7465 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 20 08:40:50.978929 master-0 kubenswrapper[7465]: I0320 08:40:50.978782 
7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:40:50.978929 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:40:50.978929 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:40:50.978929 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:40:50.978929 master-0 kubenswrapper[7465]: I0320 08:40:50.978874 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:40:51.979094 master-0 kubenswrapper[7465]: I0320 08:40:51.979005 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:40:51.979094 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:40:51.979094 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:40:51.979094 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:40:51.979869 master-0 kubenswrapper[7465]: I0320 08:40:51.979104 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:40:52.978591 master-0 kubenswrapper[7465]: I0320 08:40:52.978498 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 20 08:40:52.978591 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:40:52.978591 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:40:52.978591 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:40:52.979018 master-0 kubenswrapper[7465]: I0320 08:40:52.978591 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:40:53.977717 master-0 kubenswrapper[7465]: I0320 08:40:53.977627 7465 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-xmvwz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:40:53.977717 master-0 kubenswrapper[7465]: [-]has-synced failed: reason withheld Mar 20 08:40:53.977717 master-0 kubenswrapper[7465]: [+]process-running ok Mar 20 08:40:53.977717 master-0 kubenswrapper[7465]: healthz check failed Mar 20 08:40:53.978411 master-0 kubenswrapper[7465]: I0320 08:40:53.977722 7465 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" podUID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:40:54.701978 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 20 08:40:54.737604 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 20 08:40:54.737930 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 20 08:40:54.740795 master-0 systemd[1]: kubelet.service: Consumed 49.408s CPU time. Mar 20 08:40:54.757795 master-0 systemd[1]: Starting Kubernetes Kubelet... 
Mar 20 08:40:54.925200 master-0 kubenswrapper[18707]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 08:40:54.925200 master-0 kubenswrapper[18707]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 08:40:54.925200 master-0 kubenswrapper[18707]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 08:40:54.925857 master-0 kubenswrapper[18707]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 08:40:54.925857 master-0 kubenswrapper[18707]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 08:40:54.925857 master-0 kubenswrapper[18707]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 08:40:54.925857 master-0 kubenswrapper[18707]: I0320 08:40:54.925546 18707 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 08:40:54.929084 master-0 kubenswrapper[18707]: W0320 08:40:54.929056 18707 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 20 08:40:54.929084 master-0 kubenswrapper[18707]: W0320 08:40:54.929082 18707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 08:40:54.929084 master-0 kubenswrapper[18707]: W0320 08:40:54.929090 18707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929097 18707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929104 18707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929109 18707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929114 18707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929119 18707 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929124 18707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929129 18707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929135 18707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929140 18707 
feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929145 18707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929150 18707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929155 18707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929163 18707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929170 18707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929178 18707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929205 18707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929214 18707 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929224 18707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 08:40:54.929260 master-0 kubenswrapper[18707]: W0320 08:40:54.929230 18707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929235 18707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929241 18707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929248 18707 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929255 18707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929261 18707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929267 18707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929274 18707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929281 18707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929288 18707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929296 18707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929304 18707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929312 18707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929319 18707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929326 18707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929333 18707 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929341 18707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929347 18707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929355 18707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 08:40:54.929785 master-0 kubenswrapper[18707]: W0320 08:40:54.929362 18707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929370 18707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929376 18707 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929381 18707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929386 18707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929393 18707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929402 18707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929409 18707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929415 18707 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929421 18707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929426 18707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929431 18707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929437 18707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929443 18707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929448 18707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929454 18707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929459 18707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929464 18707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929469 18707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 
08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929474 18707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 08:40:54.930350 master-0 kubenswrapper[18707]: W0320 08:40:54.929479 18707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: W0320 08:40:54.929484 18707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: W0320 08:40:54.929490 18707 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: W0320 08:40:54.929495 18707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: W0320 08:40:54.929500 18707 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: W0320 08:40:54.929505 18707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: W0320 08:40:54.929510 18707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: W0320 08:40:54.929517 18707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: W0320 08:40:54.929523 18707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: W0320 08:40:54.929528 18707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: W0320 08:40:54.929533 18707 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: W0320 08:40:54.929538 18707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: I0320 
08:40:54.929696 18707 flags.go:64] FLAG: --address="0.0.0.0" Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: I0320 08:40:54.929711 18707 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: I0320 08:40:54.929725 18707 flags.go:64] FLAG: --anonymous-auth="true" Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: I0320 08:40:54.929733 18707 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: I0320 08:40:54.929759 18707 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: I0320 08:40:54.929766 18707 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: I0320 08:40:54.929774 18707 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: I0320 08:40:54.929782 18707 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: I0320 08:40:54.929789 18707 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 20 08:40:54.932646 master-0 kubenswrapper[18707]: I0320 08:40:54.929796 18707 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929803 18707 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929811 18707 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929818 18707 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929849 18707 flags.go:64] FLAG: --cgroup-root="" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929856 18707 flags.go:64] FLAG: --cgroups-per-qos="true" 
Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929862 18707 flags.go:64] FLAG: --client-ca-file="" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929868 18707 flags.go:64] FLAG: --cloud-config="" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929874 18707 flags.go:64] FLAG: --cloud-provider="" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929880 18707 flags.go:64] FLAG: --cluster-dns="[]" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929887 18707 flags.go:64] FLAG: --cluster-domain="" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929894 18707 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929899 18707 flags.go:64] FLAG: --config-dir="" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929905 18707 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929912 18707 flags.go:64] FLAG: --container-log-max-files="5" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929919 18707 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929925 18707 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929931 18707 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929938 18707 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929943 18707 flags.go:64] FLAG: --contention-profiling="false" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929956 18707 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 20 08:40:54.934050 
master-0 kubenswrapper[18707]: I0320 08:40:54.929962 18707 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929969 18707 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929976 18707 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929984 18707 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 20 08:40:54.934050 master-0 kubenswrapper[18707]: I0320 08:40:54.929990 18707 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.929996 18707 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930002 18707 flags.go:64] FLAG: --enable-load-reader="false" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930009 18707 flags.go:64] FLAG: --enable-server="true" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930015 18707 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930023 18707 flags.go:64] FLAG: --event-burst="100" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930029 18707 flags.go:64] FLAG: --event-qps="50" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930037 18707 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930043 18707 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930049 18707 flags.go:64] FLAG: --eviction-hard="" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930058 18707 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 20 08:40:54.934798 master-0 
kubenswrapper[18707]: I0320 08:40:54.930064 18707 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930071 18707 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930077 18707 flags.go:64] FLAG: --eviction-soft="" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930083 18707 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930089 18707 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930095 18707 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930101 18707 flags.go:64] FLAG: --experimental-mounter-path="" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930107 18707 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930113 18707 flags.go:64] FLAG: --fail-swap-on="true" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930138 18707 flags.go:64] FLAG: --feature-gates="" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930146 18707 flags.go:64] FLAG: --file-check-frequency="20s" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930153 18707 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930159 18707 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930166 18707 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 20 08:40:54.934798 master-0 kubenswrapper[18707]: I0320 08:40:54.930172 18707 flags.go:64] FLAG: --healthz-port="10248" Mar 20 08:40:54.934798 master-0 
kubenswrapper[18707]: I0320 08:40:54.930178 18707 flags.go:64] FLAG: --help="false" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930207 18707 flags.go:64] FLAG: --hostname-override="" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930216 18707 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930224 18707 flags.go:64] FLAG: --http-check-frequency="20s" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930230 18707 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930236 18707 flags.go:64] FLAG: --image-credential-provider-config="" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930242 18707 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930250 18707 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930256 18707 flags.go:64] FLAG: --image-service-endpoint="" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930262 18707 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930269 18707 flags.go:64] FLAG: --kube-api-burst="100" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930289 18707 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930297 18707 flags.go:64] FLAG: --kube-api-qps="50" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930303 18707 flags.go:64] FLAG: --kube-reserved="" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930310 18707 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 
08:40:54.930317 18707 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930323 18707 flags.go:64] FLAG: --kubelet-cgroups="" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930330 18707 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930336 18707 flags.go:64] FLAG: --lock-file="" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930342 18707 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930349 18707 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930355 18707 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930365 18707 flags.go:64] FLAG: --log-json-split-stream="false" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930373 18707 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930379 18707 flags.go:64] FLAG: --log-text-split-stream="false" Mar 20 08:40:54.935740 master-0 kubenswrapper[18707]: I0320 08:40:54.930418 18707 flags.go:64] FLAG: --logging-format="text" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930425 18707 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930432 18707 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930439 18707 flags.go:64] FLAG: --manifest-url="" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930445 18707 flags.go:64] FLAG: --manifest-url-header="" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930454 
18707 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930460 18707 flags.go:64] FLAG: --max-open-files="1000000" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930468 18707 flags.go:64] FLAG: --max-pods="110" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930478 18707 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930485 18707 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930491 18707 flags.go:64] FLAG: --memory-manager-policy="None" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930498 18707 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930504 18707 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930510 18707 flags.go:64] FLAG: --node-ip="192.168.32.10" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930517 18707 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930532 18707 flags.go:64] FLAG: --node-status-max-images="50" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930539 18707 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930545 18707 flags.go:64] FLAG: --oom-score-adj="-999" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930552 18707 flags.go:64] FLAG: --pod-cidr="" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930559 18707 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930569 18707 flags.go:64] FLAG: --pod-manifest-path="" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930574 18707 flags.go:64] FLAG: --pod-max-pids="-1" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930581 18707 flags.go:64] FLAG: --pods-per-core="0" Mar 20 08:40:54.936483 master-0 kubenswrapper[18707]: I0320 08:40:54.930587 18707 flags.go:64] FLAG: --port="10250" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930593 18707 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930599 18707 flags.go:64] FLAG: --provider-id="" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930605 18707 flags.go:64] FLAG: --qos-reserved="" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930611 18707 flags.go:64] FLAG: --read-only-port="10255" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930617 18707 flags.go:64] FLAG: --register-node="true" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930623 18707 flags.go:64] FLAG: --register-schedulable="true" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930629 18707 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930639 18707 flags.go:64] FLAG: --registry-burst="10" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930645 18707 flags.go:64] FLAG: --registry-qps="5" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930651 18707 flags.go:64] FLAG: --reserved-cpus="" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930658 18707 
flags.go:64] FLAG: --reserved-memory="" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930665 18707 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930671 18707 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930678 18707 flags.go:64] FLAG: --rotate-certificates="false" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930684 18707 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930689 18707 flags.go:64] FLAG: --runonce="false" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930698 18707 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930704 18707 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930710 18707 flags.go:64] FLAG: --seccomp-default="false" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930716 18707 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930723 18707 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930729 18707 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930734 18707 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930740 18707 flags.go:64] FLAG: --storage-driver-password="root" Mar 20 08:40:54.938233 master-0 kubenswrapper[18707]: I0320 08:40:54.930746 18707 flags.go:64] FLAG: --storage-driver-secure="false" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930753 18707 
flags.go:64] FLAG: --storage-driver-table="stats" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930759 18707 flags.go:64] FLAG: --storage-driver-user="root" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930765 18707 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930771 18707 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930778 18707 flags.go:64] FLAG: --system-cgroups="" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930784 18707 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930794 18707 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930800 18707 flags.go:64] FLAG: --tls-cert-file="" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930806 18707 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930814 18707 flags.go:64] FLAG: --tls-min-version="" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930820 18707 flags.go:64] FLAG: --tls-private-key-file="" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930826 18707 flags.go:64] FLAG: --topology-manager-policy="none" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930832 18707 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930838 18707 flags.go:64] FLAG: --topology-manager-scope="container" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930844 18707 flags.go:64] FLAG: --v="2" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930852 18707 flags.go:64] FLAG: --version="false" 
Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930869 18707 flags.go:64] FLAG: --vmodule="" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930877 18707 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: I0320 08:40:54.930883 18707 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: W0320 08:40:54.931034 18707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: W0320 08:40:54.931042 18707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: W0320 08:40:54.931048 18707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 08:40:54.939394 master-0 kubenswrapper[18707]: W0320 08:40:54.931055 18707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931066 18707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931071 18707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931076 18707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931082 18707 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931087 18707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931093 18707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931098 18707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931103 18707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931109 18707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931114 18707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931120 18707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931125 18707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931131 18707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 08:40:54.940218 
master-0 kubenswrapper[18707]: W0320 08:40:54.931136 18707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931141 18707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931148 18707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931155 18707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931161 18707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 08:40:54.940218 master-0 kubenswrapper[18707]: W0320 08:40:54.931170 18707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931175 18707 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931181 18707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931206 18707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931213 18707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931219 18707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931226 18707 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931235 18707 feature_gate.go:330] unrecognized feature gate: 
InsightsRuntimeExtractor Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931241 18707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931246 18707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931252 18707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931257 18707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931263 18707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931268 18707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931276 18707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931281 18707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931286 18707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931291 18707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931297 18707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931303 18707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931308 18707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack 
Mar 20 08:40:54.940730 master-0 kubenswrapper[18707]: W0320 08:40:54.931313 18707 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931319 18707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931325 18707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931332 18707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931338 18707 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931344 18707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931350 18707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931356 18707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931361 18707 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931367 18707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931372 18707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931378 18707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931385 18707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931391 18707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931397 18707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931405 18707 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931410 18707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931416 18707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931424 18707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 08:40:54.941365 master-0 kubenswrapper[18707]: W0320 08:40:54.931429 18707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: W0320 08:40:54.931434 18707 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: W0320 08:40:54.931440 18707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: W0320 08:40:54.931445 18707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: W0320 08:40:54.931451 18707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: W0320 08:40:54.931456 18707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: W0320 08:40:54.931464 18707 
feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: W0320 08:40:54.931469 18707 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: W0320 08:40:54.931475 18707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: W0320 08:40:54.931480 18707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: I0320 08:40:54.931498 18707 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: I0320 08:40:54.936583 18707 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: I0320 08:40:54.936614 18707 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: W0320 08:40:54.936712 18707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 08:40:54.942011 master-0 kubenswrapper[18707]: W0320 08:40:54.936724 18707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936734 18707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936743 18707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936752 18707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936758 18707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936765 18707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936771 18707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936778 18707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936785 18707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936791 18707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936797 18707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936804 18707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936810 18707 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936816 18707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936823 18707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936829 18707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936835 18707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936842 18707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936850 18707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 08:40:54.942400 master-0 kubenswrapper[18707]: W0320 08:40:54.936858 18707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936865 18707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936873 18707 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936880 18707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936887 18707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936896 18707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936902 18707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936909 18707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936916 18707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936923 18707 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936930 18707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936937 18707 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936945 18707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936952 18707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936961 18707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936968 18707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936974 18707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936981 18707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936988 18707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.936995 18707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 08:40:54.942882 master-0 kubenswrapper[18707]: W0320 08:40:54.937004 18707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937012 18707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937018 18707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937024 18707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937031 18707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937038 18707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937044 18707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937052 18707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937058 18707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937065 18707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937071 18707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937078 18707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937085 18707 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937092 18707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937098 18707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937104 18707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937111 18707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937118 18707 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937125 18707 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937132 18707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 08:40:54.943584 master-0 kubenswrapper[18707]: W0320 08:40:54.937140 18707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937150 18707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937159 18707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937167 18707 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937175 18707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937205 18707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937213 18707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937219 18707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937226 18707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937233 18707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937242 18707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937249 18707 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: I0320 08:40:54.937261 18707 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937476 18707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937496 18707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 08:40:54.944090 master-0 kubenswrapper[18707]: W0320 08:40:54.937504 18707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937512 18707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937519 18707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937527 18707 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937534 18707 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937541 18707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937547 18707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937554 18707 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937560 18707 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937566 18707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937572 18707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937579 18707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937585 18707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937591 18707 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937599 18707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937606 18707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937613 18707 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937620 18707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937626 18707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937633 18707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 08:40:54.944573 master-0 kubenswrapper[18707]: W0320 08:40:54.937640 18707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937647 18707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937654 18707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937663 18707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937675 18707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937684 18707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937692 18707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937700 18707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937708 18707 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937716 18707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937723 18707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937733 18707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937740 18707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937747 18707 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937754 18707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937767 18707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937776 18707 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937783 18707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937790 18707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937797 18707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 08:40:54.945372 master-0 kubenswrapper[18707]: W0320 08:40:54.937803 18707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937810 18707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937816 18707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937851 18707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937856 18707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937866 18707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937872 18707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937880 18707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937888 18707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937896 18707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937902 18707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937908 18707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937913 18707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937918 18707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937923 18707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937928 18707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937933 18707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937938 18707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937943 18707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 08:40:54.946065 master-0 kubenswrapper[18707]: W0320 08:40:54.937949 18707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: W0320 08:40:54.937954 18707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: W0320 08:40:54.937959 18707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: W0320 08:40:54.937964 18707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: W0320 08:40:54.937968 18707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: W0320 08:40:54.937974 18707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: W0320 08:40:54.937979 18707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: W0320 08:40:54.937984 18707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: W0320 08:40:54.937994 18707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: W0320 08:40:54.938000 18707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: W0320 08:40:54.938005 18707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: I0320 08:40:54.938014 18707 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: I0320 08:40:54.938957 18707 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: I0320 08:40:54.942656 18707 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 20 08:40:54.946753 master-0 kubenswrapper[18707]: I0320 08:40:54.942774 18707 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 08:40:54.947282 master-0 kubenswrapper[18707]: I0320 08:40:54.943071 18707 server.go:997] "Starting client certificate rotation"
Mar 20 08:40:54.947282 master-0 kubenswrapper[18707]: I0320 08:40:54.943090 18707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 20 08:40:54.947282 master-0 kubenswrapper[18707]: I0320 08:40:54.943268 18707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-21 08:25:35 +0000 UTC, rotation deadline is 2026-03-21 05:58:49.578347501 +0000 UTC
Mar 20 08:40:54.947282 master-0 kubenswrapper[18707]: I0320 08:40:54.943341 18707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 21h17m54.635009902s for next certificate rotation
Mar 20 08:40:54.947282 master-0 kubenswrapper[18707]: I0320 08:40:54.943917 18707 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 08:40:54.948080 master-0 kubenswrapper[18707]: I0320 08:40:54.948028 18707 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 08:40:54.953544 master-0 kubenswrapper[18707]: I0320 08:40:54.953505 18707 log.go:25] "Validated CRI v1 runtime API"
Mar 20 08:40:54.960532 master-0 kubenswrapper[18707]: I0320 08:40:54.957997 18707 log.go:25] "Validated CRI v1 image API"
Mar 20 08:40:54.960532 master-0 kubenswrapper[18707]: I0320 08:40:54.959132 18707 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 08:40:54.968829 master-0 kubenswrapper[18707]: I0320 08:40:54.968768 18707 fs.go:135] Filesystem UUIDs: map[4a66d702-cf3e-4c68-968a-18f659b89ac6:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 20 08:40:54.969613 master-0 kubenswrapper[18707]: I0320 08:40:54.968817 18707 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0262d134f60647f6e04ff950df203ce5bc3f1656b20c1e15f442731269c3be76/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0262d134f60647f6e04ff950df203ce5bc3f1656b20c1e15f442731269c3be76/userdata/shm major:0 minor:445 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/02b8b46e9f6cf48ded279c24ec1e51a94bbe25b122e72584be4a8549a6a9d74b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/02b8b46e9f6cf48ded279c24ec1e51a94bbe25b122e72584be4a8549a6a9d74b/userdata/shm major:0 minor:441 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/07c77bfc7ae9afd423082de8b582bc56bb7322d1ee441fcf749b3f543d9311b5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/07c77bfc7ae9afd423082de8b582bc56bb7322d1ee441fcf749b3f543d9311b5/userdata/shm major:0 minor:372 fsType:tmpfs blockSize:0}
/run/containers/storage/overlay-containers/0885038f20b8301926b3fbf1704c402c0cf75d3a9c64af977d4466bc2f1fe1b6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0885038f20b8301926b3fbf1704c402c0cf75d3a9c64af977d4466bc2f1fe1b6/userdata/shm major:0 minor:1049 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0e8cf476b590f62cf998ca26bf5d07d7ec24559896e92dec128d4a67e132990a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0e8cf476b590f62cf998ca26bf5d07d7ec24559896e92dec128d4a67e132990a/userdata/shm major:0 minor:104 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1038dded4ac6146a3ef7e05fc425b32ac120e0351ec2aaee7b8ebe45679034dd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1038dded4ac6146a3ef7e05fc425b32ac120e0351ec2aaee7b8ebe45679034dd/userdata/shm major:0 minor:323 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/16961d83ade56434cc5d6847f42954f60a19529cc2fd922d6eec9e1e749ea461/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/16961d83ade56434cc5d6847f42954f60a19529cc2fd922d6eec9e1e749ea461/userdata/shm major:0 minor:282 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1cdce851761fa32d1bf31f749f94743d1cc0bca1a250ad08c114ccc4c2f77b9b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1cdce851761fa32d1bf31f749f94743d1cc0bca1a250ad08c114ccc4c2f77b9b/userdata/shm major:0 minor:105 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/21791b9b344da8b052097bc3f6be11ec8238d51625fab3e6901854f679a950ba/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/21791b9b344da8b052097bc3f6be11ec8238d51625fab3e6901854f679a950ba/userdata/shm major:0 minor:609 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/285790bb4eeaea0e1399502a5e31c8d8bf1bd484bccae96128ad9795ef9ca21a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/285790bb4eeaea0e1399502a5e31c8d8bf1bd484bccae96128ad9795ef9ca21a/userdata/shm major:0 minor:835 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3092eb7a16220393b74c3ca8c6aedf7058f62f9313af91e571c5d2e31d050e35/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3092eb7a16220393b74c3ca8c6aedf7058f62f9313af91e571c5d2e31d050e35/userdata/shm major:0 minor:438 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/37ba8dc9671f8fe4b5333f2dc9ab294ad8d004712a5f4bddaf2f6742452a4b3c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/37ba8dc9671f8fe4b5333f2dc9ab294ad8d004712a5f4bddaf2f6742452a4b3c/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3f9c2dbd6bdf8182b597345f8c7fea11c09d5e650fe0f55bf00a3c9f8887aa52/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3f9c2dbd6bdf8182b597345f8c7fea11c09d5e650fe0f55bf00a3c9f8887aa52/userdata/shm major:0 minor:442 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3fd8857a2c2302ef6094cc86660ac7c76904e438bf13dc986cbdd24f1c5f29f3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3fd8857a2c2302ef6094cc86660ac7c76904e438bf13dc986cbdd24f1c5f29f3/userdata/shm major:0 minor:361 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4308310cb66871b8d038228451f6cb347ae9fe6f0a69349b4baf59dc1b20775d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4308310cb66871b8d038228451f6cb347ae9fe6f0a69349b4baf59dc1b20775d/userdata/shm major:0 minor:773 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/46a12190f11c7c4d27d18246d17e718f0fd8c23ea7d374844186aa5295484abf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/46a12190f11c7c4d27d18246d17e718f0fd8c23ea7d374844186aa5295484abf/userdata/shm major:0 minor:284 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/48ea9b1e1ed051eaf5386ce4d24d2d55f57d357f51f1c79f94723fc2aed83c0f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/48ea9b1e1ed051eaf5386ce4d24d2d55f57d357f51f1c79f94723fc2aed83c0f/userdata/shm major:0 minor:622 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/48eeec4a0ddb4a6c0b04997cccaac081db5581507160a9804ec767ba66cbdb34/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/48eeec4a0ddb4a6c0b04997cccaac081db5581507160a9804ec767ba66cbdb34/userdata/shm major:0 minor:449 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/49245723e92395f35c0b36240f7d6fbf94cef777cdd886e69b77ece8cf42ea5f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/49245723e92395f35c0b36240f7d6fbf94cef777cdd886e69b77ece8cf42ea5f/userdata/shm major:0 minor:847 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/49cf279e25f73f2651832183fcd1e966f6a074b295d2f5e34a59a5be738d2377/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/49cf279e25f73f2651832183fcd1e966f6a074b295d2f5e34a59a5be738d2377/userdata/shm major:0 minor:272 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/49f836c59184940ba77efca8e250a75cfca3ff592389502fe02c69990736b85f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/49f836c59184940ba77efca8e250a75cfca3ff592389502fe02c69990736b85f/userdata/shm major:0 minor:776 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/4d1e15f7704367048583514d8253b668db81d2da2b51801dfb366fd0e7170679/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4d1e15f7704367048583514d8253b668db81d2da2b51801dfb366fd0e7170679/userdata/shm major:0 minor:281 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4d57d4740bcb6c82be46e50156208d72579c0c7e7b87226d99fe9e3e9e1ff1bb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4d57d4740bcb6c82be46e50156208d72579c0c7e7b87226d99fe9e3e9e1ff1bb/userdata/shm major:0 minor:619 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/51fe4cded0c2312a6b3bfd1f48ff9089513eaa05b60208bd1ced0f39d8ffe36e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/51fe4cded0c2312a6b3bfd1f48ff9089513eaa05b60208bd1ced0f39d8ffe36e/userdata/shm major:0 minor:265 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5702154693e32d84807189cf18ed2f8ceb28029864edaaaff188dc529b9551c9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5702154693e32d84807189cf18ed2f8ceb28029864edaaaff188dc529b9551c9/userdata/shm major:0 minor:316 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5a9e26d5feffb5d032a36611c5da4c454dba6454147ff1cf841070d4b4feeb67/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5a9e26d5feffb5d032a36611c5da4c454dba6454147ff1cf841070d4b4feeb67/userdata/shm major:0 minor:842 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5d21afac0935094080df835d139dd20efdf51fad3502782c60bb38ee7294f13b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5d21afac0935094080df835d139dd20efdf51fad3502782c60bb38ee7294f13b/userdata/shm major:0 minor:241 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/60bbe3130a4d3341abf02d519702749eab204c658db33e134e22b746126c5516/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/60bbe3130a4d3341abf02d519702749eab204c658db33e134e22b746126c5516/userdata/shm major:0 minor:306 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/630f3ef68fb2ab037a83499120027474c94dfe12bf91c1a5c52579bd6c878cbf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/630f3ef68fb2ab037a83499120027474c94dfe12bf91c1a5c52579bd6c878cbf/userdata/shm major:0 minor:839 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6860ec0c6307c0854099262d2b68eee9cef0172599ec80b28a89c6d016fb4071/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6860ec0c6307c0854099262d2b68eee9cef0172599ec80b28a89c6d016fb4071/userdata/shm major:0 minor:972 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6cc2d27a03b36826decc5cc4343612194df412f00fd1e83d62bd9da95cdaba5c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6cc2d27a03b36826decc5cc4343612194df412f00fd1e83d62bd9da95cdaba5c/userdata/shm major:0 minor:464 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/760e18ae3700547257879a5def4bd8a3e6845f819368981331d76400fe97af9e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/760e18ae3700547257879a5def4bd8a3e6845f819368981331d76400fe97af9e/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/89e373acd0a6c2bca2cd563de1cef238723b46b416d83b3242f9ebb18d644754/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/89e373acd0a6c2bca2cd563de1cef238723b46b416d83b3242f9ebb18d644754/userdata/shm major:0 minor:53 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/89ef9ca9615c3facb086051e9be1597bf7ab0b883ad4dc7992533c4b3196f4e1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/89ef9ca9615c3facb086051e9be1597bf7ab0b883ad4dc7992533c4b3196f4e1/userdata/shm major:0 minor:1092 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8a073909baddafc91ef0c1a8a7d1dde84b7bce6d841c77590b71f4a6754c9117/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8a073909baddafc91ef0c1a8a7d1dde84b7bce6d841c77590b71f4a6754c9117/userdata/shm major:0 minor:810 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8a61c21711f690cdda83fe881555e8ad64b01a2f6d1c312d8da79d83d36082f5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8a61c21711f690cdda83fe881555e8ad64b01a2f6d1c312d8da79d83d36082f5/userdata/shm major:0 minor:610 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/91607e86857f069937668c70d7500f8b27325b07177f03b7a7ab831b3600ac2a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/91607e86857f069937668c70d7500f8b27325b07177f03b7a7ab831b3600ac2a/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/919c09e620d76c77a9425f79c1d3a73bcd978ef6fd46e99de901376458ec1273/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/919c09e620d76c77a9425f79c1d3a73bcd978ef6fd46e99de901376458ec1273/userdata/shm major:0 minor:239 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/93a839a770be652f7d12315bd6dada21638d2838dbf7e7ad327b9d696e2d3e07/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/93a839a770be652f7d12315bd6dada21638d2838dbf7e7ad327b9d696e2d3e07/userdata/shm major:0 minor:848 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/948f733f9e7fc399ff3028ac75f39dbd9ac2f6622b269cc750e23eb9c88dedb1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/948f733f9e7fc399ff3028ac75f39dbd9ac2f6622b269cc750e23eb9c88dedb1/userdata/shm major:0 minor:612 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/972f78b16f9b47d7c2f794578bdc02049da4958a2a754da51e80ca31806a5573/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/972f78b16f9b47d7c2f794578bdc02049da4958a2a754da51e80ca31806a5573/userdata/shm major:0 minor:109 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9786d6c6139bd8ce0f2d0b4a8dd76068f202fd2531fb86bd7b58ce95ed53f64f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9786d6c6139bd8ce0f2d0b4a8dd76068f202fd2531fb86bd7b58ce95ed53f64f/userdata/shm major:0 minor:1161 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/989d132822ac99b97c52492bc7539dcc4d25a3a8fbced6fed73e66c9b3f74f8d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/989d132822ac99b97c52492bc7539dcc4d25a3a8fbced6fed73e66c9b3f74f8d/userdata/shm major:0 minor:548 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9a8426b4146cf2f000b30dafab20a28003c24ae65a16f62f890c575d2f9770d9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9a8426b4146cf2f000b30dafab20a28003c24ae65a16f62f890c575d2f9770d9/userdata/shm major:0 minor:237 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9af8ad1671806bd505a390180b2388970cc534101cf6a5cf64e76e13311c0cb9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9af8ad1671806bd505a390180b2388970cc534101cf6a5cf64e76e13311c0cb9/userdata/shm major:0 minor:268 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/9da40744da0c1f755b7ca8d13405871816427a42b29bf11d678dd70f488e5c6a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9da40744da0c1f755b7ca8d13405871816427a42b29bf11d678dd70f488e5c6a/userdata/shm major:0 minor:97 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a07ae992a49295676f3184ce503f903e0b4447cd57b0d7e0c91d07d9a0f3bc30/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a07ae992a49295676f3184ce503f903e0b4447cd57b0d7e0c91d07d9a0f3bc30/userdata/shm major:0 minor:1088 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a49d9b3d48753f2f078ac0130c6af5ca5c251ec6fd9f68a980d833a929f8987e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a49d9b3d48753f2f078ac0130c6af5ca5c251ec6fd9f68a980d833a929f8987e/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a56a69cfc23cf8add77dfc1a237e33143ff59495f1a2048a86a1759c1954faee/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a56a69cfc23cf8add77dfc1a237e33143ff59495f1a2048a86a1759c1954faee/userdata/shm major:0 minor:845 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ac63789abc1163c5f9db4788ba7a388635d218ac761e65be8043709a0019d115/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ac63789abc1163c5f9db4788ba7a388635d218ac761e65be8043709a0019d115/userdata/shm major:0 minor:616 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b77a1ff885b1de50033d17a32311fabaefba7e39efce419b16febb35e5db2498/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b77a1ff885b1de50033d17a32311fabaefba7e39efce419b16febb35e5db2498/userdata/shm major:0 minor:235 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/b8d362586f6fb451267fc09fb3ebee6bcd03ae4f33abd963a5893c4d677a7fc5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b8d362586f6fb451267fc09fb3ebee6bcd03ae4f33abd963a5893c4d677a7fc5/userdata/shm major:0 minor:447 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bc3668412459475b58df22c5952b6fe210803ae27cac46ab11b8236701860e95/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bc3668412459475b58df22c5952b6fe210803ae27cac46ab11b8236701860e95/userdata/shm major:0 minor:582 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c1a730f8ab11fc1de19f12e0bc73ab4daa97d0d1b64ae152032167057be533ed/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c1a730f8ab11fc1de19f12e0bc73ab4daa97d0d1b64ae152032167057be533ed/userdata/shm major:0 minor:618 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cadfc06b46a2370e89939aa270eb81d7b99f5548dca07c84a3c027dea72e913b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cadfc06b46a2370e89939aa270eb81d7b99f5548dca07c84a3c027dea72e913b/userdata/shm major:0 minor:248 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d1f4c3462eb562d7885b549a3182d1636527f9d646efb4fbbe9ff562004c787d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d1f4c3462eb562d7885b549a3182d1636527f9d646efb4fbbe9ff562004c787d/userdata/shm major:0 minor:615 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d3032285e1cfcfd919da168e10b18ee5ee2720e85e2457d64bfd97de17bf8050/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d3032285e1cfcfd919da168e10b18ee5ee2720e85e2457d64bfd97de17bf8050/userdata/shm major:0 minor:756 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/d48534fe1c98270494577c8d49aed8602c14ccc175395517708a7b89389db471/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d48534fe1c98270494577c8d49aed8602c14ccc175395517708a7b89389db471/userdata/shm major:0 minor:339 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/dc885deb2f8a42b2a9647ca9a7a52b0dd31aa93757cc5e47ba821f5cab2f46b8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dc885deb2f8a42b2a9647ca9a7a52b0dd31aa93757cc5e47ba821f5cab2f46b8/userdata/shm major:0 minor:252 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ddcba86a9171baf183956cb4886e6b2242be15b32412346569edbcdcef731dfb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ddcba86a9171baf183956cb4886e6b2242be15b32412346569edbcdcef731dfb/userdata/shm major:0 minor:1090 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/df1df4af888713c77332d729a24c1e1fdb472ce369b8165f8ad6dfbe7c60bbd6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/df1df4af888713c77332d729a24c1e1fdb472ce369b8165f8ad6dfbe7c60bbd6/userdata/shm major:0 minor:305 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/df43cdf08fb65d06b9db1ef59770a000ce2240e1f2e40f1d6a1619ba0e8b03df/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/df43cdf08fb65d06b9db1ef59770a000ce2240e1f2e40f1d6a1619ba0e8b03df/userdata/shm major:0 minor:258 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e78425b2d05a4bc50aa45d6948d8f1a6996398b351024eb248c91be99ea577d1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e78425b2d05a4bc50aa45d6948d8f1a6996398b351024eb248c91be99ea577d1/userdata/shm major:0 minor:561 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/e9d1c009ab8bfc1b5d9e5ffc8e765a713719b93c5edffb019bac7a72776dcb9e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e9d1c009ab8bfc1b5d9e5ffc8e765a713719b93c5edffb019bac7a72776dcb9e/userdata/shm major:0 minor:1047 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e9d5f349b622bea576ae3dd04cdf2c2da1c82af6b9e42a0b5011a9e0e2cc47e6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e9d5f349b622bea576ae3dd04cdf2c2da1c82af6b9e42a0b5011a9e0e2cc47e6/userdata/shm major:0 minor:882 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ead213f06d6e13b0b8afce02cff25edfe82c583b53f661ee9bdc498f394f53a9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ead213f06d6e13b0b8afce02cff25edfe82c583b53f661ee9bdc498f394f53a9/userdata/shm major:0 minor:164 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ec3f7a57e8d7aa7239f51fc0b75ccf091bb42e503457a1919c637dd65b9da53e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ec3f7a57e8d7aa7239f51fc0b75ccf091bb42e503457a1919c637dd65b9da53e/userdata/shm major:0 minor:293 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/edc62dc83d0212adeb196aa9fb63d28b17a6054a019750eef25f143d8b2816f1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/edc62dc83d0212adeb196aa9fb63d28b17a6054a019750eef25f143d8b2816f1/userdata/shm major:0 minor:617 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ef2664ab7688b2c180ba8801467801f25c2ef381f1cb268907fa44549876a809/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ef2664ab7688b2c180ba8801467801f25c2ef381f1cb268907fa44549876a809/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/ef4e9117db9997ed5777e4eb0ac97e214bfa4a8d0f2ec4d63af464bd290bf782/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ef4e9117db9997ed5777e4eb0ac97e214bfa4a8d0f2ec4d63af464bd290bf782/userdata/shm major:0 minor:784 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f63327bd315d50a90d1b9877abdb96105e01d6fdb0e17116424c13ecbe62dbc3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f63327bd315d50a90d1b9877abdb96105e01d6fdb0e17116424c13ecbe62dbc3/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f826050a5c784de5b331265098c5bf62987ce32e6de55cf05cab0f3f76894ea8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f826050a5c784de5b331265098c5bf62987ce32e6de55cf05cab0f3f76894ea8/userdata/shm major:0 minor:262 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f8c21c05090492f9afafa02ead2a469af0d1260ed484823064a0610864bf15d8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f8c21c05090492f9afafa02ead2a469af0d1260ed484823064a0610864bf15d8/userdata/shm major:0 minor:431 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fb1900c9365a08f03e6fa39a54bd69429d8cedb421e2b0d1e4f977c7a6faf417/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fb1900c9365a08f03e6fa39a54bd69429d8cedb421e2b0d1e4f977c7a6faf417/userdata/shm major:0 minor:89 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fd67ef0d5263721af2c9563f40007bfa8118e9b8e56c96c99f48f239bd0b7044/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fd67ef0d5263721af2c9563f40007bfa8118e9b8e56c96c99f48f239bd0b7044/userdata/shm major:0 minor:416 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/fee33178d398a85728734b8702eecb787d89c780d680fd9fa904a7591c14e420/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fee33178d398a85728734b8702eecb787d89c780d680fd9fa904a7591c14e420/userdata/shm major:0 minor:344 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/12e1d9e5-96b5-4367-81a5-d87b3f93d8da/volumes/kubernetes.io~projected/kube-api-access-g2qf7:{mountpoint:/var/lib/kubelet/pods/12e1d9e5-96b5-4367-81a5-d87b3f93d8da/volumes/kubernetes.io~projected/kube-api-access-g2qf7 major:0 minor:544 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/12e1d9e5-96b5-4367-81a5-d87b3f93d8da/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/12e1d9e5-96b5-4367-81a5-d87b3f93d8da/volumes/kubernetes.io~secret/metrics-tls major:0 minor:539 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1375da42-ecaf-4d86-b554-25fd1c3d00bd/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/1375da42-ecaf-4d86-b554-25fd1c3d00bd/volumes/kubernetes.io~projected/kube-api-access major:0 minor:755 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1375da42-ecaf-4d86-b554-25fd1c3d00bd/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/1375da42-ecaf-4d86-b554-25fd1c3d00bd/volumes/kubernetes.io~secret/serving-cert major:0 minor:728 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1db4d695-5a6a-4fbe-b610-3777bfebed79/volumes/kubernetes.io~projected/kube-api-access-whmmk:{mountpoint:/var/lib/kubelet/pods/1db4d695-5a6a-4fbe-b610-3777bfebed79/volumes/kubernetes.io~projected/kube-api-access-whmmk major:0 minor:1076 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1db4d695-5a6a-4fbe-b610-3777bfebed79/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/1db4d695-5a6a-4fbe-b610-3777bfebed79/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config 
major:0 minor:1075 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1db4d695-5a6a-4fbe-b610-3777bfebed79/volumes/kubernetes.io~secret/openshift-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/1db4d695-5a6a-4fbe-b610-3777bfebed79/volumes/kubernetes.io~secret/openshift-state-metrics-tls major:0 minor:1074 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/21bebade-17fa-444e-92a9-eea53d6cd673/volumes/kubernetes.io~projected/kube-api-access-zsht7:{mountpoint:/var/lib/kubelet/pods/21bebade-17fa-444e-92a9-eea53d6cd673/volumes/kubernetes.io~projected/kube-api-access-zsht7 major:0 minor:910 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/21bebade-17fa-444e-92a9-eea53d6cd673/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/21bebade-17fa-444e-92a9-eea53d6cd673/volumes/kubernetes.io~secret/cert major:0 minor:909 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volumes/kubernetes.io~projected/kube-api-access-mtc2p:{mountpoint:/var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volumes/kubernetes.io~projected/kube-api-access-mtc2p major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/volumes/kubernetes.io~projected/kube-api-access-s69rd:{mountpoint:/var/lib/kubelet/pods/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/volumes/kubernetes.io~projected/kube-api-access-s69rd major:0 minor:231 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/volumes/kubernetes.io~secret/serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2bf90db0-f943-464c-8599-e36b4fc32e1c/volumes/kubernetes.io~projected/kube-api-access-qns9g:{mountpoint:/var/lib/kubelet/pods/2bf90db0-f943-464c-8599-e36b4fc32e1c/volumes/kubernetes.io~projected/kube-api-access-qns9g major:0 minor:360 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2f844652-225b-4713-a9ad-cf9bcc348f47/volumes/kubernetes.io~projected/kube-api-access-jdwvw:{mountpoint:/var/lib/kubelet/pods/2f844652-225b-4713-a9ad-cf9bcc348f47/volumes/kubernetes.io~projected/kube-api-access-jdwvw major:0 minor:387 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2f844652-225b-4713-a9ad-cf9bcc348f47/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/2f844652-225b-4713-a9ad-cf9bcc348f47/volumes/kubernetes.io~secret/signing-key major:0 minor:386 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/325f0a83-d56d-4b62-977b-088a7d5f0e00/volumes/kubernetes.io~projected/kube-api-access-lqdlf:{mountpoint:/var/lib/kubelet/pods/325f0a83-d56d-4b62-977b-088a7d5f0e00/volumes/kubernetes.io~projected/kube-api-access-lqdlf major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/325f0a83-d56d-4b62-977b-088a7d5f0e00/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/325f0a83-d56d-4b62-977b-088a7d5f0e00/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3de37144-a9ab-45fb-a23f-2287a5198edf/volumes/kubernetes.io~projected/kube-api-access-zr8br:{mountpoint:/var/lib/kubelet/pods/3de37144-a9ab-45fb-a23f-2287a5198edf/volumes/kubernetes.io~projected/kube-api-access-zr8br major:0 minor:572 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/3de37144-a9ab-45fb-a23f-2287a5198edf/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/3de37144-a9ab-45fb-a23f-2287a5198edf/volumes/kubernetes.io~secret/encryption-config major:0 minor:567 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3de37144-a9ab-45fb-a23f-2287a5198edf/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/3de37144-a9ab-45fb-a23f-2287a5198edf/volumes/kubernetes.io~secret/etcd-client major:0 minor:573 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3de37144-a9ab-45fb-a23f-2287a5198edf/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3de37144-a9ab-45fb-a23f-2287a5198edf/volumes/kubernetes.io~secret/serving-cert major:0 minor:571 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f471ecc-922c-4cb1-9bdd-fdb5da08c592/volumes/kubernetes.io~projected/kube-api-access-224dg:{mountpoint:/var/lib/kubelet/pods/3f471ecc-922c-4cb1-9bdd-fdb5da08c592/volumes/kubernetes.io~projected/kube-api-access-224dg major:0 minor:276 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f471ecc-922c-4cb1-9bdd-fdb5da08c592/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/3f471ecc-922c-4cb1-9bdd-fdb5da08c592/volumes/kubernetes.io~secret/metrics-tls major:0 minor:435 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97/volumes/kubernetes.io~projected/kube-api-access-wrws5:{mountpoint:/var/lib/kubelet/pods/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97/volumes/kubernetes.io~projected/kube-api-access-wrws5 major:0 minor:271 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97/volumes/kubernetes.io~secret/proxy-tls major:0 minor:600 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/45e8b72b-564c-4bb1-b911-baff2d6c87ad/volumes/kubernetes.io~projected/kube-api-access-9zpkn:{mountpoint:/var/lib/kubelet/pods/45e8b72b-564c-4bb1-b911-baff2d6c87ad/volumes/kubernetes.io~projected/kube-api-access-9zpkn major:0 minor:261 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/45e8b72b-564c-4bb1-b911-baff2d6c87ad/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/45e8b72b-564c-4bb1-b911-baff2d6c87ad/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:601 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/469183dd-dc54-467d-82a1-611132ae8ec4/volumes/kubernetes.io~projected/kube-api-access-8dtbl:{mountpoint:/var/lib/kubelet/pods/469183dd-dc54-467d-82a1-611132ae8ec4/volumes/kubernetes.io~projected/kube-api-access-8dtbl major:0 minor:892 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/469183dd-dc54-467d-82a1-611132ae8ec4/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/469183dd-dc54-467d-82a1-611132ae8ec4/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:891 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46de2acc-9f5d-4ecf-befe-a480f86466f5/volumes/kubernetes.io~projected/kube-api-access-4f9vt:{mountpoint:/var/lib/kubelet/pods/46de2acc-9f5d-4ecf-befe-a480f86466f5/volumes/kubernetes.io~projected/kube-api-access-4f9vt major:0 minor:578 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46de2acc-9f5d-4ecf-befe-a480f86466f5/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/46de2acc-9f5d-4ecf-befe-a480f86466f5/volumes/kubernetes.io~secret/encryption-config major:0 minor:577 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46de2acc-9f5d-4ecf-befe-a480f86466f5/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/46de2acc-9f5d-4ecf-befe-a480f86466f5/volumes/kubernetes.io~secret/etcd-client major:0 minor:574 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/46de2acc-9f5d-4ecf-befe-a480f86466f5/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/46de2acc-9f5d-4ecf-befe-a480f86466f5/volumes/kubernetes.io~secret/serving-cert major:0 minor:575 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/47eadda0-35a6-4b5c-a96c-24854be15098/volumes/kubernetes.io~secret/tls-certificates:{mountpoint:/var/lib/kubelet/pods/47eadda0-35a6-4b5c-a96c-24854be15098/volumes/kubernetes.io~secret/tls-certificates major:0 minor:837 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4ddac301-a604-4f07-8849-5928befd336e/volumes/kubernetes.io~projected/kube-api-access-27j9q:{mountpoint:/var/lib/kubelet/pods/4ddac301-a604-4f07-8849-5928befd336e/volumes/kubernetes.io~projected/kube-api-access-27j9q major:0 minor:880 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4ddac301-a604-4f07-8849-5928befd336e/volumes/kubernetes.io~secret/certs:{mountpoint:/var/lib/kubelet/pods/4ddac301-a604-4f07-8849-5928befd336e/volumes/kubernetes.io~secret/certs major:0 minor:879 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4ddac301-a604-4f07-8849-5928befd336e/volumes/kubernetes.io~secret/node-bootstrap-token:{mountpoint:/var/lib/kubelet/pods/4ddac301-a604-4f07-8849-5928befd336e/volumes/kubernetes.io~secret/node-bootstrap-token major:0 minor:878 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4fea9b05-222e-4b58-95c8-735fc1cf3a8b/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/4fea9b05-222e-4b58-95c8-735fc1cf3a8b/volumes/kubernetes.io~projected/ca-certs major:0 minor:467 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4fea9b05-222e-4b58-95c8-735fc1cf3a8b/volumes/kubernetes.io~projected/kube-api-access-qstvb:{mountpoint:/var/lib/kubelet/pods/4fea9b05-222e-4b58-95c8-735fc1cf3a8b/volumes/kubernetes.io~projected/kube-api-access-qstvb major:0 minor:514 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/4fea9b05-222e-4b58-95c8-735fc1cf3a8b/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/4fea9b05-222e-4b58-95c8-735fc1cf3a8b/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:466 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5e3ddf9e-eeb5-4266-b675-092fd4e27623/volumes/kubernetes.io~projected/kube-api-access-xd6vv:{mountpoint:/var/lib/kubelet/pods/5e3ddf9e-eeb5-4266-b675-092fd4e27623/volumes/kubernetes.io~projected/kube-api-access-xd6vv major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5e3ddf9e-eeb5-4266-b675-092fd4e27623/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/5e3ddf9e-eeb5-4266-b675-092fd4e27623/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7/volumes/kubernetes.io~projected/kube-api-access-fxg4z:{mountpoint:/var/lib/kubelet/pods/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7/volumes/kubernetes.io~projected/kube-api-access-fxg4z major:0 minor:103 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/654b5b1c-2764-415c-bb13-aa06899f4076/volumes/kubernetes.io~projected/kube-api-access-xcp8t:{mountpoint:/var/lib/kubelet/pods/654b5b1c-2764-415c-bb13-aa06899f4076/volumes/kubernetes.io~projected/kube-api-access-xcp8t major:0 minor:937 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/68252533-bd64-4fc5-838a-cc350cbe77f0/volumes/kubernetes.io~projected/kube-api-access-75r7k:{mountpoint:/var/lib/kubelet/pods/68252533-bd64-4fc5-838a-cc350cbe77f0/volumes/kubernetes.io~projected/kube-api-access-75r7k major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/68252533-bd64-4fc5-838a-cc350cbe77f0/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/68252533-bd64-4fc5-838a-cc350cbe77f0/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/6a80bd6f-2263-4251-8197-5173193f8afc/volumes/kubernetes.io~projected/kube-api-access-tqd2v:{mountpoint:/var/lib/kubelet/pods/6a80bd6f-2263-4251-8197-5173193f8afc/volumes/kubernetes.io~projected/kube-api-access-tqd2v major:0 minor:256 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6a80bd6f-2263-4251-8197-5173193f8afc/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/6a80bd6f-2263-4251-8197-5173193f8afc/volumes/kubernetes.io~secret/webhook-certs major:0 minor:608 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/70020125-af49-47d7-8853-fb951c561dc4/volumes/kubernetes.io~projected/kube-api-access-9s2tb:{mountpoint:/var/lib/kubelet/pods/70020125-af49-47d7-8853-fb951c561dc4/volumes/kubernetes.io~projected/kube-api-access-9s2tb major:0 minor:556 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/75e3e2cc-aa56-41f3-8859-1c086f419d05/volumes/kubernetes.io~projected/kube-api-access-clz6w:{mountpoint:/var/lib/kubelet/pods/75e3e2cc-aa56-41f3-8859-1c086f419d05/volumes/kubernetes.io~projected/kube-api-access-clz6w major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/75e3e2cc-aa56-41f3-8859-1c086f419d05/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/75e3e2cc-aa56-41f3-8859-1c086f419d05/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7b489385-2c96-4a97-8b31-362162de020e/volumes/kubernetes.io~projected/kube-api-access-pg6th:{mountpoint:/var/lib/kubelet/pods/7b489385-2c96-4a97-8b31-362162de020e/volumes/kubernetes.io~projected/kube-api-access-pg6th major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7b489385-2c96-4a97-8b31-362162de020e/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/7b489385-2c96-4a97-8b31-362162de020e/volumes/kubernetes.io~secret/srv-cert major:0 minor:605 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7c4e7e57-43be-4d31-b523-f7e4d316dce3/volumes/kubernetes.io~projected/kube-api-access-bvjkr:{mountpoint:/var/lib/kubelet/pods/7c4e7e57-43be-4d31-b523-f7e4d316dce3/volumes/kubernetes.io~projected/kube-api-access-bvjkr major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7e451189-850e-4d19-a40c-40f642d08511/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/7e451189-850e-4d19-a40c-40f642d08511/volumes/kubernetes.io~projected/ca-certs major:0 minor:511 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7e451189-850e-4d19-a40c-40f642d08511/volumes/kubernetes.io~projected/kube-api-access-snmpq:{mountpoint:/var/lib/kubelet/pods/7e451189-850e-4d19-a40c-40f642d08511/volumes/kubernetes.io~projected/kube-api-access-snmpq major:0 minor:407 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/813f91c2-2b37-4681-968d-4217e286e22f/volumes/kubernetes.io~projected/kube-api-access-njjkt:{mountpoint:/var/lib/kubelet/pods/813f91c2-2b37-4681-968d-4217e286e22f/volumes/kubernetes.io~projected/kube-api-access-njjkt major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/813f91c2-2b37-4681-968d-4217e286e22f/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/813f91c2-2b37-4681-968d-4217e286e22f/volumes/kubernetes.io~secret/metrics-certs major:0 minor:596 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/86cb5d23-df7f-4f67-8086-1789d8e68544/volumes/kubernetes.io~projected/kube-api-access-j7k8m:{mountpoint:/var/lib/kubelet/pods/86cb5d23-df7f-4f67-8086-1789d8e68544/volumes/kubernetes.io~projected/kube-api-access-j7k8m major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/86cb5d23-df7f-4f67-8086-1789d8e68544/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/86cb5d23-df7f-4f67-8086-1789d8e68544/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/91b2899e-8d24-41a0-bec8-d11c67b8f955/volumes/kubernetes.io~projected/kube-api-access-hqtvp:{mountpoint:/var/lib/kubelet/pods/91b2899e-8d24-41a0-bec8-d11c67b8f955/volumes/kubernetes.io~projected/kube-api-access-hqtvp major:0 minor:834 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/91b2899e-8d24-41a0-bec8-d11c67b8f955/volumes/kubernetes.io~secret/default-certificate:{mountpoint:/var/lib/kubelet/pods/91b2899e-8d24-41a0-bec8-d11c67b8f955/volumes/kubernetes.io~secret/default-certificate major:0 minor:832 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/91b2899e-8d24-41a0-bec8-d11c67b8f955/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/91b2899e-8d24-41a0-bec8-d11c67b8f955/volumes/kubernetes.io~secret/metrics-certs major:0 minor:833 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/91b2899e-8d24-41a0-bec8-d11c67b8f955/volumes/kubernetes.io~secret/stats-auth:{mountpoint:/var/lib/kubelet/pods/91b2899e-8d24-41a0-bec8-d11c67b8f955/volumes/kubernetes.io~secret/stats-auth major:0 minor:831 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/96de6024-e20f-4b52-9294-b330d65e4153/volumes/kubernetes.io~projected/kube-api-access-z8bxz:{mountpoint:/var/lib/kubelet/pods/96de6024-e20f-4b52-9294-b330d65e4153/volumes/kubernetes.io~projected/kube-api-access-z8bxz major:0 minor:367 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a18b9230-de78-41b8-a61e-361b8bb1fbb3/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/a18b9230-de78-41b8-a61e-361b8bb1fbb3/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:516 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a18b9230-de78-41b8-a61e-361b8bb1fbb3/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/a18b9230-de78-41b8-a61e-361b8bb1fbb3/volumes/kubernetes.io~empty-dir/tmp major:0 minor:515 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a18b9230-de78-41b8-a61e-361b8bb1fbb3/volumes/kubernetes.io~projected/kube-api-access-8crkc:{mountpoint:/var/lib/kubelet/pods/a18b9230-de78-41b8-a61e-361b8bb1fbb3/volumes/kubernetes.io~projected/kube-api-access-8crkc major:0 minor:518 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a25248c0-8de7-4624-b785-f053665fcb23/volumes/kubernetes.io~projected/kube-api-access-prcgg:{mountpoint:/var/lib/kubelet/pods/a25248c0-8de7-4624-b785-f053665fcb23/volumes/kubernetes.io~projected/kube-api-access-prcgg major:0 minor:1087 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a25248c0-8de7-4624-b785-f053665fcb23/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/a25248c0-8de7-4624-b785-f053665fcb23/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config major:0 minor:1080 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a25248c0-8de7-4624-b785-f053665fcb23/volumes/kubernetes.io~secret/kube-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/a25248c0-8de7-4624-b785-f053665fcb23/volumes/kubernetes.io~secret/kube-state-metrics-tls major:0 minor:1081 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a57854ac-809a-4745-aaa1-774f0a08a560/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/a57854ac-809a-4745-aaa1-774f0a08a560/volumes/kubernetes.io~projected/kube-api-access major:0 minor:267 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a57854ac-809a-4745-aaa1-774f0a08a560/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a57854ac-809a-4745-aaa1-774f0a08a560/volumes/kubernetes.io~secret/serving-cert major:0 minor:245 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a638c468-010c-4da3-ad62-26f5f2bbdbb9/volumes/kubernetes.io~projected/kube-api-access-sdbds:{mountpoint:/var/lib/kubelet/pods/a638c468-010c-4da3-ad62-26f5f2bbdbb9/volumes/kubernetes.io~projected/kube-api-access-sdbds major:0 minor:1035 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a638c468-010c-4da3-ad62-26f5f2bbdbb9/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a638c468-010c-4da3-ad62-26f5f2bbdbb9/volumes/kubernetes.io~secret/serving-cert major:0 minor:1034 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a69e8d3a-a0b1-4688-8631-d9f265aa4c69/volumes/kubernetes.io~projected/kube-api-access-c6p8v:{mountpoint:/var/lib/kubelet/pods/a69e8d3a-a0b1-4688-8631-d9f265aa4c69/volumes/kubernetes.io~projected/kube-api-access-c6p8v major:0 minor:1160 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a69e8d3a-a0b1-4688-8631-d9f265aa4c69/volumes/kubernetes.io~secret/client-ca-bundle:{mountpoint:/var/lib/kubelet/pods/a69e8d3a-a0b1-4688-8631-d9f265aa4c69/volumes/kubernetes.io~secret/client-ca-bundle major:0 minor:1158 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a69e8d3a-a0b1-4688-8631-d9f265aa4c69/volumes/kubernetes.io~secret/secret-metrics-client-certs:{mountpoint:/var/lib/kubelet/pods/a69e8d3a-a0b1-4688-8631-d9f265aa4c69/volumes/kubernetes.io~secret/secret-metrics-client-certs major:0 minor:1153 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a69e8d3a-a0b1-4688-8631-d9f265aa4c69/volumes/kubernetes.io~secret/secret-metrics-server-tls:{mountpoint:/var/lib/kubelet/pods/a69e8d3a-a0b1-4688-8631-d9f265aa4c69/volumes/kubernetes.io~secret/secret-metrics-server-tls major:0 minor:1159 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9a9ecf2-c476-4962-8333-21f242dbcb89/volumes/kubernetes.io~projected/kube-api-access-d5v9f:{mountpoint:/var/lib/kubelet/pods/a9a9ecf2-c476-4962-8333-21f242dbcb89/volumes/kubernetes.io~projected/kube-api-access-d5v9f major:0 minor:1046 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9a9ecf2-c476-4962-8333-21f242dbcb89/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a9a9ecf2-c476-4962-8333-21f242dbcb89/volumes/kubernetes.io~secret/serving-cert major:0 minor:1033 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/aa16c3bf-2350-46d1-afa0-9477b3ec8877/volumes/kubernetes.io~projected/kube-api-access-qmndg:{mountpoint:/var/lib/kubelet/pods/aa16c3bf-2350-46d1-afa0-9477b3ec8877/volumes/kubernetes.io~projected/kube-api-access-qmndg major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aa16c3bf-2350-46d1-afa0-9477b3ec8877/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/aa16c3bf-2350-46d1-afa0-9477b3ec8877/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ab175f7e-a5e8-4fda-98c9-6d052a221a83/volumes/kubernetes.io~projected/kube-api-access-zmrxh:{mountpoint:/var/lib/kubelet/pods/ab175f7e-a5e8-4fda-98c9-6d052a221a83/volumes/kubernetes.io~projected/kube-api-access-zmrxh major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ab175f7e-a5e8-4fda-98c9-6d052a221a83/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ab175f7e-a5e8-4fda-98c9-6d052a221a83/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/acb704a9-6c8d-4378-ae93-e7095b1fce85/volumes/kubernetes.io~projected/kube-api-access-xvx6w:{mountpoint:/var/lib/kubelet/pods/acb704a9-6c8d-4378-ae93-e7095b1fce85/volumes/kubernetes.io~projected/kube-api-access-xvx6w major:0 minor:260 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/acb704a9-6c8d-4378-ae93-e7095b1fce85/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/acb704a9-6c8d-4378-ae93-e7095b1fce85/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:604 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ad692349-5089-4afc-85b2-9b6e7997567c/volumes/kubernetes.io~projected/kube-api-access-h6mb5:{mountpoint:/var/lib/kubelet/pods/ad692349-5089-4afc-85b2-9b6e7997567c/volumes/kubernetes.io~projected/kube-api-access-h6mb5 major:0 minor:102 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ad692349-5089-4afc-85b2-9b6e7997567c/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/ad692349-5089-4afc-85b2-9b6e7997567c/volumes/kubernetes.io~secret/metrics-tls major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ae39c09b-7aef-4615-8ced-0dcad39f23a5/volumes/kubernetes.io~projected/kube-api-access-rxn2f:{mountpoint:/var/lib/kubelet/pods/ae39c09b-7aef-4615-8ced-0dcad39f23a5/volumes/kubernetes.io~projected/kube-api-access-rxn2f major:0 minor:824 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ae39c09b-7aef-4615-8ced-0dcad39f23a5/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/ae39c09b-7aef-4615-8ced-0dcad39f23a5/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:803 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~projected/kube-api-access-9nvl4:{mountpoint:/var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~projected/kube-api-access-9nvl4 major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:437 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b543f82e-683d-47c1-af73-4dcede4cf4df/volumes/kubernetes.io~projected/kube-api-access-4c2rq:{mountpoint:/var/lib/kubelet/pods/b543f82e-683d-47c1-af73-4dcede4cf4df/volumes/kubernetes.io~projected/kube-api-access-4c2rq major:0 minor:830 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b543f82e-683d-47c1-af73-4dcede4cf4df/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/b543f82e-683d-47c1-af73-4dcede4cf4df/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:825 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b543f82e-683d-47c1-af73-4dcede4cf4df/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/b543f82e-683d-47c1-af73-4dcede4cf4df/volumes/kubernetes.io~secret/webhook-cert major:0 minor:829 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b639e578-628e-404d-b759-8b6e84e771d9/volumes/kubernetes.io~projected/kube-api-access-9zp8f:{mountpoint:/var/lib/kubelet/pods/b639e578-628e-404d-b759-8b6e84e771d9/volumes/kubernetes.io~projected/kube-api-access-9zp8f major:0 minor:318 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b98b4efc-6117-487f-9cfc-38ce66dd9570/volumes/kubernetes.io~projected/kube-api-access-6c5ch:{mountpoint:/var/lib/kubelet/pods/b98b4efc-6117-487f-9cfc-38ce66dd9570/volumes/kubernetes.io~projected/kube-api-access-6c5ch major:0 minor:117 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bbc0b783-28d5-4554-b49d-c66082546f44/volumes/kubernetes.io~projected/kube-api-access-27qvw:{mountpoint:/var/lib/kubelet/pods/bbc0b783-28d5-4554-b49d-c66082546f44/volumes/kubernetes.io~projected/kube-api-access-27qvw major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bbc0b783-28d5-4554-b49d-c66082546f44/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/bbc0b783-28d5-4554-b49d-c66082546f44/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:606 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c0142d4e-9fd4-4375-a773-bb89b38af654/volumes/kubernetes.io~projected/kube-api-access-w2zzd:{mountpoint:/var/lib/kubelet/pods/c0142d4e-9fd4-4375-a773-bb89b38af654/volumes/kubernetes.io~projected/kube-api-access-w2zzd major:0 minor:315 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c0a17669-a122-44aa-bdda-581bf1fc4649/volumes/kubernetes.io~projected/kube-api-access-xf485:{mountpoint:/var/lib/kubelet/pods/c0a17669-a122-44aa-bdda-581bf1fc4649/volumes/kubernetes.io~projected/kube-api-access-xf485 major:0 minor:356 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c1854ea4-c8e2-4289-84b6-1f18b2ac684f/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/c1854ea4-c8e2-4289-84b6-1f18b2ac684f/volumes/kubernetes.io~projected/kube-api-access major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c1854ea4-c8e2-4289-84b6-1f18b2ac684f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c1854ea4-c8e2-4289-84b6-1f18b2ac684f/volumes/kubernetes.io~secret/serving-cert major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c2a23d24-9e09-431e-8c3b-8456ff51a8d0/volumes/kubernetes.io~projected/kube-api-access-btdjt:{mountpoint:/var/lib/kubelet/pods/c2a23d24-9e09-431e-8c3b-8456ff51a8d0/volumes/kubernetes.io~projected/kube-api-access-btdjt major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c2a23d24-9e09-431e-8c3b-8456ff51a8d0/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c2a23d24-9e09-431e-8c3b-8456ff51a8d0/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c593e31d-82b5-4d42-992e-6b050ccf3019/volumes/kubernetes.io~projected/kube-api-access-gxmkh:{mountpoint:/var/lib/kubelet/pods/c593e31d-82b5-4d42-992e-6b050ccf3019/volumes/kubernetes.io~projected/kube-api-access-gxmkh major:0 minor:80 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c7f5e6cd-e093-409a-8758-d3db7a7eb32c/volumes/kubernetes.io~projected/kube-api-access-plc2q:{mountpoint:/var/lib/kubelet/pods/c7f5e6cd-e093-409a-8758-d3db7a7eb32c/volumes/kubernetes.io~projected/kube-api-access-plc2q major:0 minor:355 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c7f5e6cd-e093-409a-8758-d3db7a7eb32c/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/c7f5e6cd-e093-409a-8758-d3db7a7eb32c/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:354 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d88ba8e1-ee42-423f-9839-e71cb0265c6c/volumes/kubernetes.io~projected/kube-api-access-lw4np:{mountpoint:/var/lib/kubelet/pods/d88ba8e1-ee42-423f-9839-e71cb0265c6c/volumes/kubernetes.io~projected/kube-api-access-lw4np major:0 minor:749 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d88ba8e1-ee42-423f-9839-e71cb0265c6c/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/d88ba8e1-ee42-423f-9839-e71cb0265c6c/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:688 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/de6078d7-2aad-46fe-b17a-b6b38e4eaa41/volumes/kubernetes.io~projected/kube-api-access-66kz7:{mountpoint:/var/lib/kubelet/pods/de6078d7-2aad-46fe-b17a-b6b38e4eaa41/volumes/kubernetes.io~projected/kube-api-access-66kz7 major:0 minor:805 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/de6078d7-2aad-46fe-b17a-b6b38e4eaa41/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/de6078d7-2aad-46fe-b17a-b6b38e4eaa41/volumes/kubernetes.io~secret/proxy-tls major:0 minor:802 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/df428d5a-c722-4536-8e7f-cdd85c560481/volumes/kubernetes.io~projected/kube-api-access-dtcnq:{mountpoint:/var/lib/kubelet/pods/df428d5a-c722-4536-8e7f-cdd85c560481/volumes/kubernetes.io~projected/kube-api-access-dtcnq major:0 minor:255 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/df428d5a-c722-4536-8e7f-cdd85c560481/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/df428d5a-c722-4536-8e7f-cdd85c560481/volumes/kubernetes.io~secret/srv-cert major:0 minor:607 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466/volumes/kubernetes.io~projected/kube-api-access-gkccn:{mountpoint:/var/lib/kubelet/pods/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466/volumes/kubernetes.io~projected/kube-api-access-gkccn major:0 minor:901 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:900 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97/volumes/kubernetes.io~projected/kube-api-access-6wl7f:{mountpoint:/var/lib/kubelet/pods/e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97/volumes/kubernetes.io~projected/kube-api-access-6wl7f major:0 minor:838 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e/volumes/kubernetes.io~projected/kube-api-access-htv9s:{mountpoint:/var/lib/kubelet/pods/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e/volumes/kubernetes.io~projected/kube-api-access-htv9s major:0 minor:908 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:907 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:257 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~projected/kube-api-access-8d57k:{mountpoint:/var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~projected/kube-api-access-8d57k major:0 minor:254 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~secret/metrics-tls major:0 minor:436 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/volumes/kubernetes.io~projected/kube-api-access-lpdk5:{mountpoint:/var/lib/kubelet/pods/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/volumes/kubernetes.io~projected/kube-api-access-lpdk5 major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:425 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:434 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ee3cc021-67d8-4b7f-b443-16f18228712e/volumes/kubernetes.io~projected/kube-api-access-7l4b7:{mountpoint:/var/lib/kubelet/pods/ee3cc021-67d8-4b7f-b443-16f18228712e/volumes/kubernetes.io~projected/kube-api-access-7l4b7 major:0 minor:263 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~projected/kube-api-access-xhkh7:{mountpoint:/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~projected/kube-api-access-xhkh7 major:0 minor:270 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~secret/etcd-client major:0 minor:247 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~secret/serving-cert major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f1468ec0-2aa4-461c-a62f-e9f067be490f/volumes/kubernetes.io~projected/kube-api-access-bqmv5:{mountpoint:/var/lib/kubelet/pods/f1468ec0-2aa4-461c-a62f-e9f067be490f/volumes/kubernetes.io~projected/kube-api-access-bqmv5 major:0 minor:1086 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f1468ec0-2aa4-461c-a62f-e9f067be490f/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/f1468ec0-2aa4-461c-a62f-e9f067be490f/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config major:0 minor:1077 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f1468ec0-2aa4-461c-a62f-e9f067be490f/volumes/kubernetes.io~secret/node-exporter-tls:{mountpoint:/var/lib/kubelet/pods/f1468ec0-2aa4-461c-a62f-e9f067be490f/volumes/kubernetes.io~secret/node-exporter-tls major:0 minor:1078 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f2217de0-7805-4f5f-8ea5-93b81b7e0236/volumes/kubernetes.io~projected/kube-api-access-82x7p:{mountpoint:/var/lib/kubelet/pods/f2217de0-7805-4f5f-8ea5-93b81b7e0236/volumes/kubernetes.io~projected/kube-api-access-82x7p major:0 minor:912 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f2217de0-7805-4f5f-8ea5-93b81b7e0236/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f2217de0-7805-4f5f-8ea5-93b81b7e0236/volumes/kubernetes.io~secret/serving-cert major:0 minor:911 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f53bc282-5937-49ac-ac98-2ee37ccb268d/volumes/kubernetes.io~projected/kube-api-access-b2d4r:{mountpoint:/var/lib/kubelet/pods/f53bc282-5937-49ac-ac98-2ee37ccb268d/volumes/kubernetes.io~projected/kube-api-access-b2d4r major:0 minor:233 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f53bc282-5937-49ac-ac98-2ee37ccb268d/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/f53bc282-5937-49ac-ac98-2ee37ccb268d/volumes/kubernetes.io~secret/cert major:0 minor:603 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f53bc282-5937-49ac-ac98-2ee37ccb268d/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/f53bc282-5937-49ac-ac98-2ee37ccb268d/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:602 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f5782718-9118-4682-a287-7998cd0304b3/volumes/kubernetes.io~projected/kube-api-access-bbvtp:{mountpoint:/var/lib/kubelet/pods/f5782718-9118-4682-a287-7998cd0304b3/volumes/kubernetes.io~projected/kube-api-access-bbvtp major:0 minor:657 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f5782718-9118-4682-a287-7998cd0304b3/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/f5782718-9118-4682-a287-7998cd0304b3/volumes/kubernetes.io~secret/proxy-tls major:0 minor:580 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f91d1788-027d-432b-be33-ca952a95046a/volumes/kubernetes.io~projected/kube-api-access-lnm6c:{mountpoint:/var/lib/kubelet/pods/f91d1788-027d-432b-be33-ca952a95046a/volumes/kubernetes.io~projected/kube-api-access-lnm6c major:0 minor:792 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f91d1788-027d-432b-be33-ca952a95046a/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/f91d1788-027d-432b-be33-ca952a95046a/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config major:0 minor:769 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f91d1788-027d-432b-be33-ca952a95046a/volumes/kubernetes.io~secret/prometheus-operator-tls:{mountpoint:/var/lib/kubelet/pods/f91d1788-027d-432b-be33-ca952a95046a/volumes/kubernetes.io~secret/prometheus-operator-tls major:0 minor:823 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/fa759777-de22-4440-a3d3-ad429a3b8e7b/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/fa759777-de22-4440-a3d3-ad429a3b8e7b/volumes/kubernetes.io~projected/kube-api-access major:0 minor:251 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fa759777-de22-4440-a3d3-ad429a3b8e7b/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/fa759777-de22-4440-a3d3-ad429a3b8e7b/volumes/kubernetes.io~secret/serving-cert major:0 minor:249 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fb0fc10f-5796-4cd5-b8f5-72d678054c24/volumes/kubernetes.io~projected/kube-api-access-lh47j:{mountpoint:/var/lib/kubelet/pods/fb0fc10f-5796-4cd5-b8f5-72d678054c24/volumes/kubernetes.io~projected/kube-api-access-lh47j major:0 minor:139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fb0fc10f-5796-4cd5-b8f5-72d678054c24/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/fb0fc10f-5796-4cd5-b8f5-72d678054c24/volumes/kubernetes.io~secret/webhook-cert major:0 minor:138 fsType:tmpfs blockSize:0} overlay_0-1003:{mountpoint:/var/lib/containers/storage/overlay/d473c3de870ae61d33862f5c79ae716e21fb1a8f9e65134d08954abf3a95aa71/merged major:0 minor:1003 fsType:overlay blockSize:0} overlay_0-1005:{mountpoint:/var/lib/containers/storage/overlay/d16783f5aed3bfb6bc6bf964e723c527f6b1c8eea84fada6bdd7ccbe67c46c73/merged major:0 minor:1005 fsType:overlay blockSize:0} overlay_0-1013:{mountpoint:/var/lib/containers/storage/overlay/d37602ac20fd91148e721f266baa8f368af876db8a39314ae39051cf155384f9/merged major:0 minor:1013 fsType:overlay blockSize:0} overlay_0-1018:{mountpoint:/var/lib/containers/storage/overlay/ff1b69a11349a9aef5f5b87aa3497593454da7cf773e262dd728dfb4d5c52885/merged major:0 minor:1018 fsType:overlay blockSize:0} overlay_0-1027:{mountpoint:/var/lib/containers/storage/overlay/e44fbbf28d36458d7e01d0281128fd6eba023d586cf9730dbfe0637da4278d09/merged major:0 minor:1027 fsType:overlay blockSize:0} 
overlay_0-1029:{mountpoint:/var/lib/containers/storage/overlay/7634602cb04326e308a4f38d874b79456da4fe18e8bbb33ee751fdda72bb0ee0/merged major:0 minor:1029 fsType:overlay blockSize:0} overlay_0-1031:{mountpoint:/var/lib/containers/storage/overlay/5dd8e26ec88ee3442cfc6f16a46ebc2aedf3a9f8fbbdc9f79d0e83fb89f265f0/merged major:0 minor:1031 fsType:overlay blockSize:0} overlay_0-1051:{mountpoint:/var/lib/containers/storage/overlay/14beb0f1a72e7b26964bd73bbb419edb36401b727252945c750a54c6ade72a3f/merged major:0 minor:1051 fsType:overlay blockSize:0} overlay_0-1053:{mountpoint:/var/lib/containers/storage/overlay/8781dfa4b5f3f61d32ffd587bd8b909a37dbf20c61a2315fb300b6e89740ca41/merged major:0 minor:1053 fsType:overlay blockSize:0} overlay_0-1055:{mountpoint:/var/lib/containers/storage/overlay/46bb9eae3275f6d811db824174e4c1d2dce3521357188a946ad894b7430699ab/merged major:0 minor:1055 fsType:overlay blockSize:0} overlay_0-1058:{mountpoint:/var/lib/containers/storage/overlay/3016ce96e6d1f3374a4bc1c53a85516f3a9352495fd5b35207f8eb5f28af0532/merged major:0 minor:1058 fsType:overlay blockSize:0} overlay_0-1069:{mountpoint:/var/lib/containers/storage/overlay/b67642973e858b4881c02c6a8fde2f9fd5438faefcddff54e7382ad8a88a73ed/merged major:0 minor:1069 fsType:overlay blockSize:0} overlay_0-107:{mountpoint:/var/lib/containers/storage/overlay/9db0af0ee74e589c618a5907551df643aec9ff9c31360ef4454e4f03b64c32fe/merged major:0 minor:107 fsType:overlay blockSize:0} overlay_0-1094:{mountpoint:/var/lib/containers/storage/overlay/2676ac47d81efeaeb0876c21058ea0cdbc0ef90176e2cda5e594596d5599d67f/merged major:0 minor:1094 fsType:overlay blockSize:0} overlay_0-1096:{mountpoint:/var/lib/containers/storage/overlay/febedb3ba416b9bd008baf1870727ab9ae0f8878539af44e8972523f21eae71a/merged major:0 minor:1096 fsType:overlay blockSize:0} overlay_0-1098:{mountpoint:/var/lib/containers/storage/overlay/6f77d634a28bfe98e14b9d49e2d9dd4a5aea3872901b867e806fc64e132522ef/merged major:0 minor:1098 fsType:overlay blockSize:0} 
overlay_0-1104:{mountpoint:/var/lib/containers/storage/overlay/0ab29b04b5704ed351a27e7bc30fe15343de8926ee3296ab355ac4bb73d53f05/merged major:0 minor:1104 fsType:overlay blockSize:0} overlay_0-1106:{mountpoint:/var/lib/containers/storage/overlay/8b47bfca91899b04939aeeb305142a9628db80190b61af6a59dcada8ad0af0d1/merged major:0 minor:1106 fsType:overlay blockSize:0} overlay_0-111:{mountpoint:/var/lib/containers/storage/overlay/f353bad261577ae807d0fe63002a5331ee0614ad70c503a063b254537fa8f2c7/merged major:0 minor:111 fsType:overlay blockSize:0} overlay_0-1111:{mountpoint:/var/lib/containers/storage/overlay/3feced1cae54b3c43d22b5cd8e24b0151060275b0f0dc2ea693661a483fdf9c3/merged major:0 minor:1111 fsType:overlay blockSize:0} overlay_0-1114:{mountpoint:/var/lib/containers/storage/overlay/b31df15cb3d3fa1c4f324fcb3b465976579f599a561feb371fbb16091ed7d3c8/merged major:0 minor:1114 fsType:overlay blockSize:0} overlay_0-1119:{mountpoint:/var/lib/containers/storage/overlay/3f60f6691f7a78faa4c4ec3cd94ab02a552c3021bee1a8923043be4bcfcaa1ab/merged major:0 minor:1119 fsType:overlay blockSize:0} overlay_0-1121:{mountpoint:/var/lib/containers/storage/overlay/733238925e69a074099d514393a8a6725c8addc53f2a3d40fb8acd95661646d8/merged major:0 minor:1121 fsType:overlay blockSize:0} overlay_0-1123:{mountpoint:/var/lib/containers/storage/overlay/e0b591acb62c77fe776db9dda7b9ce3fbd527e489b3a1d1b3e13a05ea38696ed/merged major:0 minor:1123 fsType:overlay blockSize:0} overlay_0-1125:{mountpoint:/var/lib/containers/storage/overlay/2939214fc0bcc5c7996cb29b4924eca38fd3966fdafc4397f93445f7ed023b9b/merged major:0 minor:1125 fsType:overlay blockSize:0} overlay_0-1137:{mountpoint:/var/lib/containers/storage/overlay/d29f8ca440ad8fe91916d7b3ea2970c54dd45c1aa97390037b4206d21fad2066/merged major:0 minor:1137 fsType:overlay blockSize:0} overlay_0-115:{mountpoint:/var/lib/containers/storage/overlay/0734f775fa28bfbd9cec595b8f602918ccbb3b5b686fd3401ca260cad5c3cc16/merged major:0 minor:115 fsType:overlay blockSize:0} 
overlay_0-1155:{mountpoint:/var/lib/containers/storage/overlay/073e6e0b2dc2ae6bb5c592153f55cbfe4ae4a6099ebd9aa9a84d5b941240f642/merged major:0 minor:1155 fsType:overlay blockSize:0} overlay_0-1163:{mountpoint:/var/lib/containers/storage/overlay/98faffd9361bd4002609192c9a3875f5cc718b8a48c23dddba5a1c29ee21eb29/merged major:0 minor:1163 fsType:overlay blockSize:0} overlay_0-1165:{mountpoint:/var/lib/containers/storage/overlay/e51cbec2c0e8cdacad4cb297d500703d4ec456b675e3ad8f471035a764ae8fab/merged major:0 minor:1165 fsType:overlay blockSize:0} overlay_0-1175:{mountpoint:/var/lib/containers/storage/overlay/20ac89eb2178fb0d5bdeb71e93c1cfbb42da7e796f71059458d39686d8e2983e/merged major:0 minor:1175 fsType:overlay blockSize:0} overlay_0-1180:{mountpoint:/var/lib/containers/storage/overlay/dc153717ad31db6afc51a339135ab6a6e907e0d8aa4174ea2a66aebc7eb665d1/merged major:0 minor:1180 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/a7f959166927aaf5a68b8bee14c0c3ca51daf8094ac6078bfa36b9df0e3f259d/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-132:{mountpoint:/var/lib/containers/storage/overlay/0a193133cf6dc512dd6222a88a26bfa79f23cba05b757da85d1ae39761098c7d/merged major:0 minor:132 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/9d997b980b356ddfdf630c398182bb77689fd36cf113748842cb52c7facca8d3/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/5092d95303c57ab1f1b4fe40cae1afd6909a307da9ade2856b6ed3956b2db708/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/357b3035e33baca71cc95b2521d479c36fbdeacd30ada35e763197476d643c5a/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/4590e312f2d394851c17306adc7ea6c048de2ad3d3dbc1c10201bdd57bfb696d/merged major:0 minor:150 fsType:overlay blockSize:0} 
overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/601abd4478b5aaf2eff8ec78e3d163d8396c0c1d0313041b4ef8f1ca95497a08/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/0095722db58997c16462a52bfadd816bdb0d045cdee7262bf3843448cff9a1dc/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-157:{mountpoint:/var/lib/containers/storage/overlay/14d729e1e4d85c607b5fb243df298694984538e9b7f895cb4dd8e9060fea1af9/merged major:0 minor:157 fsType:overlay blockSize:0} overlay_0-160:{mountpoint:/var/lib/containers/storage/overlay/75a62a8fc8fb3648b712820272eb17860a758eb622e562d69887bfdbbcf67f24/merged major:0 minor:160 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/e82eeb503773060b0acb45dac9496642b42ae08f15da57c6cedeacb7bfd41c92/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/4df5be66be6b873cfd1c551b5b0b9fc6f4ea25268a299b477b758a8e1736c2cc/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/8c11e423afdce4615b05e2c97ec56e82ea9e222a51dad5e9b9cadefc7a50fbd4/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/7fa6ef038f39cce3ddd045f6808a08b8e20de3f00d0f3ffb70268473ebccfc8a/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/4bbfd60178e02b53b381dbc1196c28665eefccf695d461a5ce89ffdccece3a34/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/4b857198661048ccff5f5a12f7cd9bd7f3f3cd1ac178a2c27e29bb34ec87d33d/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/c9dffbbd4eb7ea11960e150211e802ef310e3b0e1f8bbd40baf231c77d324648/merged major:0 minor:204 fsType:overlay blockSize:0} 
overlay_0-274:{mountpoint:/var/lib/containers/storage/overlay/0680bd6b5ea798ae9cb5f825ab60e8ceb1eae6c9ade662692469f0aed930b3c9/merged major:0 minor:274 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/4380b0f5133a3f0da51585ccae3689a80729f0158e1e3bbf3dda13c89ebd327b/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/ba139f06c62cf9c8d42d01d80bf13caf3d1b2638660e16086eb9b0795fdc4d53/merged major:0 minor:279 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/072a2a473587fe1e44374ccad1f1d05e0ae690f58cfe27e3661336b9140e6d1e/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/6d63eb109e8673fd69abf0f81fb4a96e2e680f52d43bd16ccfd5a19ccde395be/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/88611e7d39caa0378811472c501e49c94f5a1ecb55ce4769e0eb33fa70530882/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/435a1c816845cf50d5a1531e07ff1be53a2522a86b57431fe92b5479ad77d97a/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/cab4552b7af734c9353bdcf574b4b45a19d3ad74266a8a2becae706d8209ec66/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/888de48e1a561193d529eede40671dd5a1712bd9de9e6f3a9ac14f82c14c48d5/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-300:{mountpoint:/var/lib/containers/storage/overlay/588b06418ad2bc65f5ea0486df355a89168e6cbfaedf5d2474926902564dad13/merged major:0 minor:300 fsType:overlay blockSize:0} overlay_0-302:{mountpoint:/var/lib/containers/storage/overlay/fba8af5043b0bd6f4de2d28415ff223d19a7b80f5290f36fc84a3cafc64c4f2d/merged major:0 minor:302 fsType:overlay blockSize:0} 
overlay_0-308:{mountpoint:/var/lib/containers/storage/overlay/81e75bc80a1e863a817509ea99bde2a809cb2a95e9680b9a461ccd13b04c4299/merged major:0 minor:308 fsType:overlay blockSize:0} overlay_0-311:{mountpoint:/var/lib/containers/storage/overlay/6dcc5d1173eb99d5846cc948a47570220d59e143dc1eec45d1d07c870df16bcd/merged major:0 minor:311 fsType:overlay blockSize:0} overlay_0-313:{mountpoint:/var/lib/containers/storage/overlay/550ed381e076eb3d67669006b48b26c2afb7f60a8b28e0a67e25d4b3983c4a36/merged major:0 minor:313 fsType:overlay blockSize:0} overlay_0-324:{mountpoint:/var/lib/containers/storage/overlay/df7110b76e117b8055cfb56d6be0898de0714e79d220b79799a158aec68970da/merged major:0 minor:324 fsType:overlay blockSize:0} overlay_0-326:{mountpoint:/var/lib/containers/storage/overlay/627fd7a682dbbbf8c885754373627e75fa152bd1ae5ac018b788e36340f07826/merged major:0 minor:326 fsType:overlay blockSize:0} overlay_0-328:{mountpoint:/var/lib/containers/storage/overlay/fc13c5f402a7c44dee0f6b96095779f782501194c4f046280308e55df9fa8bca/merged major:0 minor:328 fsType:overlay blockSize:0} overlay_0-331:{mountpoint:/var/lib/containers/storage/overlay/7a26a94d6919ba956e2d181aba8fecc99ec61754f8da4a9aa582f6e9390055a3/merged major:0 minor:331 fsType:overlay blockSize:0} overlay_0-335:{mountpoint:/var/lib/containers/storage/overlay/6e99dec938508f659c65851840ce67b60ae846b5e3554c9b224d305276ed3810/merged major:0 minor:335 fsType:overlay blockSize:0} overlay_0-341:{mountpoint:/var/lib/containers/storage/overlay/852cfd57e83765bb3a685efadfa46032d4fffd49bfc374e46026f8fd058d3ed6/merged major:0 minor:341 fsType:overlay blockSize:0} overlay_0-345:{mountpoint:/var/lib/containers/storage/overlay/958a1f3fad9bb9148bf1f13b0644fbdc76e5ef12a801c4fa927f8a930d115002/merged major:0 minor:345 fsType:overlay blockSize:0} overlay_0-350:{mountpoint:/var/lib/containers/storage/overlay/d18916952d3ed0ace8f7912dc7533931461254a028db3ab2ec675562d68744c4/merged major:0 minor:350 fsType:overlay blockSize:0} 
overlay_0-352:{mountpoint:/var/lib/containers/storage/overlay/a3852f842ffb07cea54364f28c788691874f48f6ba657309dc47f28d4e00ed1b/merged major:0 minor:352 fsType:overlay blockSize:0} overlay_0-363:{mountpoint:/var/lib/containers/storage/overlay/22650bfac4bfe00c3f268d0fcb0d4a625c32f41936c279dba584294add6f65f1/merged major:0 minor:363 fsType:overlay blockSize:0} overlay_0-364:{mountpoint:/var/lib/containers/storage/overlay/1894dd4600c29ffaf1da67610f9e3bc17a506df278e26eeac6138b324356633c/merged major:0 minor:364 fsType:overlay blockSize:0} overlay_0-370:{mountpoint:/var/lib/containers/storage/overlay/19e695d46f9a6c543e7f31cf7f3bf6e717516e4707432cd34a3d81eb0e34fbab/merged major:0 minor:370 fsType:overlay blockSize:0} overlay_0-371:{mountpoint:/var/lib/containers/storage/overlay/81416fad8353415fd16d72542cd34641785fa85451b595349b3dd1244f16a846/merged major:0 minor:371 fsType:overlay blockSize:0} overlay_0-375:{mountpoint:/var/lib/containers/storage/overlay/ba59924cec098a108d07752669b119d85b30ed381a3ed3be34a0a94969183893/merged major:0 minor:375 fsType:overlay blockSize:0} overlay_0-377:{mountpoint:/var/lib/containers/storage/overlay/0a5cf0f8cd3debd2ad14705513c7f6a8b794022f874b6e8897db9b32621ead6a/merged major:0 minor:377 fsType:overlay blockSize:0} overlay_0-382:{mountpoint:/var/lib/containers/storage/overlay/03390710dfdbc7f47bfa332868e0637a47acb03314e9f16ea54c8610d0d91120/merged major:0 minor:382 fsType:overlay blockSize:0} overlay_0-385:{mountpoint:/var/lib/containers/storage/overlay/023f60c0562e7d80034b51eb25a91bbedeb0e24973a9e33420e3de06bbb16eea/merged major:0 minor:385 fsType:overlay blockSize:0} overlay_0-391:{mountpoint:/var/lib/containers/storage/overlay/71de8a45e4acdc309c53787ec05be3a9133d9185bf7abe8c9cd7e0cc0f1608a6/merged major:0 minor:391 fsType:overlay blockSize:0} overlay_0-393:{mountpoint:/var/lib/containers/storage/overlay/103eab69544d2bc11a419a16e6039b2448a5adb1e3829e5b8fdcb1149a627df3/merged major:0 minor:393 fsType:overlay blockSize:0} 
overlay_0-395:{mountpoint:/var/lib/containers/storage/overlay/442db849c8f1469f15518e634b3a332f9d252f148a30481e4cc9e9a610385c6f/merged major:0 minor:395 fsType:overlay blockSize:0} overlay_0-396:{mountpoint:/var/lib/containers/storage/overlay/2456577ed35e3d297dd04c24584d55470a4bb8c825b074e7cca388b499429cae/merged major:0 minor:396 fsType:overlay blockSize:0} overlay_0-406:{mountpoint:/var/lib/containers/storage/overlay/dba2115a9c92926ff7580cb828f553ce25d985e142a26b92c61d6b70b2384bd0/merged major:0 minor:406 fsType:overlay blockSize:0} overlay_0-408:{mountpoint:/var/lib/containers/storage/overlay/e6f5c4c2acac670dac40bcb0e1084001f85ce641953eea7009e480ee2e4cf9d4/merged major:0 minor:408 fsType:overlay blockSize:0} overlay_0-42:{mountpoint:/var/lib/containers/storage/overlay/60205608fd8bc6d22e83530a7a9e057a74f1882e89abfd02fc67d4c40069bd3a/merged major:0 minor:42 fsType:overlay blockSize:0} overlay_0-422:{mountpoint:/var/lib/containers/storage/overlay/04691090586c5343162df76550f39bde13012a706e47565dea319507211a902b/merged major:0 minor:422 fsType:overlay blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/b95e28baf4997215c4ff7d883d25640c2eaa5bfdb32df03d38098cdc61f03b9d/merged major:0 minor:43 fsType:overlay blockSize:0} overlay_0-432:{mountpoint:/var/lib/containers/storage/overlay/624d9d33ee01a8f6a1c1bf1aa920ed98a3efeb6f08321d646ba4ee55b846e9fb/merged major:0 minor:432 fsType:overlay blockSize:0} overlay_0-439:{mountpoint:/var/lib/containers/storage/overlay/ac97cb6eaf678f89b84b67c8314bc340b6b226cb69dc9dade551224e822d0efa/merged major:0 minor:439 fsType:overlay blockSize:0} overlay_0-446:{mountpoint:/var/lib/containers/storage/overlay/e5347f48131f15c264f7a71266b9f8ad5e7a24a66b6d74e8b413f7158e14bfef/merged major:0 minor:446 fsType:overlay blockSize:0} overlay_0-452:{mountpoint:/var/lib/containers/storage/overlay/6d7cca7a81110b30ff76abbb9d03decfa334fe64cb4d834a44643ef790c346f9/merged major:0 minor:452 fsType:overlay blockSize:0} 
overlay_0-458:{mountpoint:/var/lib/containers/storage/overlay/368c3f19f0fabccd65593373ddac613e62117c23f23c0dbfdbc2efc032bd3ac4/merged major:0 minor:458 fsType:overlay blockSize:0} overlay_0-46:{mountpoint:/var/lib/containers/storage/overlay/2d7af4c4e87ea34c57ac0e207bb8401f8d08354ec3d98361daf4bc3e156d12e6/merged major:0 minor:46 fsType:overlay blockSize:0} overlay_0-460:{mountpoint:/var/lib/containers/storage/overlay/9f6a74cac7ea75cbd3588e69c7bad399150efb1bae276e6c746f9ac22502bc87/merged major:0 minor:460 fsType:overlay blockSize:0} overlay_0-462:{mountpoint:/var/lib/containers/storage/overlay/ac16d65d8ea0e8cdb899f7a3408f2ffe3e0f3fde616b6c689e6c5681e8454d4c/merged major:0 minor:462 fsType:overlay blockSize:0} overlay_0-469:{mountpoint:/var/lib/containers/storage/overlay/f88858e29c0dfa1061f79edc0cf536ed8d58d26d3e69b2eb16c7ca5f3e9e76ae/merged major:0 minor:469 fsType:overlay blockSize:0} overlay_0-473:{mountpoint:/var/lib/containers/storage/overlay/2d0512fb7cf33666da43c3afd12bec518b4f19e2520bed444fde1980062b67eb/merged major:0 minor:473 fsType:overlay blockSize:0} overlay_0-475:{mountpoint:/var/lib/containers/storage/overlay/7e008fa918366db5e44e8ec7bdadc6fb7111fecc8f8314e813bd3a52e79efbde/merged major:0 minor:475 fsType:overlay blockSize:0} overlay_0-477:{mountpoint:/var/lib/containers/storage/overlay/9796215291bf32260d0ebe2073b67a3115c3a6f5d92f28012afd4b173e9337b5/merged major:0 minor:477 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/1b225ed43e15fc31c8b59cbff658cd6082424c12716726356ea6a74e6c221884/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-480:{mountpoint:/var/lib/containers/storage/overlay/4f91ca65c4c13161010926629293f36ad47869da3800cc90d462e01eccfaf1d7/merged major:0 minor:480 fsType:overlay blockSize:0} overlay_0-493:{mountpoint:/var/lib/containers/storage/overlay/9dd01ba7f290ae6044064f947ff34d7dce83a735a1a3ad047ec20bb56c64c9e1/merged major:0 minor:493 fsType:overlay blockSize:0} 
overlay_0-50:{mountpoint:/var/lib/containers/storage/overlay/4365237e82d2a7dbd74fff49a782244070e5f4faf2d4a2d001c756d0016c00cb/merged major:0 minor:50 fsType:overlay blockSize:0} overlay_0-501:{mountpoint:/var/lib/containers/storage/overlay/b45dd6bc38a935dd156defc3aa9303e124ff9c15007ff00e6126ee0bb3a5ce5f/merged major:0 minor:501 fsType:overlay blockSize:0} overlay_0-503:{mountpoint:/var/lib/containers/storage/overlay/e809f2bcae6e66e11d1d37a13553773d5e6ac5f2d0f9aa97a1c5296b0866c159/merged major:0 minor:503 fsType:overlay blockSize:0} overlay_0-519:{mountpoint:/var/lib/containers/storage/overlay/3170a16f6a018013190773736f8373b1cc07b6a8d12f0b2f13d0bbb66cf16e8c/merged major:0 minor:519 fsType:overlay blockSize:0} overlay_0-521:{mountpoint:/var/lib/containers/storage/overlay/1ae7413c4fca7448cd0f6a4c82a096914c0b50d456abfa99b19e7622905b45b1/merged major:0 minor:521 fsType:overlay blockSize:0} overlay_0-523:{mountpoint:/var/lib/containers/storage/overlay/1158aabfde36cd97017991c46c04f0a392dd3eb7a3b2b5aa8ae2e929b63fad41/merged major:0 minor:523 fsType:overlay blockSize:0} overlay_0-525:{mountpoint:/var/lib/containers/storage/overlay/631c35f6a0e6411a7595b38b34fdf28ff45a11253cafc750fc05d4aba802df00/merged major:0 minor:525 fsType:overlay blockSize:0} overlay_0-527:{mountpoint:/var/lib/containers/storage/overlay/714b7e102b52770aaf0edebbe5b674558839a45b4cd2d5b49dd08536a6a85791/merged major:0 minor:527 fsType:overlay blockSize:0} overlay_0-529:{mountpoint:/var/lib/containers/storage/overlay/77a140f6086fa19ad4c2a34d4eb5482a10ce862011078a28c2bfd95f30480b3d/merged major:0 minor:529 fsType:overlay blockSize:0} overlay_0-533:{mountpoint:/var/lib/containers/storage/overlay/97ee6af58f821971a76e4df337660dc4dcb5de27f22aed0a3ab15df60deeeafc/merged major:0 minor:533 fsType:overlay blockSize:0} overlay_0-545:{mountpoint:/var/lib/containers/storage/overlay/56f85b67829372c3c4906375793e1a8e80aa9e9b2d10fe97b26019537d10acfb/merged major:0 minor:545 fsType:overlay blockSize:0} 
overlay_0-551:{mountpoint:/var/lib/containers/storage/overlay/9833829d185c6ffe35dc420335ef17ac476728e362ced188e4d87d5aa116fd79/merged major:0 minor:551 fsType:overlay blockSize:0} overlay_0-559:{mountpoint:/var/lib/containers/storage/overlay/d2e4afc921350b18ddb3fe4df76bba0950204b399bc59b2ebbfc8b01ea48d457/merged major:0 minor:559 fsType:overlay blockSize:0} overlay_0-563:{mountpoint:/var/lib/containers/storage/overlay/c95bff09a6052ea8ea0313d13f8eacd1dc709e73b06c1ea19af9ab66c7b42877/merged major:0 minor:563 fsType:overlay blockSize:0} overlay_0-565:{mountpoint:/var/lib/containers/storage/overlay/c4752dea37a090872e17a7c4ae8842744b4e14412728c75ba6909a7cb5421924/merged major:0 minor:565 fsType:overlay blockSize:0} overlay_0-584:{mountpoint:/var/lib/containers/storage/overlay/63ec7b5efb848dea1c096297faafd455ad1122d4096fce139c7682b53220b53f/merged major:0 minor:584 fsType:overlay blockSize:0} overlay_0-586:{mountpoint:/var/lib/containers/storage/overlay/85ac3fe8ef8be7f0bb4a9a756245965a88616a06dc094ae43491c33f74e7a2d9/merged major:0 minor:586 fsType:overlay blockSize:0} overlay_0-592:{mountpoint:/var/lib/containers/storage/overlay/28d4f67947b16f532152733395e3e59fcf97b478f0bb879e43933104d73274cb/merged major:0 minor:592 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/949b36f308165d6d5be22a182f7d8be132f80f0fa5e0b965ac7fcf72b09788db/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/536a82a5cf941dd0db384568821f6dedf52dd50058ddfac2d9d5cbff2abd08e2/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-627:{mountpoint:/var/lib/containers/storage/overlay/54be402493b14679b14b8adf36cfcae60bd7ac8af68a470788bf4868fe7da0cd/merged major:0 minor:627 fsType:overlay blockSize:0} overlay_0-629:{mountpoint:/var/lib/containers/storage/overlay/089bbafd3144f50d6683c73a65bc982afdcdd429a45a1cb6cf3c631f5c206eed/merged major:0 minor:629 fsType:overlay blockSize:0} 
overlay_0-631:{mountpoint:/var/lib/containers/storage/overlay/f2e4c73605584c14f123f960f07cc787ec5a31bebbb17db4e1cc21c752d829eb/merged major:0 minor:631 fsType:overlay blockSize:0} overlay_0-637:{mountpoint:/var/lib/containers/storage/overlay/0d1fbba07403f7c37a852f2414bcef9c050669c63c85915aab95e6267c31ee6b/merged major:0 minor:637 fsType:overlay blockSize:0} overlay_0-639:{mountpoint:/var/lib/containers/storage/overlay/2dbf0b0301bc2220e6fc06e693049088295ec0ee603a417a0727a32db5818a47/merged major:0 minor:639 fsType:overlay blockSize:0} overlay_0-641:{mountpoint:/var/lib/containers/storage/overlay/f0c78b8793827aee2114090fb09afce2e0b5d23070d7f0f2897f2f458baaa405/merged major:0 minor:641 fsType:overlay blockSize:0} overlay_0-643:{mountpoint:/var/lib/containers/storage/overlay/cf0d336482fd365b672ab395a692303340cb28fdd514bbba0c9ee4391d80ff7b/merged major:0 minor:643 fsType:overlay blockSize:0} overlay_0-645:{mountpoint:/var/lib/containers/storage/overlay/ae90bf42dc6dbcfb4e4fd982964d7391343106627f0b947d3ea0e93d559f1de7/merged major:0 minor:645 fsType:overlay blockSize:0} overlay_0-647:{mountpoint:/var/lib/containers/storage/overlay/e8869d2f4db140d3ee4d9083b80da06ed5575972748087ec6df6b9b5e46b62f2/merged major:0 minor:647 fsType:overlay blockSize:0} overlay_0-649:{mountpoint:/var/lib/containers/storage/overlay/16f2755c04445c384b8d1894bae9d69e703ea449bfa67dfbb49b58e150d3134f/merged major:0 minor:649 fsType:overlay blockSize:0} overlay_0-651:{mountpoint:/var/lib/containers/storage/overlay/b31ece430e3a067e4a79987154436bf6d02b8981b29ce61d38442a7f76123f00/merged major:0 minor:651 fsType:overlay blockSize:0} overlay_0-653:{mountpoint:/var/lib/containers/storage/overlay/9b33ab30b39772f3a2f99cd5062f1baad129afc305d87ff3d070d3d9a75daab0/merged major:0 minor:653 fsType:overlay blockSize:0} overlay_0-655:{mountpoint:/var/lib/containers/storage/overlay/bb55b159693629c0e01eb58db50e25fce1c0e570aa8ac820de3ab710614602d7/merged major:0 minor:655 fsType:overlay blockSize:0} 
overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/b7e4e4cbf45c88b3310416fcd0df766da3cc542901624ef2b5675687475cbc12/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-663:{mountpoint:/var/lib/containers/storage/overlay/bd779f05f68efefc82f74261712b5c50197b9899ac597119a53dd46a5c27fe38/merged major:0 minor:663 fsType:overlay blockSize:0} overlay_0-668:{mountpoint:/var/lib/containers/storage/overlay/45d1440b5f045c96e56493c99e3390207c9f662745671ec5133d414b61e750ac/merged major:0 minor:668 fsType:overlay blockSize:0} overlay_0-670:{mountpoint:/var/lib/containers/storage/overlay/6f98b388b3b3eb8ea6ff6a40011e61d261ab406bd950c0a3b5ce8f8efa80bf5a/merged major:0 minor:670 fsType:overlay blockSize:0} overlay_0-672:{mountpoint:/var/lib/containers/storage/overlay/d6ddcb5330aeb3ee109f1523d8b1ee61762ba60e9d789b39652ecb2c5b6c470d/merged major:0 minor:672 fsType:overlay blockSize:0} overlay_0-675:{mountpoint:/var/lib/containers/storage/overlay/e5562b42f45b47e9e2e11c0442431ae0e09f37c3ef7da7b1a8dee131a83c3ce7/merged major:0 minor:675 fsType:overlay blockSize:0} overlay_0-677:{mountpoint:/var/lib/containers/storage/overlay/4edaa1c7dbea8e587edef24348c93668b1bbd5793d105bbf1cb6a26e9ef567c5/merged major:0 minor:677 fsType:overlay blockSize:0} overlay_0-679:{mountpoint:/var/lib/containers/storage/overlay/92d528122c28bba5c3f9244f4f5eef66257b83f6df285c788e90d6ea9700a9a2/merged major:0 minor:679 fsType:overlay blockSize:0} overlay_0-686:{mountpoint:/var/lib/containers/storage/overlay/772cea7e2c718c40f3448afaf08af1f6559d3f5e2c5b4f733d782df13504b624/merged major:0 minor:686 fsType:overlay blockSize:0} overlay_0-701:{mountpoint:/var/lib/containers/storage/overlay/098904ca79bb48c0a871afe2caa191a7175520aec2e63b5c9230b79bf50eda48/merged major:0 minor:701 fsType:overlay blockSize:0} overlay_0-714:{mountpoint:/var/lib/containers/storage/overlay/6c3828357f76d67c2d602ab09a071ef7f64f16512ed393de2d17fdee472ae700/merged major:0 minor:714 fsType:overlay blockSize:0} 
overlay_0-716:{mountpoint:/var/lib/containers/storage/overlay/6e3dc45d4df9a14481ac9dacfc0a846b7ed7a6809d4c9107d827bae640532ce2/merged major:0 minor:716 fsType:overlay blockSize:0} overlay_0-718:{mountpoint:/var/lib/containers/storage/overlay/e04a9d66280e49d1ae6ed62762b82a533ec5597a7846aafb4a8c8a12989305eb/merged major:0 minor:718 fsType:overlay blockSize:0} overlay_0-72:{mountpoint:/var/lib/containers/storage/overlay/bcde076cc0057beb0cc3994806c3e5f3042ca6c60de4b081d36c02352c438c3b/merged major:0 minor:72 fsType:overlay blockSize:0} overlay_0-723:{mountpoint:/var/lib/containers/storage/overlay/c4971e81c90daa3194b3d065f8a935b336b682fc3058aafc1083467990ee98ed/merged major:0 minor:723 fsType:overlay blockSize:0} overlay_0-729:{mountpoint:/var/lib/containers/storage/overlay/8650e7d747e336dc621fafd969b6ca8a4cd9622322712be6c152271b959c1393/merged major:0 minor:729 fsType:overlay blockSize:0} overlay_0-730:{mountpoint:/var/lib/containers/storage/overlay/e55e32cf01675dc231b6da210cbc04f770fb9d1d039ab15ce4f5ed95495b3d57/merged major:0 minor:730 fsType:overlay blockSize:0} overlay_0-739:{mountpoint:/var/lib/containers/storage/overlay/ddd346dcb747aa560471f283478d367270382337384dff1fe1301adee980cb2a/merged major:0 minor:739 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/3847f2eb730434d8da1b6a373a4adde377a1156e78bc71bcfb04aa6ff458883e/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-747:{mountpoint:/var/lib/containers/storage/overlay/5c62279d50dfe5446915927733e77415a90e411b9e80915b78dc8cb78ec4db32/merged major:0 minor:747 fsType:overlay blockSize:0} overlay_0-750:{mountpoint:/var/lib/containers/storage/overlay/b63d2a3b33cb74a5533d6d27e6526782496eaea08e23c8654d67c8142bf585b4/merged major:0 minor:750 fsType:overlay blockSize:0} overlay_0-765:{mountpoint:/var/lib/containers/storage/overlay/18cd4ef8f04de6a385901c18230254cd9026c1c6227425a41a3a735d19fd48fe/merged major:0 minor:765 fsType:overlay blockSize:0} 
overlay_0-768:{mountpoint:/var/lib/containers/storage/overlay/d4dda332ceda5fd922460750157207ff3a0bcd274de42b87eb7113f85362d5f8/merged major:0 minor:768 fsType:overlay blockSize:0} overlay_0-770:{mountpoint:/var/lib/containers/storage/overlay/af7ada57b7bce762e5da2e713a48463929c464e05e027d29a520f2ec7fc26c9b/merged major:0 minor:770 fsType:overlay blockSize:0} overlay_0-774:{mountpoint:/var/lib/containers/storage/overlay/62a7c433083f2294a8150418c2e02aa4c0c2fc174564c0d6fc1bef2608e317d1/merged major:0 minor:774 fsType:overlay blockSize:0} overlay_0-775:{mountpoint:/var/lib/containers/storage/overlay/7a3df7486c27116e8a4dca8716e54b5b82e15a7c68ea2feab38e8e2c69591db4/merged major:0 minor:775 fsType:overlay blockSize:0} overlay_0-777:{mountpoint:/var/lib/containers/storage/overlay/5415b76f55d3b4d012bfae8c190e1d23d286f23901f6befbc6bc743a794991c3/merged major:0 minor:777 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/ac82333b5d96d5b4e35a973b99e3547256b2679f7298c476324ad369d7ccf983/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-789:{mountpoint:/var/lib/containers/storage/overlay/b812c209ec518fab0f7f6eea29f39a323bf6af1ae8bb08a28f0f2392656c9d78/merged major:0 minor:789 fsType:overlay blockSize:0} overlay_0-791:{mountpoint:/var/lib/containers/storage/overlay/bbf4c600952054076302373ab3410cfde84649cac0755fe00313f6052359fd99/merged major:0 minor:791 fsType:overlay blockSize:0} overlay_0-793:{mountpoint:/var/lib/containers/storage/overlay/3bc906863ed1f4d09af19f2f60ca5e584e4bbd5e8b84ad69a104473112d92cdc/merged major:0 minor:793 fsType:overlay blockSize:0} overlay_0-799:{mountpoint:/var/lib/containers/storage/overlay/d5056d99d1f4c3ad0f646180c2b1ce00157d0bc69c1b1b552f915220b58aadcd/merged major:0 minor:799 fsType:overlay blockSize:0} overlay_0-800:{mountpoint:/var/lib/containers/storage/overlay/d32e06c8b42b8b491026aaec6edadc7b17b584a4bb601d756bd093af431abe69/merged major:0 minor:800 fsType:overlay blockSize:0} 
overlay_0-806:{mountpoint:/var/lib/containers/storage/overlay/2164202b5c8b3dcb465f2c1188c1e29aef7a05533460a8b115fe0fb947d99a8e/merged major:0 minor:806 fsType:overlay blockSize:0} overlay_0-809:{mountpoint:/var/lib/containers/storage/overlay/971a8aee40829d4409987dbdc8686fbc39c884816c2c1e922d06deb40d9a1012/merged major:0 minor:809 fsType:overlay blockSize:0} overlay_0-81:{mountpoint:/var/lib/containers/storage/overlay/d4d81d66576806379c90f07f69c44971136a8cf7925e76b8959a95ede1baac22/merged major:0 minor:81 fsType:overlay blockSize:0} overlay_0-812:{mountpoint:/var/lib/containers/storage/overlay/850c961b79df70eb747a27feef839d3a90b427131343fbe30ed9d9d7f62aab31/merged major:0 minor:812 fsType:overlay blockSize:0} overlay_0-814:{mountpoint:/var/lib/containers/storage/overlay/2c7732e61cb961e7029a9e0c56b1990a5b3b647dd4de34b770adc3ec05dc149b/merged major:0 minor:814 fsType:overlay blockSize:0} overlay_0-816:{mountpoint:/var/lib/containers/storage/overlay/36086cf797de9e3732829986b32c51710cc71d1cd2a7abfd74ca11cc24d2387c/merged major:0 minor:816 fsType:overlay blockSize:0} overlay_0-841:{mountpoint:/var/lib/containers/storage/overlay/802ee215550ed52298cb112981a6001f5b2e4496aee88876a3a59e8d4d7deccd/merged major:0 minor:841 fsType:overlay blockSize:0} overlay_0-851:{mountpoint:/var/lib/containers/storage/overlay/0a20b58215fba5957d69e3d325a4afcdd1e833fd7b020075671c6a6798d960a2/merged major:0 minor:851 fsType:overlay blockSize:0} overlay_0-852:{mountpoint:/var/lib/containers/storage/overlay/f5b3c325f8e5a95818df0a41f4bfb2d88406b857bdb5db1f8fc9b8b09ecc394f/merged major:0 minor:852 fsType:overlay blockSize:0} overlay_0-859:{mountpoint:/var/lib/containers/storage/overlay/1f5040cbec4662b8675c0a7ebaca63bc7ee8a2ab589635a1ee86ef4aa6a63e55/merged major:0 minor:859 fsType:overlay blockSize:0} overlay_0-862:{mountpoint:/var/lib/containers/storage/overlay/4c3e4b84e35b8dbdac9849a7616215c682838bd47f5c2b42918c4b8083973f8c/merged major:0 minor:862 fsType:overlay blockSize:0} 
overlay_0-864:{mountpoint:/var/lib/containers/storage/overlay/ad875220fd9725579410d0bf1b1b969f524e8210c799752811eb911f5cb2d6c7/merged major:0 minor:864 fsType:overlay blockSize:0} overlay_0-866:{mountpoint:/var/lib/containers/storage/overlay/f2eca4bb2aa67d39cde0b0a5b7290de4abd9fd807efe4b21a72e5b1fb7c34535/merged major:0 minor:866 fsType:overlay blockSize:0} overlay_0-875:{mountpoint:/var/lib/containers/storage/overlay/e647e4d8cc145877a7c394fb8681f78672cc25419ad6d0d3b21e9827786d602b/merged major:0 minor:875 fsType:overlay blockSize:0} overlay_0-88:{mountpoint:/var/lib/containers/storage/overlay/960045cd812f39b504d5063919bd4ccaac5d3f0f2d0e59d64f1819babde7a390/merged major:0 minor:88 fsType:overlay blockSize:0} overlay_0-885:{mountpoint:/var/lib/containers/storage/overlay/f5350f12b432b931b1efc448ef10e2b6e5e9923093de84d6826eb281194dc574/merged major:0 minor:885 fsType:overlay blockSize:0} overlay_0-889:{mountpoint:/var/lib/containers/storage/overlay/2aaacf8c06f04f5266fb471de953012b2170a761f6ce730bdc801f8ead0642f5/merged major:0 minor:889 fsType:overlay blockSize:0} overlay_0-899:{mountpoint:/var/lib/containers/storage/overlay/da0eea025cdbab57f6307f8a1f1836e5cd6d87935cdf494d7087593b6e3b446d/merged major:0 minor:899 fsType:overlay blockSize:0} overlay_0-913:{mountpoint:/var/lib/containers/storage/overlay/510c82a0994ed7ecd2bedc6455af4e986b7f79446ec69f989c95c2c95933753e/merged major:0 minor:913 fsType:overlay blockSize:0} overlay_0-914:{mountpoint:/var/lib/containers/storage/overlay/f187f2b96108922e83f259dc12b8090d35a7d6268a92f6ac4cdab4a4aa198317/merged major:0 minor:914 fsType:overlay blockSize:0} overlay_0-922:{mountpoint:/var/lib/containers/storage/overlay/d410c98dfeb63765670760c175424aaefe6e56883f7de6b49a5be7125f8e96f4/merged major:0 minor:922 fsType:overlay blockSize:0} overlay_0-929:{mountpoint:/var/lib/containers/storage/overlay/cd2f9c51696dcd28c95bc27d4da83d78c063de30a9b03953b29ad184a49d1642/merged major:0 
minor:929 fsType:overlay blockSize:0} overlay_0-934:{mountpoint:/var/lib/containers/storage/overlay/bbd85fba4fe218ee44a0229783c7098986552c272a6745e652d88257dd1a4449/merged major:0 minor:934 fsType:overlay blockSize:0} overlay_0-939:{mountpoint:/var/lib/containers/storage/overlay/f322ea74e5b3f1ced96751116150c7fef73828ea3367c1710cd2e5c17925f747/merged major:0 minor:939 fsType:overlay blockSize:0} overlay_0-95:{mountpoint:/var/lib/containers/storage/overlay/9997026ddd615e454ade1f7ef4e51edca84fd32bca11973c82e630cf4b024c10/merged major:0 minor:95 fsType:overlay blockSize:0} overlay_0-954:{mountpoint:/var/lib/containers/storage/overlay/03251a465434279848803b678032372010e5f5087de03b859e373a6a4a14f538/merged major:0 minor:954 fsType:overlay blockSize:0} overlay_0-956:{mountpoint:/var/lib/containers/storage/overlay/9512d4374387f8ecf46b599181ef9379d4e627bf6b5bad5b657816a15dc90f0d/merged major:0 minor:956 fsType:overlay blockSize:0} overlay_0-960:{mountpoint:/var/lib/containers/storage/overlay/8809e1e60d8f877f48764af52fb76d5fc60112ea589fae146551d56bd0b53af8/merged major:0 minor:960 fsType:overlay blockSize:0} overlay_0-968:{mountpoint:/var/lib/containers/storage/overlay/0e2939b8134642b11bc1b1f7c9421cec4855435d881e4aaee6a1e0e7abdeca6f/merged major:0 minor:968 fsType:overlay blockSize:0} overlay_0-974:{mountpoint:/var/lib/containers/storage/overlay/b04765e67adf89b596675bdc171f6e27da770e7014f716f7b2d41ef0d5b668bb/merged major:0 minor:974 fsType:overlay blockSize:0} overlay_0-994:{mountpoint:/var/lib/containers/storage/overlay/8e9ec09ed92f134249a146b23823c5b72fb70c728ab450f23d63578768cd9a6c/merged major:0 minor:994 fsType:overlay blockSize:0} overlay_0-997:{mountpoint:/var/lib/containers/storage/overlay/1230ccfad857c91a41a7fd098c82006badc9b3af694277a9f39887f3d3d2d759/merged major:0 minor:997 fsType:overlay blockSize:0}] Mar 20 08:40:55.013383 master-0 kubenswrapper[18707]: I0320 08:40:55.011527 18707 manager.go:217] Machine: {Timestamp:2026-03-20 08:40:55.009963774 +0000 UTC 
m=+0.166144150 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:7cbba5bd4cad48d397925286776799f2 SystemUUID:7cbba5bd-4cad-48d3-9792-5286776799f2 BootID:2d4df506-7881-4563-b01f-2840d2bdb60b Filesystems:[{Device:/var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:436 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-475 DeviceMajor:0 DeviceMinor:475 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-375 DeviceMajor:0 DeviceMinor:375 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ab175f7e-a5e8-4fda-98c9-6d052a221a83/volumes/kubernetes.io~projected/kube-api-access-zmrxh DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b639e578-628e-404d-b759-8b6e84e771d9/volumes/kubernetes.io~projected/kube-api-access-9zp8f DeviceMajor:0 DeviceMinor:318 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-493 DeviceMajor:0 DeviceMinor:493 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/325f0a83-d56d-4b62-977b-088a7d5f0e00/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/fa759777-de22-4440-a3d3-ad429a3b8e7b/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:251 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-313 DeviceMajor:0 DeviceMinor:313 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/c7f5e6cd-e093-409a-8758-d3db7a7eb32c/volumes/kubernetes.io~projected/kube-api-access-plc2q DeviceMajor:0 DeviceMinor:355 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1111 DeviceMajor:0 DeviceMinor:1111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-641 DeviceMajor:0 DeviceMinor:641 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8a073909baddafc91ef0c1a8a7d1dde84b7bce6d841c77590b71f4a6754c9117/userdata/shm DeviceMajor:0 DeviceMinor:810 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/93a839a770be652f7d12315bd6dada21638d2838dbf7e7ad327b9d696e2d3e07/userdata/shm DeviceMajor:0 DeviceMinor:848 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-954 DeviceMajor:0 DeviceMinor:954 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1137 DeviceMajor:0 DeviceMinor:1137 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ab175f7e-a5e8-4fda-98c9-6d052a221a83/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-395 DeviceMajor:0 DeviceMinor:395 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e/volumes/kubernetes.io~projected/kube-api-access-htv9s DeviceMajor:0 DeviceMinor:908 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-328 DeviceMajor:0 DeviceMinor:328 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-1055 DeviceMajor:0 DeviceMinor:1055 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f53bc282-5937-49ac-ac98-2ee37ccb268d/volumes/kubernetes.io~projected/kube-api-access-b2d4r DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3092eb7a16220393b74c3ca8c6aedf7058f62f9313af91e571c5d2e31d050e35/userdata/shm DeviceMajor:0 DeviceMinor:438 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/60bbe3130a4d3341abf02d519702749eab204c658db33e134e22b746126c5516/userdata/shm DeviceMajor:0 DeviceMinor:306 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6a80bd6f-2263-4251-8197-5173193f8afc/volumes/kubernetes.io~projected/kube-api-access-tqd2v DeviceMajor:0 DeviceMinor:256 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8a61c21711f690cdda83fe881555e8ad64b01a2f6d1c312d8da79d83d36082f5/userdata/shm DeviceMajor:0 DeviceMinor:610 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-670 DeviceMajor:0 DeviceMinor:670 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-864 DeviceMajor:0 DeviceMinor:864 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:900 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f2217de0-7805-4f5f-8ea5-93b81b7e0236/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:911 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-46 DeviceMajor:0 DeviceMinor:46 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/89e373acd0a6c2bca2cd563de1cef238723b46b416d83b3242f9ebb18d644754/userdata/shm DeviceMajor:0 DeviceMinor:53 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a18b9230-de78-41b8-a61e-361b8bb1fbb3/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:515 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ac63789abc1163c5f9db4788ba7a388635d218ac761e65be8043709a0019d115/userdata/shm DeviceMajor:0 DeviceMinor:616 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-672 DeviceMajor:0 DeviceMinor:672 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-503 DeviceMajor:0 DeviceMinor:503 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7e451189-850e-4d19-a40c-40f642d08511/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:511 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/12e1d9e5-96b5-4367-81a5-d87b3f93d8da/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:539 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/469183dd-dc54-467d-82a1-611132ae8ec4/volumes/kubernetes.io~projected/kube-api-access-8dtbl DeviceMajor:0 DeviceMinor:892 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1005 DeviceMajor:0 DeviceMinor:1005 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/f91d1788-027d-432b-be33-ca952a95046a/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:769 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-115 DeviceMajor:0 DeviceMinor:115 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cadfc06b46a2370e89939aa270eb81d7b99f5548dca07c84a3c027dea72e913b/userdata/shm DeviceMajor:0 DeviceMinor:248 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-350 DeviceMajor:0 DeviceMinor:350 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-643 DeviceMajor:0 DeviceMinor:643 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-716 DeviceMajor:0 DeviceMinor:716 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-889 DeviceMajor:0 DeviceMinor:889 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/df43cdf08fb65d06b9db1ef59770a000ce2240e1f2e40f1d6a1619ba0e8b03df/userdata/shm DeviceMajor:0 DeviceMinor:258 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0262d134f60647f6e04ff950df203ce5bc3f1656b20c1e15f442731269c3be76/userdata/shm DeviceMajor:0 DeviceMinor:445 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/acb704a9-6c8d-4378-ae93-e7095b1fce85/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:604 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1114 DeviceMajor:0 DeviceMinor:1114 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1121 DeviceMajor:0 DeviceMinor:1121 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/46a12190f11c7c4d27d18246d17e718f0fd8c23ea7d374844186aa5295484abf/userdata/shm DeviceMajor:0 DeviceMinor:284 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-371 DeviceMajor:0 DeviceMinor:371 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/813f91c2-2b37-4681-968d-4217e286e22f/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:596 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a18b9230-de78-41b8-a61e-361b8bb1fbb3/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:516 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-385 DeviceMajor:0 DeviceMinor:385 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1013 DeviceMajor:0 DeviceMinor:1013 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c0a17669-a122-44aa-bdda-581bf1fc4649/volumes/kubernetes.io~projected/kube-api-access-xf485 DeviceMajor:0 DeviceMinor:356 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-866 DeviceMajor:0 DeviceMinor:866 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-473 DeviceMajor:0 DeviceMinor:473 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/46de2acc-9f5d-4ecf-befe-a480f86466f5/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:574 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7b489385-2c96-4a97-8b31-362162de020e/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:605 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-629 DeviceMajor:0 DeviceMinor:629 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/ec3f7a57e8d7aa7239f51fc0b75ccf091bb42e503457a1919c637dd65b9da53e/userdata/shm DeviceMajor:0 DeviceMinor:293 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d88ba8e1-ee42-423f-9839-e71cb0265c6c/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:688 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a07ae992a49295676f3184ce503f903e0b4447cd57b0d7e0c91d07d9a0f3bc30/userdata/shm DeviceMajor:0 DeviceMinor:1088 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/68252533-bd64-4fc5-838a-cc350cbe77f0/volumes/kubernetes.io~projected/kube-api-access-75r7k DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-308 DeviceMajor:0 DeviceMinor:308 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b543f82e-683d-47c1-af73-4dcede4cf4df/volumes/kubernetes.io~projected/kube-api-access-4c2rq DeviceMajor:0 DeviceMinor:830 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/49f836c59184940ba77efca8e250a75cfca3ff592389502fe02c69990736b85f/userdata/shm DeviceMajor:0 DeviceMinor:776 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-480 DeviceMajor:0 DeviceMinor:480 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3de37144-a9ab-45fb-a23f-2287a5198edf/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:571 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-701 DeviceMajor:0 DeviceMinor:701 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/de6078d7-2aad-46fe-b17a-b6b38e4eaa41/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:802 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-432 DeviceMajor:0 DeviceMinor:432 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-800 DeviceMajor:0 DeviceMinor:800 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/37ba8dc9671f8fe4b5333f2dc9ab294ad8d004712a5f4bddaf2f6742452a4b3c/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/86cb5d23-df7f-4f67-8086-1789d8e68544/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-458 DeviceMajor:0 DeviceMinor:458 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-677 DeviceMajor:0 DeviceMinor:677 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/46de2acc-9f5d-4ecf-befe-a480f86466f5/volumes/kubernetes.io~projected/kube-api-access-4f9vt DeviceMajor:0 DeviceMinor:578 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ead213f06d6e13b0b8afce02cff25edfe82c583b53f661ee9bdc498f394f53a9/userdata/shm DeviceMajor:0 DeviceMinor:164 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a638c468-010c-4da3-ad62-26f5f2bbdbb9/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:1034 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/469183dd-dc54-467d-82a1-611132ae8ec4/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:891 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d1f4c3462eb562d7885b549a3182d1636527f9d646efb4fbbe9ff562004c787d/userdata/shm DeviceMajor:0 DeviceMinor:615 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3f471ecc-922c-4cb1-9bdd-fdb5da08c592/volumes/kubernetes.io~projected/kube-api-access-224dg DeviceMajor:0 DeviceMinor:276 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/df1df4af888713c77332d729a24c1e1fdb472ce369b8165f8ad6dfbe7c60bbd6/userdata/shm DeviceMajor:0 DeviceMinor:305 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/edc62dc83d0212adeb196aa9fb63d28b17a6054a019750eef25f143d8b2816f1/userdata/shm DeviceMajor:0 DeviceMinor:617 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-645 DeviceMajor:0 DeviceMinor:645 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a25248c0-8de7-4624-b785-f053665fcb23/volumes/kubernetes.io~projected/kube-api-access-prcgg DeviceMajor:0 DeviceMinor:1087 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/45e8b72b-564c-4bb1-b911-baff2d6c87ad/volumes/kubernetes.io~projected/kube-api-access-9zpkn DeviceMajor:0 DeviceMinor:261 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/46de2acc-9f5d-4ecf-befe-a480f86466f5/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:577 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-627 DeviceMajor:0 DeviceMinor:627 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-774 DeviceMajor:0 DeviceMinor:774 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-668 DeviceMajor:0 DeviceMinor:668 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/285790bb4eeaea0e1399502a5e31c8d8bf1bd484bccae96128ad9795ef9ca21a/userdata/shm DeviceMajor:0 DeviceMinor:835 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-960 DeviceMajor:0 DeviceMinor:960 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4ddac301-a604-4f07-8849-5928befd336e/volumes/kubernetes.io~projected/kube-api-access-27j9q DeviceMajor:0 DeviceMinor:880 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-335 DeviceMajor:0 DeviceMinor:335 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a69e8d3a-a0b1-4688-8631-d9f265aa4c69/volumes/kubernetes.io~secret/secret-metrics-server-tls DeviceMajor:0 DeviceMinor:1159 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9da40744da0c1f755b7ca8d13405871816427a42b29bf11d678dd70f488e5c6a/userdata/shm DeviceMajor:0 DeviceMinor:97 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4d57d4740bcb6c82be46e50156208d72579c0c7e7b87226d99fe9e3e9e1ff1bb/userdata/shm DeviceMajor:0 DeviceMinor:619 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-653 DeviceMajor:0 DeviceMinor:653 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1031 DeviceMajor:0 DeviceMinor:1031 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/kubelet/pods/bbc0b783-28d5-4554-b49d-c66082546f44/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:606 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ae39c09b-7aef-4615-8ced-0dcad39f23a5/volumes/kubernetes.io~projected/kube-api-access-rxn2f DeviceMajor:0 DeviceMinor:824 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1094 DeviceMajor:0 DeviceMinor:1094 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b543f82e-683d-47c1-af73-4dcede4cf4df/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:829 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1104 DeviceMajor:0 DeviceMinor:1104 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fb0fc10f-5796-4cd5-b8f5-72d678054c24/volumes/kubernetes.io~projected/kube-api-access-lh47j DeviceMajor:0 DeviceMinor:139 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~projected/kube-api-access-xhkh7 DeviceMajor:0 DeviceMinor:270 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a9a9ecf2-c476-4962-8333-21f242dbcb89/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:1033 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/21791b9b344da8b052097bc3f6be11ec8238d51625fab3e6901854f679a950ba/userdata/shm DeviceMajor:0 DeviceMinor:609 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f046860d-2d54-4746-8ba2-f8e90fa55e38/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:247 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~projected/kube-api-access-8d57k DeviceMajor:0 DeviceMinor:254 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-584 DeviceMajor:0 DeviceMinor:584 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a56a69cfc23cf8add77dfc1a237e33143ff59495f1a2048a86a1759c1954faee/userdata/shm DeviceMajor:0 DeviceMinor:845 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f53bc282-5937-49ac-ac98-2ee37ccb268d/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:602 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c1854ea4-c8e2-4289-84b6-1f18b2ac684f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/45e8b72b-564c-4bb1-b911-baff2d6c87ad/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:601 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-739 DeviceMajor:0 DeviceMinor:739 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-1029 DeviceMajor:0 DeviceMinor:1029 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-913 DeviceMajor:0 DeviceMinor:913 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-799 DeviceMajor:0 DeviceMinor:799 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-939 DeviceMajor:0 DeviceMinor:939 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/volumes/kubernetes.io~projected/kube-api-access-s69rd DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7e451189-850e-4d19-a40c-40f642d08511/volumes/kubernetes.io~projected/kube-api-access-snmpq DeviceMajor:0 DeviceMinor:407 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c7f5e6cd-e093-409a-8758-d3db7a7eb32c/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:354 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-663 DeviceMajor:0 DeviceMinor:663 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4fea9b05-222e-4b58-95c8-735fc1cf3a8b/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:467 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1155 DeviceMajor:0 DeviceMinor:1155 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/972f78b16f9b47d7c2f794578bdc02049da4958a2a754da51e80ca31806a5573/userdata/shm DeviceMajor:0 DeviceMinor:109 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/07c77bfc7ae9afd423082de8b582bc56bb7322d1ee441fcf749b3f543d9311b5/userdata/shm DeviceMajor:0 DeviceMinor:372 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a18b9230-de78-41b8-a61e-361b8bb1fbb3/volumes/kubernetes.io~projected/kube-api-access-8crkc DeviceMajor:0 DeviceMinor:518 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-565 DeviceMajor:0 DeviceMinor:565 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-649 DeviceMajor:0 DeviceMinor:649 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-81 DeviceMajor:0 DeviceMinor:81 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-160 DeviceMajor:0 DeviceMinor:160 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/91b2899e-8d24-41a0-bec8-d11c67b8f955/volumes/kubernetes.io~secret/stats-auth DeviceMajor:0 DeviceMinor:831 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1163 DeviceMajor:0 DeviceMinor:1163 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/volumes/kubernetes.io~projected/kube-api-access-lpdk5 DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-586 DeviceMajor:0 DeviceMinor:586 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f53bc282-5937-49ac-ac98-2ee37ccb268d/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:603 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bc3668412459475b58df22c5952b6fe210803ae27cac46ab11b8236701860e95/userdata/shm DeviceMajor:0 DeviceMinor:582 
Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-446 DeviceMajor:0 DeviceMinor:446 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-777 DeviceMajor:0 DeviceMinor:777 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-521 DeviceMajor:0 DeviceMinor:521 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/47eadda0-35a6-4b5c-a96c-24854be15098/volumes/kubernetes.io~secret/tls-certificates DeviceMajor:0 DeviceMinor:837 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1db4d695-5a6a-4fbe-b610-3777bfebed79/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1075 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1db4d695-5a6a-4fbe-b610-3777bfebed79/volumes/kubernetes.io~secret/openshift-state-metrics-tls DeviceMajor:0 DeviceMinor:1074 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5e3ddf9e-eeb5-4266-b675-092fd4e27623/volumes/kubernetes.io~projected/kube-api-access-xd6vv DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c2a23d24-9e09-431e-8c3b-8456ff51a8d0/volumes/kubernetes.io~projected/kube-api-access-btdjt DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-859 DeviceMajor:0 DeviceMinor:859 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-341 DeviceMajor:0 DeviceMinor:341 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d48534fe1c98270494577c8d49aed8602c14ccc175395517708a7b89389db471/userdata/shm DeviceMajor:0 DeviceMinor:339 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a25248c0-8de7-4624-b785-f053665fcb23/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1080 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:425 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4ddac301-a604-4f07-8849-5928befd336e/volumes/kubernetes.io~secret/certs DeviceMajor:0 DeviceMinor:879 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-899 DeviceMajor:0 DeviceMinor:899 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b77a1ff885b1de50033d17a32311fabaefba7e39efce419b16febb35e5db2498/userdata/shm DeviceMajor:0 DeviceMinor:235 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fd67ef0d5263721af2c9563f40007bfa8118e9b8e56c96c99f48f239bd0b7044/userdata/shm DeviceMajor:0 DeviceMinor:416 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6cc2d27a03b36826decc5cc4343612194df412f00fd1e83d62bd9da95cdaba5c/userdata/shm DeviceMajor:0 DeviceMinor:464 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466/volumes/kubernetes.io~projected/kube-api-access-gkccn DeviceMajor:0 DeviceMinor:901 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f8c21c05090492f9afafa02ead2a469af0d1260ed484823064a0610864bf15d8/userdata/shm DeviceMajor:0 DeviceMinor:431 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-107 DeviceMajor:0 DeviceMinor:107 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1027 DeviceMajor:0 DeviceMinor:1027 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ad692349-5089-4afc-85b2-9b6e7997567c/volumes/kubernetes.io~projected/kube-api-access-h6mb5 DeviceMajor:0 DeviceMinor:102 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4d1e15f7704367048583514d8253b668db81d2da2b51801dfb366fd0e7170679/userdata/shm DeviceMajor:0 DeviceMinor:281 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-655 DeviceMajor:0 DeviceMinor:655 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c1854ea4-c8e2-4289-84b6-1f18b2ac684f/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:228 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3fd8857a2c2302ef6094cc86660ac7c76904e438bf13dc986cbdd24f1c5f29f3/userdata/shm DeviceMajor:0 DeviceMinor:361 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-862 DeviceMajor:0 DeviceMinor:862 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-809 DeviceMajor:0 DeviceMinor:809 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/91607e86857f069937668c70d7500f8b27325b07177f03b7a7ab831b3600ac2a/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3f471ecc-922c-4cb1-9bdd-fdb5da08c592/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:435 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-545 DeviceMajor:0 DeviceMinor:545 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-841 DeviceMajor:0 DeviceMinor:841 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9a8426b4146cf2f000b30dafab20a28003c24ae65a16f62f890c575d2f9770d9/userdata/shm DeviceMajor:0 DeviceMinor:237 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:437 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-723 DeviceMajor:0 DeviceMinor:723 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-806 DeviceMajor:0 DeviceMinor:806 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f1468ec0-2aa4-461c-a62f-e9f067be490f/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1077 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97/volumes/kubernetes.io~projected/kube-api-access-wrws5 DeviceMajor:0 DeviceMinor:271 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-525 DeviceMajor:0 DeviceMinor:525 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/df428d5a-c722-4536-8e7f-cdd85c560481/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:607 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e9d5f349b622bea576ae3dd04cdf2c2da1c82af6b9e42a0b5011a9e0e2cc47e6/userdata/shm DeviceMajor:0 DeviceMinor:882 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6860ec0c6307c0854099262d2b68eee9cef0172599ec80b28a89c6d016fb4071/userdata/shm DeviceMajor:0 DeviceMinor:972 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-559 DeviceMajor:0 DeviceMinor:559 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-791 DeviceMajor:0 DeviceMinor:791 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b98b4efc-6117-487f-9cfc-38ce66dd9570/volumes/kubernetes.io~projected/kube-api-access-6c5ch DeviceMajor:0 DeviceMinor:117 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-519 DeviceMajor:0 DeviceMinor:519 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-885 DeviceMajor:0 DeviceMinor:885 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-922 DeviceMajor:0 DeviceMinor:922 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-789 DeviceMajor:0 DeviceMinor:789 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1165 DeviceMajor:0 DeviceMinor:1165 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c2a23d24-9e09-431e-8c3b-8456ff51a8d0/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-393 DeviceMajor:0 DeviceMinor:393 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a69e8d3a-a0b1-4688-8631-d9f265aa4c69/volumes/kubernetes.io~projected/kube-api-access-c6p8v DeviceMajor:0 DeviceMinor:1160 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-956 DeviceMajor:0 DeviceMinor:956 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2bf90db0-f943-464c-8599-e36b4fc32e1c/volumes/kubernetes.io~projected/kube-api-access-qns9g DeviceMajor:0 DeviceMinor:360 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-631 DeviceMajor:0 DeviceMinor:631 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d3032285e1cfcfd919da168e10b18ee5ee2720e85e2457d64bfd97de17bf8050/userdata/shm DeviceMajor:0 DeviceMinor:756 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1051 DeviceMajor:0 DeviceMinor:1051 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1053 DeviceMajor:0 DeviceMinor:1053 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b8d362586f6fb451267fc09fb3ebee6bcd03ae4f33abd963a5893c4d677a7fc5/userdata/shm DeviceMajor:0 DeviceMinor:447 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f91d1788-027d-432b-be33-ca952a95046a/volumes/kubernetes.io~secret/prometheus-operator-tls DeviceMajor:0 DeviceMinor:823 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-88 DeviceMajor:0 DeviceMinor:88 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-816 DeviceMajor:0 DeviceMinor:816 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/ee3cc021-67d8-4b7f-b443-16f18228712e/volumes/kubernetes.io~projected/kube-api-access-7l4b7 DeviceMajor:0 DeviceMinor:263 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/96de6024-e20f-4b52-9294-b330d65e4153/volumes/kubernetes.io~projected/kube-api-access-z8bxz DeviceMajor:0 DeviceMinor:367 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-551 DeviceMajor:0 DeviceMinor:551 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bbc0b783-28d5-4554-b49d-c66082546f44/volumes/kubernetes.io~projected/kube-api-access-27qvw DeviceMajor:0 DeviceMinor:246 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dc885deb2f8a42b2a9647ca9a7a52b0dd31aa93757cc5e47ba821f5cab2f46b8/userdata/shm DeviceMajor:0 DeviceMinor:252 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-311 DeviceMajor:0 DeviceMinor:311 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-469 DeviceMajor:0 DeviceMinor:469 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a25248c0-8de7-4624-b785-f053665fcb23/volumes/kubernetes.io~secret/kube-state-metrics-tls DeviceMajor:0 DeviceMinor:1081 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b4291bfd-53d9-4c78-b7cb-d7eb46560528/volumes/kubernetes.io~projected/kube-api-access-9nvl4 DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a57854ac-809a-4745-aaa1-774f0a08a560/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:245 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-300 DeviceMajor:0 DeviceMinor:300 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2f844652-225b-4713-a9ad-cf9bcc348f47/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:386 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/75e3e2cc-aa56-41f3-8859-1c086f419d05/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/aa16c3bf-2350-46d1-afa0-9477b3ec8877/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/325f0a83-d56d-4b62-977b-088a7d5f0e00/volumes/kubernetes.io~projected/kube-api-access-lqdlf DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-370 DeviceMajor:0 DeviceMinor:370 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/49245723e92395f35c0b36240f7d6fbf94cef777cdd886e69b77ece8cf42ea5f/userdata/shm DeviceMajor:0 DeviceMinor:847 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-72 DeviceMajor:0 DeviceMinor:72 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-793 DeviceMajor:0 DeviceMinor:793 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-462 DeviceMajor:0 DeviceMinor:462 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:600 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/run/containers/storage/overlay-containers/4308310cb66871b8d038228451f6cb347ae9fe6f0a69349b4baf59dc1b20775d/userdata/shm DeviceMajor:0 DeviceMinor:773 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/948f733f9e7fc399ff3028ac75f39dbd9ac2f6622b269cc750e23eb9c88dedb1/userdata/shm DeviceMajor:0 DeviceMinor:612 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:907 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a638c468-010c-4da3-ad62-26f5f2bbdbb9/volumes/kubernetes.io~projected/kube-api-access-sdbds DeviceMajor:0 DeviceMinor:1035 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1096 DeviceMajor:0 DeviceMinor:1096 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-391 DeviceMajor:0 DeviceMinor:391 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ef2664ab7688b2c180ba8801467801f25c2ef381f1cb268907fa44549876a809/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2f844652-225b-4713-a9ad-cf9bcc348f47/volumes/kubernetes.io~projected/kube-api-access-jdwvw DeviceMajor:0 DeviceMinor:387 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-501 DeviceMajor:0 DeviceMinor:501 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/989d132822ac99b97c52492bc7539dcc4d25a3a8fbced6fed73e66c9b3f74f8d/userdata/shm DeviceMajor:0 DeviceMinor:548 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-768 DeviceMajor:0 DeviceMinor:768 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-132 DeviceMajor:0 DeviceMinor:132 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-302 DeviceMajor:0 DeviceMinor:302 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c0142d4e-9fd4-4375-a773-bb89b38af654/volumes/kubernetes.io~projected/kube-api-access-w2zzd DeviceMajor:0 DeviceMinor:315 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a9a9ecf2-c476-4962-8333-21f242dbcb89/volumes/kubernetes.io~projected/kube-api-access-d5v9f DeviceMajor:0 DeviceMinor:1046 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/fb0fc10f-5796-4cd5-b8f5-72d678054c24/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:138 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/68252533-bd64-4fc5-838a-cc350cbe77f0/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/919c09e620d76c77a9425f79c1d3a73bcd978ef6fd46e99de901376458ec1273/userdata/shm DeviceMajor:0 DeviceMinor:239 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-533 DeviceMajor:0 DeviceMinor:533 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5a9e26d5feffb5d032a36611c5da4c454dba6454147ff1cf841070d4b4feeb67/userdata/shm DeviceMajor:0 DeviceMinor:842 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-324 DeviceMajor:0 DeviceMinor:324 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5d21afac0935094080df835d139dd20efdf51fad3502782c60bb38ee7294f13b/userdata/shm DeviceMajor:0 DeviceMinor:241 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/f826050a5c784de5b331265098c5bf62987ce32e6de55cf05cab0f3f76894ea8/userdata/shm DeviceMajor:0 DeviceMinor:262 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3de37144-a9ab-45fb-a23f-2287a5198edf/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:567 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/91b2899e-8d24-41a0-bec8-d11c67b8f955/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:833 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ae39c09b-7aef-4615-8ced-0dcad39f23a5/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:803 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-679 DeviceMajor:0 DeviceMinor:679 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1069 DeviceMajor:0 DeviceMinor:1069 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f63327bd315d50a90d1b9877abdb96105e01d6fdb0e17116424c13ecbe62dbc3/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-382 DeviceMajor:0 DeviceMinor:382 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/aa16c3bf-2350-46d1-afa0-9477b3ec8877/volumes/kubernetes.io~projected/kube-api-access-qmndg DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fee33178d398a85728734b8702eecb787d89c780d680fd9fa904a7591c14e420/userdata/shm DeviceMajor:0 DeviceMinor:344 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1058 DeviceMajor:0 DeviceMinor:1058 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-770 DeviceMajor:0 DeviceMinor:770 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ad692349-5089-4afc-85b2-9b6e7997567c/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:98 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-637 DeviceMajor:0 DeviceMinor:637 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-714 DeviceMajor:0 DeviceMinor:714 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-914 DeviceMajor:0 DeviceMinor:914 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-639 DeviceMajor:0 DeviceMinor:639 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3de37144-a9ab-45fb-a23f-2287a5198edf/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:573 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/02b8b46e9f6cf48ded279c24ec1e51a94bbe25b122e72584be4a8549a6a9d74b/userdata/shm DeviceMajor:0 DeviceMinor:441 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:257 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-396 DeviceMajor:0 DeviceMinor:396 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/51fe4cded0c2312a6b3bfd1f48ff9089513eaa05b60208bd1ced0f39d8ffe36e/userdata/shm DeviceMajor:0 DeviceMinor:265 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4fea9b05-222e-4b58-95c8-735fc1cf3a8b/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:466 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-50 DeviceMajor:0 DeviceMinor:50 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-363 DeviceMajor:0 DeviceMinor:363 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e9d1c009ab8bfc1b5d9e5ffc8e765a713719b93c5edffb019bac7a72776dcb9e/userdata/shm DeviceMajor:0 DeviceMinor:1047 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a57854ac-809a-4745-aaa1-774f0a08a560/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:267 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f1468ec0-2aa4-461c-a62f-e9f067be490f/volumes/kubernetes.io~projected/kube-api-access-bqmv5 DeviceMajor:0 DeviceMinor:1086 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a69e8d3a-a0b1-4688-8631-d9f265aa4c69/volumes/kubernetes.io~secret/secret-metrics-client-certs DeviceMajor:0 DeviceMinor:1153 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/248a3d2f-3be4-46bf-959c-79d28736c0d6/volumes/kubernetes.io~projected/kube-api-access-mtc2p DeviceMajor:0 DeviceMinor:127 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/46de2acc-9f5d-4ecf-befe-a480f86466f5/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:575 Capacity:32475529216 
Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f2217de0-7805-4f5f-8ea5-93b81b7e0236/volumes/kubernetes.io~projected/kube-api-access-82x7p DeviceMajor:0 DeviceMinor:912 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-730 DeviceMajor:0 DeviceMinor:730 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c1a730f8ab11fc1de19f12e0bc73ab4daa97d0d1b64ae152032167057be533ed/userdata/shm DeviceMajor:0 DeviceMinor:618 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/91b2899e-8d24-41a0-bec8-d11c67b8f955/volumes/kubernetes.io~secret/default-certificate DeviceMajor:0 DeviceMinor:832 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/630f3ef68fb2ab037a83499120027474c94dfe12bf91c1a5c52579bd6c878cbf/userdata/shm DeviceMajor:0 DeviceMinor:839 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-875 DeviceMajor:0 DeviceMinor:875 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1180 DeviceMajor:0 DeviceMinor:1180 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-326 DeviceMajor:0 DeviceMinor:326 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0e8cf476b590f62cf998ca26bf5d07d7ec24559896e92dec128d4a67e132990a/userdata/shm DeviceMajor:0 DeviceMinor:104 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-686 DeviceMajor:0 DeviceMinor:686 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b543f82e-683d-47c1-af73-4dcede4cf4df/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:825 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-364 DeviceMajor:0 DeviceMinor:364 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/run/containers/storage/overlay-containers/48ea9b1e1ed051eaf5386ce4d24d2d55f57d357f51f1c79f94723fc2aed83c0f/userdata/shm DeviceMajor:0 DeviceMinor:622 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-934 DeviceMajor:0 DeviceMinor:934 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ef4e9117db9997ed5777e4eb0ac97e214bfa4a8d0f2ec4d63af464bd290bf782/userdata/shm DeviceMajor:0 DeviceMinor:784 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-422 DeviceMajor:0 DeviceMinor:422 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-718 DeviceMajor:0 DeviceMinor:718 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1018 DeviceMajor:0 DeviceMinor:1018 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7b489385-2c96-4a97-8b31-362162de020e/volumes/kubernetes.io~projected/kube-api-access-pg6th DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:434 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6a80bd6f-2263-4251-8197-5173193f8afc/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:608 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1375da42-ecaf-4d86-b554-25fd1c3d00bd/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:728 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5e3ddf9e-eeb5-4266-b675-092fd4e27623/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 
DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/12e1d9e5-96b5-4367-81a5-d87b3f93d8da/volumes/kubernetes.io~projected/kube-api-access-g2qf7 DeviceMajor:0 DeviceMinor:544 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-42 DeviceMajor:0 DeviceMinor:42 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/813f91c2-2b37-4681-968d-4217e286e22f/volumes/kubernetes.io~projected/kube-api-access-njjkt DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/acb704a9-6c8d-4378-ae93-e7095b1fce85/volumes/kubernetes.io~projected/kube-api-access-xvx6w DeviceMajor:0 DeviceMinor:260 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/21bebade-17fa-444e-92a9-eea53d6cd673/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:909 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d88ba8e1-ee42-423f-9839-e71cb0265c6c/volumes/kubernetes.io~projected/kube-api-access-lw4np DeviceMajor:0 DeviceMinor:749 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-477 DeviceMajor:0 DeviceMinor:477 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/70020125-af49-47d7-8853-fb951c561dc4/volumes/kubernetes.io~projected/kube-api-access-9s2tb DeviceMajor:0 DeviceMinor:556 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/16961d83ade56434cc5d6847f42954f60a19529cc2fd922d6eec9e1e749ea461/userdata/shm DeviceMajor:0 DeviceMinor:282 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-408 DeviceMajor:0 DeviceMinor:408 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1125 DeviceMajor:0 DeviceMinor:1125 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/var/lib/kubelet/pods/a69e8d3a-a0b1-4688-8631-d9f265aa4c69/volumes/kubernetes.io~secret/client-ca-bundle DeviceMajor:0 DeviceMinor:1158 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fb1900c9365a08f03e6fa39a54bd69429d8cedb421e2b0d1e4f977c7a6faf417/userdata/shm DeviceMajor:0 DeviceMinor:89 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/de6078d7-2aad-46fe-b17a-b6b38e4eaa41/volumes/kubernetes.io~projected/kube-api-access-66kz7 DeviceMajor:0 DeviceMinor:805 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1123 DeviceMajor:0 DeviceMinor:1123 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f5782718-9118-4682-a287-7998cd0304b3/volumes/kubernetes.io~projected/kube-api-access-bbvtp DeviceMajor:0 DeviceMinor:657 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-377 DeviceMajor:0 DeviceMinor:377 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1106 DeviceMajor:0 DeviceMinor:1106 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-765 DeviceMajor:0 DeviceMinor:765 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-111 DeviceMajor:0 DeviceMinor:111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7/volumes/kubernetes.io~projected/kube-api-access-fxg4z DeviceMajor:0 DeviceMinor:103 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/1cdce851761fa32d1bf31f749f94743d1cc0bca1a250ad08c114ccc4c2f77b9b/userdata/shm DeviceMajor:0 DeviceMinor:105 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-968 DeviceMajor:0 DeviceMinor:968 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-331 DeviceMajor:0 DeviceMinor:331 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-675 DeviceMajor:0 DeviceMinor:675 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-729 DeviceMajor:0 DeviceMinor:729 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97/volumes/kubernetes.io~projected/kube-api-access-6wl7f DeviceMajor:0 DeviceMinor:838 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0885038f20b8301926b3fbf1704c402c0cf75d3a9c64af977d4466bc2f1fe1b6/userdata/shm DeviceMajor:0 DeviceMinor:1049 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-851 DeviceMajor:0 DeviceMinor:851 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7c4e7e57-43be-4d31-b523-f7e4d316dce3/volumes/kubernetes.io~projected/kube-api-access-bvjkr DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-775 DeviceMajor:0 DeviceMinor:775 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/49cf279e25f73f2651832183fcd1e966f6a074b295d2f5e34a59a5be738d2377/userdata/shm DeviceMajor:0 DeviceMinor:272 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-651 DeviceMajor:0 DeviceMinor:651 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-747 DeviceMajor:0 DeviceMinor:747 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-157 DeviceMajor:0 DeviceMinor:157 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-406 DeviceMajor:0 DeviceMinor:406 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-439 DeviceMajor:0 DeviceMinor:439 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/89ef9ca9615c3facb086051e9be1597bf7ab0b883ad4dc7992533c4b3196f4e1/userdata/shm DeviceMajor:0 DeviceMinor:1092 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/fa759777-de22-4440-a3d3-ad429a3b8e7b/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:249 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3f9c2dbd6bdf8182b597345f8c7fea11c09d5e650fe0f55bf00a3c9f8887aa52/userdata/shm DeviceMajor:0 DeviceMinor:442 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9af8ad1671806bd505a390180b2388970cc534101cf6a5cf64e76e13311c0cb9/userdata/shm DeviceMajor:0 DeviceMinor:268 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-529 DeviceMajor:0 DeviceMinor:529 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/21bebade-17fa-444e-92a9-eea53d6cd673/volumes/kubernetes.io~projected/kube-api-access-zsht7 DeviceMajor:0 DeviceMinor:910 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1038dded4ac6146a3ef7e05fc425b32ac120e0351ec2aaee7b8ebe45679034dd/userdata/shm DeviceMajor:0 DeviceMinor:323 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/75e3e2cc-aa56-41f3-8859-1c086f419d05/volumes/kubernetes.io~projected/kube-api-access-clz6w DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-289 
DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-563 DeviceMajor:0 DeviceMinor:563 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-812 DeviceMajor:0 DeviceMinor:812 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/86cb5d23-df7f-4f67-8086-1789d8e68544/volumes/kubernetes.io~projected/kube-api-access-j7k8m DeviceMajor:0 DeviceMinor:227 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c593e31d-82b5-4d42-992e-6b050ccf3019/volumes/kubernetes.io~projected/kube-api-access-gxmkh DeviceMajor:0 DeviceMinor:80 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1175 DeviceMajor:0 DeviceMinor:1175 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-592 DeviceMajor:0 DeviceMinor:592 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-997 DeviceMajor:0 DeviceMinor:997 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1119 DeviceMajor:0 DeviceMinor:1119 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/48eeec4a0ddb4a6c0b04997cccaac081db5581507160a9804ec767ba66cbdb34/userdata/shm DeviceMajor:0 DeviceMinor:449 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4fea9b05-222e-4b58-95c8-735fc1cf3a8b/volumes/kubernetes.io~projected/kube-api-access-qstvb DeviceMajor:0 DeviceMinor:514 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-527 DeviceMajor:0 DeviceMinor:527 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-647 DeviceMajor:0 DeviceMinor:647 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/760e18ae3700547257879a5def4bd8a3e6845f819368981331d76400fe97af9e/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9786d6c6139bd8ce0f2d0b4a8dd76068f202fd2531fb86bd7b58ce95ed53f64f/userdata/shm DeviceMajor:0 DeviceMinor:1161 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1003 DeviceMajor:0 DeviceMinor:1003 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/df428d5a-c722-4536-8e7f-cdd85c560481/volumes/kubernetes.io~projected/kube-api-access-dtcnq DeviceMajor:0 DeviceMinor:255 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-274 DeviceMajor:0 DeviceMinor:274 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-345 DeviceMajor:0 DeviceMinor:345 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-852 DeviceMajor:0 DeviceMinor:852 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-994 DeviceMajor:0 DeviceMinor:994 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a49d9b3d48753f2f078ac0130c6af5ca5c251ec6fd9f68a980d833a929f8987e/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-523 DeviceMajor:0 DeviceMinor:523 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f1468ec0-2aa4-461c-a62f-e9f067be490f/volumes/kubernetes.io~secret/node-exporter-tls DeviceMajor:0 DeviceMinor:1078 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1098 DeviceMajor:0 DeviceMinor:1098 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/var/lib/kubelet/pods/4ddac301-a604-4f07-8849-5928befd336e/volumes/kubernetes.io~secret/node-bootstrap-token DeviceMajor:0 DeviceMinor:878 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-460 DeviceMajor:0 DeviceMinor:460 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f5782718-9118-4682-a287-7998cd0304b3/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:580 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-929 DeviceMajor:0 DeviceMinor:929 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ddcba86a9171baf183956cb4886e6b2242be15b32412346569edbcdcef731dfb/userdata/shm DeviceMajor:0 DeviceMinor:1090 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-974 DeviceMajor:0 DeviceMinor:974 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-352 DeviceMajor:0 DeviceMinor:352 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-750 DeviceMajor:0 DeviceMinor:750 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5702154693e32d84807189cf18ed2f8ceb28029864edaaaff188dc529b9551c9/userdata/shm DeviceMajor:0 DeviceMinor:316 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-452 DeviceMajor:0 DeviceMinor:452 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e78425b2d05a4bc50aa45d6948d8f1a6996398b351024eb248c91be99ea577d1/userdata/shm DeviceMajor:0 DeviceMinor:561 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1375da42-ecaf-4d86-b554-25fd1c3d00bd/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:755 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/1db4d695-5a6a-4fbe-b610-3777bfebed79/volumes/kubernetes.io~projected/kube-api-access-whmmk DeviceMajor:0 DeviceMinor:1076 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3de37144-a9ab-45fb-a23f-2287a5198edf/volumes/kubernetes.io~projected/kube-api-access-zr8br DeviceMajor:0 DeviceMinor:572 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/91b2899e-8d24-41a0-bec8-d11c67b8f955/volumes/kubernetes.io~projected/kube-api-access-hqtvp DeviceMajor:0 DeviceMinor:834 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-814 DeviceMajor:0 DeviceMinor:814 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f91d1788-027d-432b-be33-ca952a95046a/volumes/kubernetes.io~projected/kube-api-access-lnm6c DeviceMajor:0 DeviceMinor:792 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/654b5b1c-2764-415c-bb13-aa06899f4076/volumes/kubernetes.io~projected/kube-api-access-xcp8t DeviceMajor:0 DeviceMinor:937 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-95 DeviceMajor:0 DeviceMinor:95 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:0262d134f60647f MacAddress:ba:9c:86:9f:7d:46 Speed:10000 Mtu:8900} {Name:02b8b46e9f6cf48 MacAddress:76:b5:7d:86:74:63 Speed:10000 Mtu:8900} {Name:07c77bfc7ae9afd MacAddress:ea:e0:aa:8f:d3:84 Speed:10000 Mtu:8900} {Name:0885038f20b8301 MacAddress:d6:48:7a:e3:a6:c0 Speed:10000 Mtu:8900} {Name:0e8cf476b590f62 
MacAddress:fa:1d:8b:fc:60:32 Speed:10000 Mtu:8900} {Name:1038dded4ac6146 MacAddress:5a:1f:c4:39:44:9f Speed:10000 Mtu:8900} {Name:16961d83ade5643 MacAddress:16:eb:94:fc:a4:cd Speed:10000 Mtu:8900} {Name:21791b9b344da8b MacAddress:3a:76:ce:6b:09:75 Speed:10000 Mtu:8900} {Name:285790bb4eeaea0 MacAddress:7a:67:6c:ec:5e:ac Speed:10000 Mtu:8900} {Name:3092eb7a1622039 MacAddress:be:44:dc:86:c5:5b Speed:10000 Mtu:8900} {Name:3f9c2dbd6bdf818 MacAddress:ce:98:03:d0:5f:14 Speed:10000 Mtu:8900} {Name:3fd8857a2c2302e MacAddress:fa:2c:a7:d4:d8:99 Speed:10000 Mtu:8900} {Name:4308310cb66871b MacAddress:ae:74:0e:f5:0f:db Speed:10000 Mtu:8900} {Name:46a12190f11c7c4 MacAddress:2e:6c:63:06:86:8b Speed:10000 Mtu:8900} {Name:48ea9b1e1ed051e MacAddress:66:91:30:45:f0:c1 Speed:10000 Mtu:8900} {Name:49245723e92395f MacAddress:96:c5:31:cd:af:5a Speed:10000 Mtu:8900} {Name:4d1e15f77043670 MacAddress:86:5e:d6:fa:87:f3 Speed:10000 Mtu:8900} {Name:4d57d4740bcb6c8 MacAddress:d2:30:bc:30:ed:0a Speed:10000 Mtu:8900} {Name:51fe4cded0c2312 MacAddress:86:b0:6e:70:0a:d9 Speed:10000 Mtu:8900} {Name:5702154693e32d8 MacAddress:da:b5:73:bf:78:00 Speed:10000 Mtu:8900} {Name:5a9e26d5feffb5d MacAddress:72:0c:9e:e6:bc:3c Speed:10000 Mtu:8900} {Name:5d21afac0935094 MacAddress:6e:ab:09:f1:f7:a2 Speed:10000 Mtu:8900} {Name:60bbe3130a4d334 MacAddress:ae:70:51:da:6d:27 Speed:10000 Mtu:8900} {Name:6cc2d27a03b3682 MacAddress:ba:97:e9:14:64:4c Speed:10000 Mtu:8900} {Name:8a073909baddafc MacAddress:ee:24:4f:06:50:fd Speed:10000 Mtu:8900} {Name:8a61c21711f690c MacAddress:4a:c1:07:77:ef:84 Speed:10000 Mtu:8900} {Name:919c09e620d76c7 MacAddress:86:f0:53:69:aa:b5 Speed:10000 Mtu:8900} {Name:948f733f9e7fc39 MacAddress:7e:a7:54:9c:54:4e Speed:10000 Mtu:8900} {Name:9786d6c6139bd8c MacAddress:9a:95:26:95:ab:1e Speed:10000 Mtu:8900} {Name:989d132822ac99b MacAddress:0a:14:42:d9:3d:85 Speed:10000 Mtu:8900} {Name:9a8426b4146cf2f MacAddress:be:2f:3c:a0:ce:1e Speed:10000 Mtu:8900} {Name:9af8ad1671806bd MacAddress:06:02:82:e8:9c:12 
Speed:10000 Mtu:8900} {Name:a07ae992a492956 MacAddress:d6:8e:28:5c:1a:67 Speed:10000 Mtu:8900} {Name:a56a69cfc23cf8a MacAddress:96:06:8e:5b:cd:9f Speed:10000 Mtu:8900} {Name:ac63789abc1163c MacAddress:82:97:96:15:e7:1d Speed:10000 Mtu:8900} {Name:b77a1ff885b1de5 MacAddress:66:c5:28:32:84:9f Speed:10000 Mtu:8900} {Name:b8d362586f6fb45 MacAddress:02:76:36:19:a2:ef Speed:10000 Mtu:8900} {Name:bc3668412459475 MacAddress:4a:41:77:3c:fb:17 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:7a:3a:b9:64:e3:6b Speed:0 Mtu:8900} {Name:c1a730f8ab11fc1 MacAddress:12:04:28:60:74:4d Speed:10000 Mtu:8900} {Name:cadfc06b46a2370 MacAddress:8e:07:e6:48:cb:ed Speed:10000 Mtu:8900} {Name:d1f4c3462eb562d MacAddress:3e:af:14:80:2a:fd Speed:10000 Mtu:8900} {Name:d48534fe1c98270 MacAddress:26:2e:c1:c1:9a:0e Speed:10000 Mtu:8900} {Name:dc885deb2f8a42b MacAddress:5a:90:d2:4f:6f:ba Speed:10000 Mtu:8900} {Name:ddcba86a9171baf MacAddress:aa:01:f8:0b:9f:d1 Speed:10000 Mtu:8900} {Name:df1df4af888713c MacAddress:9e:bf:2f:1f:49:4c Speed:10000 Mtu:8900} {Name:df43cdf08fb65d0 MacAddress:ea:04:6a:10:8e:3f Speed:10000 Mtu:8900} {Name:e9d1c009ab8bfc1 MacAddress:06:17:bd:c1:7d:8e Speed:10000 Mtu:8900} {Name:ead213f06d6e13b MacAddress:b6:30:d8:a0:86:1a Speed:10000 Mtu:8900} {Name:ec3f7a57e8d7aa7 MacAddress:5e:54:f2:bc:0f:eb Speed:10000 Mtu:8900} {Name:edc62dc83d0212a MacAddress:c2:af:3d:0d:ee:36 Speed:10000 Mtu:8900} {Name:ef4e9117db9997e MacAddress:fa:a9:55:86:b8:a0 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:c7:39:2c Speed:-1 Mtu:9000} {Name:f826050a5c784de MacAddress:c2:28:ce:1c:17:22 Speed:10000 Mtu:8900} {Name:f8c21c05090492f MacAddress:12:9f:45:72:a2:cd Speed:10000 Mtu:8900} {Name:fd67ef0d5263721 MacAddress:da:c6:cd:95:df:8f Speed:10000 Mtu:8900} {Name:fee33178d398a85 MacAddress:42:41:97:b0:dd:dd Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 
Mtu:8900} {Name:ovs-system MacAddress:52:c7:57:df:ad:1d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 
Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 20 08:40:55.014034 master-0 kubenswrapper[18707]: I0320 08:40:55.012946 18707 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 20 08:40:55.014034 master-0 kubenswrapper[18707]: I0320 08:40:55.013024 18707 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 20 08:40:55.014034 master-0 kubenswrapper[18707]: I0320 08:40:55.013570 18707 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 08:40:55.014034 master-0 kubenswrapper[18707]: I0320 08:40:55.013757 18707 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 08:40:55.014557 master-0 kubenswrapper[18707]: I0320 08:40:55.013804 18707 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 08:40:55.014557 master-0 kubenswrapper[18707]: I0320 08:40:55.014113 18707 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 08:40:55.014557 master-0 kubenswrapper[18707]: I0320 08:40:55.014126 18707 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 08:40:55.014557 master-0 kubenswrapper[18707]: I0320 08:40:55.014139 18707 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 08:40:55.014557 master-0 kubenswrapper[18707]: I0320 08:40:55.014164 18707 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 08:40:55.014557 master-0 kubenswrapper[18707]: I0320 08:40:55.014274 18707 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 08:40:55.014557 master-0 kubenswrapper[18707]: I0320 08:40:55.014397 18707 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 08:40:55.014557 master-0 kubenswrapper[18707]: I0320 08:40:55.014484 18707 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 08:40:55.014557 master-0 kubenswrapper[18707]: I0320 08:40:55.014503 18707 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 08:40:55.014557 master-0 kubenswrapper[18707]: I0320 08:40:55.014525 18707 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 08:40:55.014557 master-0 kubenswrapper[18707]: I0320 08:40:55.014564 18707 kubelet.go:324] "Adding apiserver pod source"
Mar 20 08:40:55.021793 master-0 kubenswrapper[18707]: I0320 08:40:55.014592 18707 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 08:40:55.022840 master-0 kubenswrapper[18707]: I0320 08:40:55.022807 18707 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 20 08:40:55.023103 master-0 kubenswrapper[18707]: I0320 08:40:55.023053 18707 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1"
Mar 20 08:40:55.023103 master-0 kubenswrapper[18707]: I0320 08:40:55.023298 18707 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 08:40:55.023103 master-0 kubenswrapper[18707]: I0320 08:40:55.023662 18707 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 08:40:55.024580 master-0 kubenswrapper[18707]: I0320 08:40:55.024553 18707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 20 08:40:55.024634 master-0 kubenswrapper[18707]: I0320 08:40:55.024584 18707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 20 08:40:55.024634 master-0 kubenswrapper[18707]: I0320 08:40:55.024603 18707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 20 08:40:55.024634 master-0 kubenswrapper[18707]: I0320 08:40:55.024612 18707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 20 08:40:55.024634 master-0 kubenswrapper[18707]: I0320 08:40:55.024621 18707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 20 08:40:55.024634 master-0 kubenswrapper[18707]: I0320 08:40:55.024632 18707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 20 08:40:55.024884 master-0 kubenswrapper[18707]: I0320 08:40:55.024641 18707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 20 08:40:55.024884 master-0 kubenswrapper[18707]: I0320 08:40:55.024650 18707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 20 08:40:55.024884 master-0 kubenswrapper[18707]: I0320 08:40:55.024662 18707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 20 08:40:55.024884 master-0 kubenswrapper[18707]: I0320 08:40:55.024673 18707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 20 08:40:55.024884 master-0 kubenswrapper[18707]: I0320 08:40:55.024690 18707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 20 08:40:55.024884 master-0 kubenswrapper[18707]: I0320 08:40:55.024703 18707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 20 08:40:55.024884 master-0 kubenswrapper[18707]: I0320 08:40:55.024736 18707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 20 08:40:55.025447 master-0 kubenswrapper[18707]: I0320 08:40:55.025419 18707 server.go:1280] "Started kubelet"
Mar 20 08:40:55.026470 master-0 kubenswrapper[18707]: I0320 08:40:55.026395 18707 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 08:40:55.026470 master-0 kubenswrapper[18707]: I0320 08:40:55.026356 18707 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 08:40:55.026565 master-0 kubenswrapper[18707]: I0320 08:40:55.026519 18707 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 20 08:40:55.026481 master-0 systemd[1]: Started Kubernetes Kubelet.
Mar 20 08:40:55.027875 master-0 kubenswrapper[18707]: I0320 08:40:55.027843 18707 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 08:40:55.034577 master-0 kubenswrapper[18707]: I0320 08:40:55.034525 18707 server.go:449] "Adding debug handlers to kubelet server"
Mar 20 08:40:55.040389 master-0 kubenswrapper[18707]: I0320 08:40:55.040322 18707 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 20 08:40:55.052230 master-0 kubenswrapper[18707]: I0320 08:40:55.052155 18707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 20 08:40:55.052230 master-0 kubenswrapper[18707]: I0320 08:40:55.052238 18707 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 08:40:55.052543 master-0 kubenswrapper[18707]: I0320 08:40:55.052280 18707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-21 08:25:35 +0000 UTC, rotation deadline is 2026-03-21 03:20:22.56891579 +0000 UTC
Mar 20 08:40:55.052543 master-0 kubenswrapper[18707]: I0320 08:40:55.052358 18707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h39m27.516561209s for next certificate rotation
Mar 20 08:40:55.052746 master-0 kubenswrapper[18707]: I0320 08:40:55.052718 18707 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 20 08:40:55.052746 master-0 kubenswrapper[18707]: I0320 08:40:55.052735 18707 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 08:40:55.052877 master-0 kubenswrapper[18707]: I0320 08:40:55.052857 18707 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 20 08:40:55.063435 master-0 kubenswrapper[18707]: I0320 08:40:55.061877 18707 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 20 08:40:55.071648 master-0 kubenswrapper[18707]: I0320 08:40:55.066568 18707 factory.go:55] Registering systemd factory
Mar 20 08:40:55.071648 master-0 kubenswrapper[18707]: I0320 08:40:55.066626 18707 factory.go:221] Registration of the systemd container factory successfully
Mar 20 08:40:55.075119 master-0 kubenswrapper[18707]: I0320 08:40:55.075068 18707 factory.go:153] Registering CRI-O factory
Mar 20 08:40:55.075119 master-0 kubenswrapper[18707]: I0320 08:40:55.075096 18707 factory.go:221] Registration of the crio container factory successfully
Mar 20 08:40:55.075412 master-0 kubenswrapper[18707]: I0320 08:40:55.075111 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0142d4e-9fd4-4375-a773-bb89b38af654" volumeName="kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd" seLinuxMountContext=""
Mar 20 08:40:55.075412 master-0 kubenswrapper[18707]: I0320 08:40:55.075208 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" volumeName="kubernetes.io/secret/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-serving-cert" seLinuxMountContext=""
Mar 20 08:40:55.075412 master-0 kubenswrapper[18707]: I0320 08:40:55.075226 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46de2acc-9f5d-4ecf-befe-a480f86466f5" volumeName="kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-config" seLinuxMountContext=""
Mar 20 08:40:55.075412 master-0 kubenswrapper[18707]: I0320 08:40:55.075239 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b639e578-628e-404d-b759-8b6e84e771d9" volumeName="kubernetes.io/empty-dir/b639e578-628e-404d-b759-8b6e84e771d9-catalog-content" seLinuxMountContext=""
Mar 20 08:40:55.075412 master-0 kubenswrapper[18707]: I0320 08:40:55.075252 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a57854ac-809a-4745-aaa1-774f0a08a560" volumeName="kubernetes.io/projected/a57854ac-809a-4745-aaa1-774f0a08a560-kube-api-access" seLinuxMountContext=""
Mar 20 08:40:55.075412 master-0 kubenswrapper[18707]: I0320 08:40:55.075265 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b543f82e-683d-47c1-af73-4dcede4cf4df" volumeName="kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-webhook-cert" seLinuxMountContext=""
Mar 20 08:40:55.075412 master-0 kubenswrapper[18707]: I0320 08:40:55.075270 18707 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 20 08:40:55.075412 master-0 kubenswrapper[18707]: I0320 08:40:55.075276 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="de6078d7-2aad-46fe-b17a-b6b38e4eaa41" volumeName="kubernetes.io/configmap/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 20 08:40:55.075412 master-0 kubenswrapper[18707]: I0320 08:40:55.075303 18707 factory.go:103] Registering Raw factory
Mar 20 08:40:55.075412 master-0 kubenswrapper[18707]: I0320 08:40:55.075308 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f844652-225b-4713-a9ad-cf9bcc348f47" volumeName="kubernetes.io/projected/2f844652-225b-4713-a9ad-cf9bcc348f47-kube-api-access-jdwvw" seLinuxMountContext=""
Mar 20 08:40:55.075412 master-0 kubenswrapper[18707]: I0320 08:40:55.075327 18707 manager.go:1196] Started watching for new ooms in manager
Mar 20 08:40:55.076372 master-0 kubenswrapper[18707]: I0320 08:40:55.076217 18707 manager.go:319] Starting recovery of all containers
Mar 20 08:40:55.077876 master-0 kubenswrapper[18707]: E0320 08:40:55.077825 18707 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 20 08:40:55.078380 master-0 kubenswrapper[18707]: I0320 08:40:55.075327 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="75e3e2cc-aa56-41f3-8859-1c086f419d05" volumeName="kubernetes.io/configmap/75e3e2cc-aa56-41f3-8859-1c086f419d05-config" seLinuxMountContext=""
Mar 20 08:40:55.078473 master-0 kubenswrapper[18707]: I0320 08:40:55.078400 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fa759777-de22-4440-a3d3-ad429a3b8e7b" volumeName="kubernetes.io/configmap/fa759777-de22-4440-a3d3-ad429a3b8e7b-config" seLinuxMountContext=""
Mar 20 08:40:55.078473 master-0 kubenswrapper[18707]: I0320 08:40:55.078422 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fa759777-de22-4440-a3d3-ad429a3b8e7b" volumeName="kubernetes.io/projected/fa759777-de22-4440-a3d3-ad429a3b8e7b-kube-api-access" seLinuxMountContext=""
Mar 20 08:40:55.078473 master-0 kubenswrapper[18707]: I0320 08:40:55.078436 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02" volumeName="kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-bound-sa-token" seLinuxMountContext=""
Mar 20 08:40:55.078650 master-0 kubenswrapper[18707]: I0320 08:40:55.078475 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f046860d-2d54-4746-8ba2-f8e90fa55e38" volumeName="kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-serving-cert" seLinuxMountContext=""
Mar 20 08:40:55.078650 master-0 kubenswrapper[18707]: I0320 08:40:55.078497 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fa759777-de22-4440-a3d3-ad429a3b8e7b" volumeName="kubernetes.io/secret/fa759777-de22-4440-a3d3-ad429a3b8e7b-serving-cert" seLinuxMountContext=""
Mar 20 08:40:55.078650 master-0 kubenswrapper[18707]: I0320 08:40:55.078512 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1375da42-ecaf-4d86-b554-25fd1c3d00bd" volumeName="kubernetes.io/configmap/1375da42-ecaf-4d86-b554-25fd1c3d00bd-service-ca" seLinuxMountContext=""
Mar 20 08:40:55.078650 master-0 kubenswrapper[18707]: I0320 08:40:55.078549 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1db4d695-5a6a-4fbe-b610-3777bfebed79" volumeName="kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-tls" seLinuxMountContext=""
Mar 20 08:40:55.078650 master-0 kubenswrapper[18707]: I0320 08:40:55.078571 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46de2acc-9f5d-4ecf-befe-a480f86466f5" volumeName="kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-image-import-ca" seLinuxMountContext=""
Mar 20 08:40:55.078650 master-0 kubenswrapper[18707]: I0320 08:40:55.078586 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7e451189-850e-4d19-a40c-40f642d08511" volumeName="kubernetes.io/projected/7e451189-850e-4d19-a40c-40f642d08511-ca-certs" seLinuxMountContext=""
Mar 20 08:40:55.078650 master-0 kubenswrapper[18707]: I0320 08:40:55.078600 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbc0b783-28d5-4554-b49d-c66082546f44" volumeName="kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 20 08:40:55.078650 master-0 kubenswrapper[18707]: I0320 08:40:55.078639 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a80bd6f-2263-4251-8197-5173193f8afc" volumeName="kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs" seLinuxMountContext=""
Mar 20 08:40:55.078650 master-0 kubenswrapper[18707]: I0320 08:40:55.078655 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c7f5e6cd-e093-409a-8758-d3db7a7eb32c" volumeName="kubernetes.io/secret/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-machine-api-operator-tls" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.078672 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ee3cc021-67d8-4b7f-b443-16f18228712e" volumeName="kubernetes.io/projected/ee3cc021-67d8-4b7f-b443-16f18228712e-kube-api-access-7l4b7" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.078731 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f1468ec0-2aa4-461c-a62f-e9f067be490f" volumeName="kubernetes.io/projected/f1468ec0-2aa4-461c-a62f-e9f067be490f-kube-api-access-bqmv5" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.078746 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f53bc282-5937-49ac-ac98-2ee37ccb268d" volumeName="kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-config" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.078759 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" volumeName="kubernetes.io/projected/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-kube-api-access-s69rd" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.078799 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ddac301-a604-4f07-8849-5928befd336e" volumeName="kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-certs" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.078822 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab175f7e-a5e8-4fda-98c9-6d052a221a83" volumeName="kubernetes.io/projected/ab175f7e-a5e8-4fda-98c9-6d052a221a83-kube-api-access-zmrxh" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.078836 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="acb704a9-6c8d-4378-ae93-e7095b1fce85" volumeName="kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.078889 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c593e31d-82b5-4d42-992e-6b050ccf3019" volumeName="kubernetes.io/empty-dir/c593e31d-82b5-4d42-992e-6b050ccf3019-utilities" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.078910 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68252533-bd64-4fc5-838a-cc350cbe77f0" volumeName="kubernetes.io/secret/68252533-bd64-4fc5-838a-cc350cbe77f0-serving-cert" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.078924 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7c4e7e57-43be-4d31-b523-f7e4d316dce3" volumeName="kubernetes.io/projected/7c4e7e57-43be-4d31-b523-f7e4d316dce3-kube-api-access-bvjkr" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.078960 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="91b2899e-8d24-41a0-bec8-d11c67b8f955" volumeName="kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-stats-auth" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.078978 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c593e31d-82b5-4d42-992e-6b050ccf3019" volumeName="kubernetes.io/projected/c593e31d-82b5-4d42-992e-6b050ccf3019-kube-api-access-gxmkh" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.078992 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70020125-af49-47d7-8853-fb951c561dc4" volumeName="kubernetes.io/projected/70020125-af49-47d7-8853-fb951c561dc4-kube-api-access-9s2tb" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079007 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b639e578-628e-404d-b759-8b6e84e771d9" volumeName="kubernetes.io/empty-dir/b639e578-628e-404d-b759-8b6e84e771d9-utilities" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079068 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f53bc282-5937-49ac-ac98-2ee37ccb268d" volumeName="kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079083 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12e1d9e5-96b5-4367-81a5-d87b3f93d8da" volumeName="kubernetes.io/secret/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-metrics-tls" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079119 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" volumeName="kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079137 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a57854ac-809a-4745-aaa1-774f0a08a560" volumeName="kubernetes.io/secret/a57854ac-809a-4745-aaa1-774f0a08a560-serving-cert" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079151 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae39c09b-7aef-4615-8ced-0dcad39f23a5" volumeName="kubernetes.io/secret/ae39c09b-7aef-4615-8ced-0dcad39f23a5-machine-approver-tls" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079164 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb0fc10f-5796-4cd5-b8f5-72d678054c24" volumeName="kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-ovnkube-identity-cm" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079263 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21bebade-17fa-444e-92a9-eea53d6cd673" volumeName="kubernetes.io/projected/21bebade-17fa-444e-92a9-eea53d6cd673-kube-api-access-zsht7" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079282 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b639e578-628e-404d-b759-8b6e84e771d9" volumeName="kubernetes.io/projected/b639e578-628e-404d-b759-8b6e84e771d9-kube-api-access-9zp8f" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079357 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b98b4efc-6117-487f-9cfc-38ce66dd9570" volumeName="kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-binary-copy" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079375 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b98b4efc-6117-487f-9cfc-38ce66dd9570" volumeName="kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-whereabouts-flatfile-configmap" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079388 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="325f0a83-d56d-4b62-977b-088a7d5f0e00" volumeName="kubernetes.io/projected/325f0a83-d56d-4b62-977b-088a7d5f0e00-kube-api-access-lqdlf" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079402 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4291bfd-53d9-4c78-b7cb-d7eb46560528" volumeName="kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079440 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0a17669-a122-44aa-bdda-581bf1fc4649" volumeName="kubernetes.io/empty-dir/c0a17669-a122-44aa-bdda-581bf1fc4649-utilities" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079454 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d88ba8e1-ee42-423f-9839-e71cb0265c6c" volumeName="kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-auth-proxy-config" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079468 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e" volumeName="kubernetes.io/secret/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079480 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1db4d695-5a6a-4fbe-b610-3777bfebed79" volumeName="kubernetes.io/projected/1db4d695-5a6a-4fbe-b610-3777bfebed79-kube-api-access-whmmk" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079524 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3de37144-a9ab-45fb-a23f-2287a5198edf" volumeName="kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 08:40:55.079227 master-0 kubenswrapper[18707]: I0320 08:40:55.079559 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="45e8b72b-564c-4bb1-b911-baff2d6c87ad" volumeName="kubernetes.io/projected/45e8b72b-564c-4bb1-b911-baff2d6c87ad-kube-api-access-9zpkn" seLinuxMountContext=""
Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079599 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a18b9230-de78-41b8-a61e-361b8bb1fbb3" volumeName="kubernetes.io/empty-dir/a18b9230-de78-41b8-a61e-361b8bb1fbb3-tmp" seLinuxMountContext=""
Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079622 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a69e8d3a-a0b1-4688-8631-d9f265aa4c69" volumeName="kubernetes.io/empty-dir/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-audit-log" seLinuxMountContext=""
Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079641 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a25248c0-8de7-4624-b785-f053665fcb23" volumeName="kubernetes.io/projected/a25248c0-8de7-4624-b785-f053665fcb23-kube-api-access-prcgg" seLinuxMountContext=""
Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079682 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a57854ac-809a-4745-aaa1-774f0a08a560" volumeName="kubernetes.io/configmap/a57854ac-809a-4745-aaa1-774f0a08a560-config" seLinuxMountContext=""
Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079698 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02" volumeName="kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls" seLinuxMountContext=""
Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079711 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7" volumeName="kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert" seLinuxMountContext=""
Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079726 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb0fc10f-5796-4cd5-b8f5-72d678054c24" volumeName="kubernetes.io/secret/fb0fc10f-5796-4cd5-b8f5-72d678054c24-webhook-cert" seLinuxMountContext=""
Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079759 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21bebade-17fa-444e-92a9-eea53d6cd673" volumeName="kubernetes.io/secret/21bebade-17fa-444e-92a9-eea53d6cd673-cert" seLinuxMountContext=""
Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079772 18707 reconstruct.go:130] "Volume is marked as uncertain and added into
the actual state" pod="" podName="7e451189-850e-4d19-a40c-40f642d08511" volumeName="kubernetes.io/empty-dir/7e451189-850e-4d19-a40c-40f642d08511-cache" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079782 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0a17669-a122-44aa-bdda-581bf1fc4649" volumeName="kubernetes.io/empty-dir/c0a17669-a122-44aa-bdda-581bf1fc4649-catalog-content" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079792 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d88ba8e1-ee42-423f-9839-e71cb0265c6c" volumeName="kubernetes.io/projected/d88ba8e1-ee42-423f-9839-e71cb0265c6c-kube-api-access-lw4np" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079801 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ee3cc021-67d8-4b7f-b443-16f18228712e" volumeName="kubernetes.io/configmap/ee3cc021-67d8-4b7f-b443-16f18228712e-iptables-alerter-script" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079812 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9a9ecf2-c476-4962-8333-21f242dbcb89" volumeName="kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-client-ca" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079843 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae39c09b-7aef-4615-8ced-0dcad39f23a5" volumeName="kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079858 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="4fea9b05-222e-4b58-95c8-735fc1cf3a8b" volumeName="kubernetes.io/empty-dir/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-cache" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079869 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7b489385-2c96-4a97-8b31-362162de020e" volumeName="kubernetes.io/projected/7b489385-2c96-4a97-8b31-362162de020e-kube-api-access-pg6th" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079881 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a638c468-010c-4da3-ad62-26f5f2bbdbb9" volumeName="kubernetes.io/secret/a638c468-010c-4da3-ad62-26f5f2bbdbb9-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079890 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1854ea4-c8e2-4289-84b6-1f18b2ac684f" volumeName="kubernetes.io/projected/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-kube-api-access" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079923 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42df77ec-94aa-48ba-bb35-7b1f1e8b8e97" volumeName="kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-images" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079935 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aa16c3bf-2350-46d1-afa0-9477b3ec8877" volumeName="kubernetes.io/projected/aa16c3bf-2350-46d1-afa0-9477b3ec8877-kube-api-access-qmndg" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079946 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e3bf8eaf-5f6c-41a6-aaeb-6c921d789466" volumeName="kubernetes.io/secret/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-samples-operator-tls" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079956 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f5782718-9118-4682-a287-7998cd0304b3" volumeName="kubernetes.io/secret/f5782718-9118-4682-a287-7998cd0304b3-proxy-tls" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.079968 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d88ba8e1-ee42-423f-9839-e71cb0265c6c" volumeName="kubernetes.io/secret/d88ba8e1-ee42-423f-9839-e71cb0265c6c-cloud-controller-manager-operator-tls" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080005 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f046860d-2d54-4746-8ba2-f8e90fa55e38" volumeName="kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080017 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f91d1788-027d-432b-be33-ca952a95046a" volumeName="kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-tls" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080029 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2bf90db0-f943-464c-8599-e36b4fc32e1c" volumeName="kubernetes.io/projected/2bf90db0-f943-464c-8599-e36b4fc32e1c-kube-api-access-qns9g" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080041 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="654b5b1c-2764-415c-bb13-aa06899f4076" volumeName="kubernetes.io/empty-dir/654b5b1c-2764-415c-bb13-aa06899f4076-utilities" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080051 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a18b9230-de78-41b8-a61e-361b8bb1fbb3" volumeName="kubernetes.io/empty-dir/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-tuned" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080082 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a25248c0-8de7-4624-b785-f053665fcb23" volumeName="kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-custom-resource-state-configmap" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080099 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0a17669-a122-44aa-bdda-581bf1fc4649" volumeName="kubernetes.io/projected/c0a17669-a122-44aa-bdda-581bf1fc4649-kube-api-access-xf485" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080114 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7" volumeName="kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-daemon-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080128 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="248a3d2f-3be4-46bf-959c-79d28736c0d6" volumeName="kubernetes.io/projected/248a3d2f-3be4-46bf-959c-79d28736c0d6-kube-api-access-mtc2p" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080139 18707 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="654b5b1c-2764-415c-bb13-aa06899f4076" volumeName="kubernetes.io/projected/654b5b1c-2764-415c-bb13-aa06899f4076-kube-api-access-xcp8t" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080152 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="91b2899e-8d24-41a0-bec8-d11c67b8f955" volumeName="kubernetes.io/projected/91b2899e-8d24-41a0-bec8-d11c67b8f955-kube-api-access-hqtvp" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080163 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aa16c3bf-2350-46d1-afa0-9477b3ec8877" volumeName="kubernetes.io/secret/aa16c3bf-2350-46d1-afa0-9477b3ec8877-cluster-storage-operator-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080174 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb0fc10f-5796-4cd5-b8f5-72d678054c24" volumeName="kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-env-overrides" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080228 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5e3ddf9e-eeb5-4266-b675-092fd4e27623" volumeName="kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-env-overrides" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080243 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9a9ecf2-c476-4962-8333-21f242dbcb89" volumeName="kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080268 18707 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="c7f5e6cd-e093-409a-8758-d3db7a7eb32c" volumeName="kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080279 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="df428d5a-c722-4536-8e7f-cdd85c560481" volumeName="kubernetes.io/projected/df428d5a-c722-4536-8e7f-cdd85c560481-kube-api-access-dtcnq" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080290 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f5782718-9118-4682-a287-7998cd0304b3" volumeName="kubernetes.io/projected/f5782718-9118-4682-a287-7998cd0304b3-kube-api-access-bbvtp" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080301 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="de6078d7-2aad-46fe-b17a-b6b38e4eaa41" volumeName="kubernetes.io/secret/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-proxy-tls" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080310 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7" volumeName="kubernetes.io/projected/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-kube-api-access-lpdk5" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080323 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f53bc282-5937-49ac-ac98-2ee37ccb268d" volumeName="kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-images" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080333 18707 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="46de2acc-9f5d-4ecf-befe-a480f86466f5" volumeName="kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080344 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a69e8d3a-a0b1-4688-8631-d9f265aa4c69" volumeName="kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-metrics-server-audit-profiles" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080357 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="469183dd-dc54-467d-82a1-611132ae8ec4" volumeName="kubernetes.io/secret/469183dd-dc54-467d-82a1-611132ae8ec4-cloud-credential-operator-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080366 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9a9ecf2-c476-4962-8333-21f242dbcb89" volumeName="kubernetes.io/projected/a9a9ecf2-c476-4962-8333-21f242dbcb89-kube-api-access-d5v9f" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080375 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f046860d-2d54-4746-8ba2-f8e90fa55e38" volumeName="kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-ca" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080385 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f2217de0-7805-4f5f-8ea5-93b81b7e0236" volumeName="kubernetes.io/secret/f2217de0-7805-4f5f-8ea5-93b81b7e0236-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080393 18707 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5e3ddf9e-eeb5-4266-b675-092fd4e27623" volumeName="kubernetes.io/projected/5e3ddf9e-eeb5-4266-b675-092fd4e27623-kube-api-access-xd6vv" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080412 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="654b5b1c-2764-415c-bb13-aa06899f4076" volumeName="kubernetes.io/empty-dir/654b5b1c-2764-415c-bb13-aa06899f4076-catalog-content" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080429 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="91b2899e-8d24-41a0-bec8-d11c67b8f955" volumeName="kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-metrics-certs" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080443 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a25248c0-8de7-4624-b785-f053665fcb23" volumeName="kubernetes.io/empty-dir/a25248c0-8de7-4624-b785-f053665fcb23-volume-directive-shadow" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080456 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b543f82e-683d-47c1-af73-4dcede4cf4df" volumeName="kubernetes.io/projected/b543f82e-683d-47c1-af73-4dcede4cf4df-kube-api-access-4c2rq" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080470 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46de2acc-9f5d-4ecf-befe-a480f86466f5" volumeName="kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-etcd-serving-ca" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 
08:40:55.080483 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a638c468-010c-4da3-ad62-26f5f2bbdbb9" volumeName="kubernetes.io/projected/a638c468-010c-4da3-ad62-26f5f2bbdbb9-kube-api-access-sdbds" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080497 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b543f82e-683d-47c1-af73-4dcede4cf4df" volumeName="kubernetes.io/empty-dir/b543f82e-683d-47c1-af73-4dcede4cf4df-tmpfs" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080509 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e3bf8eaf-5f6c-41a6-aaeb-6c921d789466" volumeName="kubernetes.io/projected/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-kube-api-access-gkccn" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080521 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7" volumeName="kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080535 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="248a3d2f-3be4-46bf-959c-79d28736c0d6" volumeName="kubernetes.io/secret/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080551 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4291bfd-53d9-4c78-b7cb-d7eb46560528" volumeName="kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-kube-api-access-9nvl4" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 
kubenswrapper[18707]: I0320 08:40:55.080563 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f53bc282-5937-49ac-ac98-2ee37ccb268d" volumeName="kubernetes.io/projected/f53bc282-5937-49ac-ac98-2ee37ccb268d-kube-api-access-b2d4r" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080585 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a638c468-010c-4da3-ad62-26f5f2bbdbb9" volumeName="kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-client-ca" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080606 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c593e31d-82b5-4d42-992e-6b050ccf3019" volumeName="kubernetes.io/empty-dir/c593e31d-82b5-4d42-992e-6b050ccf3019-catalog-content" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080620 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3de37144-a9ab-45fb-a23f-2287a5198edf" volumeName="kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-etcd-client" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080636 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="47eadda0-35a6-4b5c-a96c-24854be15098" volumeName="kubernetes.io/secret/47eadda0-35a6-4b5c-a96c-24854be15098-tls-certificates" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080648 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d88ba8e1-ee42-423f-9839-e71cb0265c6c" volumeName="kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-images" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 
kubenswrapper[18707]: I0320 08:40:55.080659 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3de37144-a9ab-45fb-a23f-2287a5198edf" volumeName="kubernetes.io/projected/3de37144-a9ab-45fb-a23f-2287a5198edf-kube-api-access-zr8br" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080670 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="acb704a9-6c8d-4378-ae93-e7095b1fce85" volumeName="kubernetes.io/projected/acb704a9-6c8d-4378-ae93-e7095b1fce85-kube-api-access-xvx6w" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080683 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="91b2899e-8d24-41a0-bec8-d11c67b8f955" volumeName="kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-default-certificate" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080702 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f046860d-2d54-4746-8ba2-f8e90fa55e38" volumeName="kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-service-ca" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080714 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f5782718-9118-4682-a287-7998cd0304b3" volumeName="kubernetes.io/configmap/f5782718-9118-4682-a287-7998cd0304b3-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080725 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" volumeName="kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-config" seLinuxMountContext="" Mar 20 08:40:55.081531 
master-0 kubenswrapper[18707]: I0320 08:40:55.080738 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f844652-225b-4713-a9ad-cf9bcc348f47" volumeName="kubernetes.io/secret/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-key" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080753 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3de37144-a9ab-45fb-a23f-2287a5198edf" volumeName="kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080765 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="45e8b72b-564c-4bb1-b911-baff2d6c87ad" volumeName="kubernetes.io/configmap/45e8b72b-564c-4bb1-b911-baff2d6c87ad-telemetry-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080776 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5e3ddf9e-eeb5-4266-b675-092fd4e27623" volumeName="kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovnkube-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080795 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02" volumeName="kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-kube-api-access-8d57k" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080808 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f91d1788-027d-432b-be33-ca952a95046a" volumeName="kubernetes.io/projected/f91d1788-027d-432b-be33-ca952a95046a-kube-api-access-lnm6c" seLinuxMountContext="" Mar 20 08:40:55.081531 
master-0 kubenswrapper[18707]: I0320 08:40:55.080856 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1db4d695-5a6a-4fbe-b610-3777bfebed79" volumeName="kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080873 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3de37144-a9ab-45fb-a23f-2287a5198edf" volumeName="kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-etcd-serving-ca" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080886 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42df77ec-94aa-48ba-bb35-7b1f1e8b8e97" volumeName="kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080898 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ddac301-a604-4f07-8849-5928befd336e" volumeName="kubernetes.io/projected/4ddac301-a604-4f07-8849-5928befd336e-kube-api-access-27j9q" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080910 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a25248c0-8de7-4624-b785-f053665fcb23" volumeName="kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-tls" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080947 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7" volumeName="kubernetes.io/configmap/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-trusted-ca" seLinuxMountContext="" 
Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080968 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f2217de0-7805-4f5f-8ea5-93b81b7e0236" volumeName="kubernetes.io/empty-dir/f2217de0-7805-4f5f-8ea5-93b81b7e0236-snapshots" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.080989 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a69e8d3a-a0b1-4688-8631-d9f265aa4c69" volumeName="kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-client-certs" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081030 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a69e8d3a-a0b1-4688-8631-d9f265aa4c69" volumeName="kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-client-ca-bundle" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081047 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae39c09b-7aef-4615-8ced-0dcad39f23a5" volumeName="kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-auth-proxy-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081059 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b98b4efc-6117-487f-9cfc-38ce66dd9570" volumeName="kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081071 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97" volumeName="kubernetes.io/projected/e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97-kube-api-access-6wl7f" 
seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081083 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a25248c0-8de7-4624-b785-f053665fcb23" volumeName="kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-metrics-client-ca" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081095 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2a23d24-9e09-431e-8c3b-8456ff51a8d0" volumeName="kubernetes.io/configmap/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081108 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2a23d24-9e09-431e-8c3b-8456ff51a8d0" volumeName="kubernetes.io/secret/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081163 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="248a3d2f-3be4-46bf-959c-79d28736c0d6" volumeName="kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-script-lib" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081179 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46de2acc-9f5d-4ecf-befe-a480f86466f5" volumeName="kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081221 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7e451189-850e-4d19-a40c-40f642d08511" volumeName="kubernetes.io/projected/7e451189-850e-4d19-a40c-40f642d08511-kube-api-access-snmpq" 
seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081242 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a69e8d3a-a0b1-4688-8631-d9f265aa4c69" volumeName="kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-configmap-kubelet-serving-ca-bundle" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081255 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42df77ec-94aa-48ba-bb35-7b1f1e8b8e97" volumeName="kubernetes.io/projected/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-kube-api-access-wrws5" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081289 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="469183dd-dc54-467d-82a1-611132ae8ec4" volumeName="kubernetes.io/projected/469183dd-dc54-467d-82a1-611132ae8ec4-kube-api-access-8dtbl" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081304 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7b489385-2c96-4a97-8b31-362162de020e" volumeName="kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081321 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7" volumeName="kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cni-binary-copy" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081343 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68252533-bd64-4fc5-838a-cc350cbe77f0" 
volumeName="kubernetes.io/configmap/68252533-bd64-4fc5-838a-cc350cbe77f0-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081380 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e" volumeName="kubernetes.io/projected/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-kube-api-access-htv9s" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081394 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f1468ec0-2aa4-461c-a62f-e9f067be490f" volumeName="kubernetes.io/configmap/f1468ec0-2aa4-461c-a62f-e9f067be490f-metrics-client-ca" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081408 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1375da42-ecaf-4d86-b554-25fd1c3d00bd" volumeName="kubernetes.io/secret/1375da42-ecaf-4d86-b554-25fd1c3d00bd-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081451 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3de37144-a9ab-45fb-a23f-2287a5198edf" volumeName="kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-audit-policies" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081468 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46de2acc-9f5d-4ecf-befe-a480f86466f5" volumeName="kubernetes.io/projected/46de2acc-9f5d-4ecf-befe-a480f86466f5-kube-api-access-4f9vt" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081484 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="813f91c2-2b37-4681-968d-4217e286e22f" 
volumeName="kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081498 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c7f5e6cd-e093-409a-8758-d3db7a7eb32c" volumeName="kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-images" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081549 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f471ecc-922c-4cb1-9bdd-fdb5da08c592" volumeName="kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081563 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="45e8b72b-564c-4bb1-b911-baff2d6c87ad" volumeName="kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081581 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46de2acc-9f5d-4ecf-befe-a480f86466f5" volumeName="kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-encryption-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081616 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="df428d5a-c722-4536-8e7f-cdd85c560481" volumeName="kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081638 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f046860d-2d54-4746-8ba2-f8e90fa55e38" 
volumeName="kubernetes.io/projected/f046860d-2d54-4746-8ba2-f8e90fa55e38-kube-api-access-xhkh7" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081653 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12e1d9e5-96b5-4367-81a5-d87b3f93d8da" volumeName="kubernetes.io/projected/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-kube-api-access-g2qf7" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081668 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3de37144-a9ab-45fb-a23f-2287a5198edf" volumeName="kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-encryption-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081714 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a638c468-010c-4da3-ad62-26f5f2bbdbb9" volumeName="kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-config" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081737 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab175f7e-a5e8-4fda-98c9-6d052a221a83" volumeName="kubernetes.io/empty-dir/ab175f7e-a5e8-4fda-98c9-6d052a221a83-available-featuregates" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081773 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb0fc10f-5796-4cd5-b8f5-72d678054c24" volumeName="kubernetes.io/projected/fb0fc10f-5796-4cd5-b8f5-72d678054c24-kube-api-access-lh47j" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081791 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b4291bfd-53d9-4c78-b7cb-d7eb46560528" volumeName="kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-bound-sa-token" seLinuxMountContext="" Mar 20 08:40:55.081531 master-0 kubenswrapper[18707]: I0320 08:40:55.081821 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f1468ec0-2aa4-461c-a62f-e9f067be490f" volumeName="kubernetes.io/empty-dir/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-textfile" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.081864 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="86cb5d23-df7f-4f67-8086-1789d8e68544" volumeName="kubernetes.io/projected/86cb5d23-df7f-4f67-8086-1789d8e68544-kube-api-access-j7k8m" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.081881 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="91b2899e-8d24-41a0-bec8-d11c67b8f955" volumeName="kubernetes.io/configmap/91b2899e-8d24-41a0-bec8-d11c67b8f955-service-ca-bundle" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.081893 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a18b9230-de78-41b8-a61e-361b8bb1fbb3" volumeName="kubernetes.io/projected/a18b9230-de78-41b8-a61e-361b8bb1fbb3-kube-api-access-8crkc" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.081905 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="acb704a9-6c8d-4378-ae93-e7095b1fce85" volumeName="kubernetes.io/configmap/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.081940 18707 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="b4291bfd-53d9-4c78-b7cb-d7eb46560528" volumeName="kubernetes.io/configmap/b4291bfd-53d9-4c78-b7cb-d7eb46560528-trusted-ca" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.081974 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f471ecc-922c-4cb1-9bdd-fdb5da08c592" volumeName="kubernetes.io/projected/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-kube-api-access-224dg" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.081987 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="75e3e2cc-aa56-41f3-8859-1c086f419d05" volumeName="kubernetes.io/secret/75e3e2cc-aa56-41f3-8859-1c086f419d05-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082031 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="325f0a83-d56d-4b62-977b-088a7d5f0e00" volumeName="kubernetes.io/configmap/325f0a83-d56d-4b62-977b-088a7d5f0e00-config" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082046 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46de2acc-9f5d-4ecf-befe-a480f86466f5" volumeName="kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-etcd-client" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082059 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02" volumeName="kubernetes.io/configmap/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-trusted-ca" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082071 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f046860d-2d54-4746-8ba2-f8e90fa55e38" volumeName="kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-client" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082110 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f1468ec0-2aa4-461c-a62f-e9f067be490f" volumeName="kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-tls" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082127 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ad692349-5089-4afc-85b2-9b6e7997567c" volumeName="kubernetes.io/secret/ad692349-5089-4afc-85b2-9b6e7997567c-metrics-tls" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082140 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="86cb5d23-df7f-4f67-8086-1789d8e68544" volumeName="kubernetes.io/empty-dir/86cb5d23-df7f-4f67-8086-1789d8e68544-operand-assets" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082152 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a69e8d3a-a0b1-4688-8631-d9f265aa4c69" volumeName="kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-server-tls" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082162 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b98b4efc-6117-487f-9cfc-38ce66dd9570" volumeName="kubernetes.io/projected/b98b4efc-6117-487f-9cfc-38ce66dd9570-kube-api-access-6c5ch" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082214 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="21bebade-17fa-444e-92a9-eea53d6cd673" volumeName="kubernetes.io/configmap/21bebade-17fa-444e-92a9-eea53d6cd673-auth-proxy-config" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082226 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="325f0a83-d56d-4b62-977b-088a7d5f0e00" volumeName="kubernetes.io/secret/325f0a83-d56d-4b62-977b-088a7d5f0e00-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082238 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42df77ec-94aa-48ba-bb35-7b1f1e8b8e97" volumeName="kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-auth-proxy-config" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082280 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4fea9b05-222e-4b58-95c8-735fc1cf3a8b" volumeName="kubernetes.io/projected/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-kube-api-access-qstvb" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082298 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7" volumeName="kubernetes.io/projected/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-kube-api-access-fxg4z" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082310 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f2217de0-7805-4f5f-8ea5-93b81b7e0236" volumeName="kubernetes.io/projected/f2217de0-7805-4f5f-8ea5-93b81b7e0236-kube-api-access-82x7p" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082323 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="a9a9ecf2-c476-4962-8333-21f242dbcb89" volumeName="kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-proxy-ca-bundles" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082362 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="248a3d2f-3be4-46bf-959c-79d28736c0d6" volumeName="kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-env-overrides" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082377 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab175f7e-a5e8-4fda-98c9-6d052a221a83" volumeName="kubernetes.io/secret/ab175f7e-a5e8-4fda-98c9-6d052a221a83-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082390 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1854ea4-c8e2-4289-84b6-1f18b2ac684f" volumeName="kubernetes.io/secret/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082403 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="de6078d7-2aad-46fe-b17a-b6b38e4eaa41" volumeName="kubernetes.io/projected/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-kube-api-access-66kz7" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082447 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f91d1788-027d-432b-be33-ca952a95046a" volumeName="kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-kube-rbac-proxy-config" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082463 18707 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" volumeName="kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-service-ca-bundle" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082477 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4fea9b05-222e-4b58-95c8-735fc1cf3a8b" volumeName="kubernetes.io/projected/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-ca-certs" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082513 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b543f82e-683d-47c1-af73-4dcede4cf4df" volumeName="kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-apiservice-cert" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082531 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbc0b783-28d5-4554-b49d-c66082546f44" volumeName="kubernetes.io/projected/bbc0b783-28d5-4554-b49d-c66082546f44-kube-api-access-27qvw" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082549 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f91d1788-027d-432b-be33-ca952a95046a" volumeName="kubernetes.io/configmap/f91d1788-027d-432b-be33-ca952a95046a-metrics-client-ca" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082591 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f844652-225b-4713-a9ad-cf9bcc348f47" volumeName="kubernetes.io/configmap/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-cabundle" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082610 18707 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="4ddac301-a604-4f07-8849-5928befd336e" volumeName="kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-node-bootstrap-token" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082626 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f53bc282-5937-49ac-ac98-2ee37ccb268d" volumeName="kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082640 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ad692349-5089-4afc-85b2-9b6e7997567c" volumeName="kubernetes.io/projected/ad692349-5089-4afc-85b2-9b6e7997567c-kube-api-access-h6mb5" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082676 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46de2acc-9f5d-4ecf-befe-a480f86466f5" volumeName="kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-audit" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082692 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4fea9b05-222e-4b58-95c8-735fc1cf3a8b" volumeName="kubernetes.io/secret/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-catalogserver-certs" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082706 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="86cb5d23-df7f-4f67-8086-1789d8e68544" volumeName="kubernetes.io/secret/86cb5d23-df7f-4f67-8086-1789d8e68544-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082722 18707 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="96de6024-e20f-4b52-9294-b330d65e4153" volumeName="kubernetes.io/projected/96de6024-e20f-4b52-9294-b330d65e4153-kube-api-access-z8bxz" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082779 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a25248c0-8de7-4624-b785-f053665fcb23" volumeName="kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082790 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2a23d24-9e09-431e-8c3b-8456ff51a8d0" volumeName="kubernetes.io/projected/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-kube-api-access-btdjt" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082804 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae39c09b-7aef-4615-8ced-0dcad39f23a5" volumeName="kubernetes.io/projected/ae39c09b-7aef-4615-8ced-0dcad39f23a5-kube-api-access-rxn2f" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082815 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f1468ec0-2aa4-461c-a62f-e9f067be490f" volumeName="kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-kube-rbac-proxy-config" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082845 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f2217de0-7805-4f5f-8ea5-93b81b7e0236" volumeName="kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-service-ca-bundle" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: 
I0320 08:40:55.082857 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1db4d695-5a6a-4fbe-b610-3777bfebed79" volumeName="kubernetes.io/configmap/1db4d695-5a6a-4fbe-b610-3777bfebed79-metrics-client-ca" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082867 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="469183dd-dc54-467d-82a1-611132ae8ec4" volumeName="kubernetes.io/configmap/469183dd-dc54-467d-82a1-611132ae8ec4-cco-trusted-ca" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082880 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5e3ddf9e-eeb5-4266-b675-092fd4e27623" volumeName="kubernetes.io/secret/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082890 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="75e3e2cc-aa56-41f3-8859-1c086f419d05" volumeName="kubernetes.io/projected/75e3e2cc-aa56-41f3-8859-1c086f419d05-kube-api-access-clz6w" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082901 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a69e8d3a-a0b1-4688-8631-d9f265aa4c69" volumeName="kubernetes.io/projected/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-kube-api-access-c6p8v" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082934 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f2217de0-7805-4f5f-8ea5-93b81b7e0236" volumeName="kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-trusted-ca-bundle" seLinuxMountContext="" Mar 20 08:40:55.089657 
master-0 kubenswrapper[18707]: I0320 08:40:55.082946 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12e1d9e5-96b5-4367-81a5-d87b3f93d8da" volumeName="kubernetes.io/configmap/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-config-volume" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082958 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1375da42-ecaf-4d86-b554-25fd1c3d00bd" volumeName="kubernetes.io/projected/1375da42-ecaf-4d86-b554-25fd1c3d00bd-kube-api-access" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082969 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="248a3d2f-3be4-46bf-959c-79d28736c0d6" volumeName="kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-config" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.082978 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68252533-bd64-4fc5-838a-cc350cbe77f0" volumeName="kubernetes.io/projected/68252533-bd64-4fc5-838a-cc350cbe77f0-kube-api-access-75r7k" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.083012 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c7f5e6cd-e093-409a-8758-d3db7a7eb32c" volumeName="kubernetes.io/projected/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-kube-api-access-plc2q" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.083042 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a80bd6f-2263-4251-8197-5173193f8afc" volumeName="kubernetes.io/projected/6a80bd6f-2263-4251-8197-5173193f8afc-kube-api-access-tqd2v" seLinuxMountContext="" Mar 20 
08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.083053 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="813f91c2-2b37-4681-968d-4217e286e22f" volumeName="kubernetes.io/projected/813f91c2-2b37-4681-968d-4217e286e22f-kube-api-access-njjkt" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.083064 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9a9ecf2-c476-4962-8333-21f242dbcb89" volumeName="kubernetes.io/secret/a9a9ecf2-c476-4962-8333-21f242dbcb89-serving-cert" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.083078 18707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1854ea4-c8e2-4289-84b6-1f18b2ac684f" volumeName="kubernetes.io/configmap/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-config" seLinuxMountContext="" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.083088 18707 reconstruct.go:97] "Volume reconstruction finished" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.083114 18707 reconciler.go:26] "Reconciler: start to sync state" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.087058 18707 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 08:40:55.089657 master-0 kubenswrapper[18707]: I0320 08:40:55.089707 18707 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 08:40:55.092790 master-0 kubenswrapper[18707]: I0320 08:40:55.092749 18707 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 08:40:55.092846 master-0 kubenswrapper[18707]: I0320 08:40:55.092798 18707 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 08:40:55.092846 master-0 kubenswrapper[18707]: I0320 08:40:55.092832 18707 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 08:40:55.092930 master-0 kubenswrapper[18707]: E0320 08:40:55.092885 18707 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 08:40:55.095818 master-0 kubenswrapper[18707]: I0320 08:40:55.095780 18707 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 08:40:55.114395 master-0 kubenswrapper[18707]: I0320 08:40:55.114253 18707 generic.go:334] "Generic (PLEG): container finished" podID="3de37144-a9ab-45fb-a23f-2287a5198edf" containerID="078946765e0bcafd3c39a471f72aabe9c5152a4c66ba4a584be214e5cb42544f" exitCode=0 Mar 20 08:40:55.119611 master-0 kubenswrapper[18707]: I0320 08:40:55.119487 18707 generic.go:334] "Generic (PLEG): container finished" podID="c0a17669-a122-44aa-bdda-581bf1fc4649" containerID="4d95b114dff245825e6087b6ba414ae9712434ff235bee5e21733cbc9dc925e4" exitCode=0 Mar 20 08:40:55.119611 master-0 kubenswrapper[18707]: I0320 08:40:55.119585 18707 generic.go:334] "Generic (PLEG): container finished" podID="c0a17669-a122-44aa-bdda-581bf1fc4649" containerID="b03eca8e9b81865f87dea3515203478115ad1b39533d7a34515e851d32bd2010" exitCode=0 Mar 20 08:40:55.144288 master-0 kubenswrapper[18707]: I0320 08:40:55.144174 18707 generic.go:334] "Generic (PLEG): container finished" podID="412becc8-c1a7-422c-94d1-dd1849070ef1" containerID="21b9803fda84668208544ea6b68c3d3a859b684d4b97f36df7e3a02f81f34399" exitCode=0 Mar 20 08:40:55.175898 master-0 kubenswrapper[18707]: I0320 08:40:55.175806 18707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 20 08:40:55.176661 master-0 kubenswrapper[18707]: I0320 08:40:55.176232 18707 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="96ff83659a940b2334d29c8de766dce2f56c0eddae2926cc1d3dd2e347430a94" exitCode=1 Mar 20 08:40:55.176661 master-0 kubenswrapper[18707]: I0320 08:40:55.176260 18707 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="9f89ba02fad31edac59462f307fd61765f540c624fde9c9ab1bb13b80a642b0c" exitCode=0 Mar 20 08:40:55.184901 master-0 kubenswrapper[18707]: I0320 08:40:55.183168 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-gzg9m_ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/ingress-operator/1.log" Mar 20 08:40:55.184901 master-0 kubenswrapper[18707]: I0320 08:40:55.184392 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-gzg9m_ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/ingress-operator/0.log" Mar 20 08:40:55.184901 master-0 kubenswrapper[18707]: I0320 08:40:55.184441 18707 generic.go:334] "Generic (PLEG): container finished" podID="ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02" containerID="32acfc021b8f8071fac0cc1a8b0129efcea8236c65c56620ec15567dda3b37db" exitCode=1 Mar 20 08:40:55.184901 master-0 kubenswrapper[18707]: I0320 08:40:55.184465 18707 generic.go:334] "Generic (PLEG): container finished" podID="ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02" containerID="0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79" exitCode=1 Mar 20 08:40:55.195021 master-0 kubenswrapper[18707]: E0320 08:40:55.194619 18707 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 20 08:40:55.210210 master-0 kubenswrapper[18707]: I0320 
08:40:55.210105 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-p7pt6_68252533-bd64-4fc5-838a-cc350cbe77f0/openshift-controller-manager-operator/0.log" Mar 20 08:40:55.210210 master-0 kubenswrapper[18707]: I0320 08:40:55.210160 18707 generic.go:334] "Generic (PLEG): container finished" podID="68252533-bd64-4fc5-838a-cc350cbe77f0" containerID="e3f8332f8ebf3d1a4f060a0c3657a0fbf7dd52dab57065dbe98fe23495c66678" exitCode=1 Mar 20 08:40:55.223164 master-0 kubenswrapper[18707]: I0320 08:40:55.223109 18707 generic.go:334] "Generic (PLEG): container finished" podID="325f0a83-d56d-4b62-977b-088a7d5f0e00" containerID="d498d1943e73378963ddf1fcb87e4851f496c741db4b494c02cde7a06a7405b7" exitCode=0 Mar 20 08:40:55.225265 master-0 kubenswrapper[18707]: I0320 08:40:55.225070 18707 generic.go:334] "Generic (PLEG): container finished" podID="fac672fa-7660-449e-a0d1-244dc6282d76" containerID="aecbf33029725426faa2806ba773a548665753d84d9ec4f0ac83ae36cdffa3ce" exitCode=0 Mar 20 08:40:55.242174 master-0 kubenswrapper[18707]: I0320 08:40:55.242117 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-62zrx_29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/authentication-operator/1.log" Mar 20 08:40:55.242416 master-0 kubenswrapper[18707]: I0320 08:40:55.242210 18707 generic.go:334] "Generic (PLEG): container finished" podID="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" containerID="8009103455f6e5b4ff5637d33a8290182816ff88ad007a0262e9e5800739bab4" exitCode=255 Mar 20 08:40:55.250096 master-0 kubenswrapper[18707]: I0320 08:40:55.250029 18707 generic.go:334] "Generic (PLEG): container finished" podID="4490a747-da2d-4f1a-8986-bc2c1c58424b" containerID="0521d9515acccdbef13de273c2fd3fc8c0c08193b40755e745ddfeeb3789e32d" exitCode=0 Mar 20 08:40:55.252818 master-0 kubenswrapper[18707]: I0320 08:40:55.252792 18707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-6t5vb_fb0fc10f-5796-4cd5-b8f5-72d678054c24/approver/0.log" Mar 20 08:40:55.253202 master-0 kubenswrapper[18707]: I0320 08:40:55.253150 18707 generic.go:334] "Generic (PLEG): container finished" podID="fb0fc10f-5796-4cd5-b8f5-72d678054c24" containerID="11fb9901a3c95280d4ada671324191ca15e473462e8fdb0e196a3a680e2d44a1" exitCode=1 Mar 20 08:40:55.262661 master-0 kubenswrapper[18707]: I0320 08:40:55.262620 18707 generic.go:334] "Generic (PLEG): container finished" podID="c1854ea4-c8e2-4289-84b6-1f18b2ac684f" containerID="f44e15f6f5dbc67eb3bd400bcba69ac1adc2aa6d2546e0cbc2778889121d3bcf" exitCode=0 Mar 20 08:40:55.297267 master-0 kubenswrapper[18707]: I0320 08:40:55.297199 18707 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="d11c59462bae131a903a2325334d63a1ab5222611dbbb14b008e128c7a2a34a9" exitCode=0 Mar 20 08:40:55.315209 master-0 kubenswrapper[18707]: I0320 08:40:55.315124 18707 generic.go:334] "Generic (PLEG): container finished" podID="8dd3d3608fe9c86b0f65904ec2353df4" containerID="2bb061585e269ad50f22944212861cbe8de65df048c827dffb60a910fb8f58b1" exitCode=0 Mar 20 08:40:55.348215 master-0 kubenswrapper[18707]: I0320 08:40:55.344053 18707 generic.go:334] "Generic (PLEG): container finished" podID="91b2899e-8d24-41a0-bec8-d11c67b8f955" containerID="239c73fe7acb21fae05105f39e9db825bd1c976521db41e975ce454317c12c28" exitCode=0 Mar 20 08:40:55.377245 master-0 kubenswrapper[18707]: I0320 08:40:55.376856 18707 generic.go:334] "Generic (PLEG): container finished" podID="75e3e2cc-aa56-41f3-8859-1c086f419d05" containerID="adad78c0e9b98fa52036392a08a46d2f94455f9ec32c4bacb460a8042eb8ee5e" exitCode=0 Mar 20 08:40:55.408699 master-0 kubenswrapper[18707]: E0320 08:40:55.408640 18707 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 20 08:40:55.412740 master-0 
kubenswrapper[18707]: I0320 08:40:55.412712 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_e3c8b9da1cd5cef8ca0690a6bbf5a601/kube-scheduler-cert-syncer/0.log" Mar 20 08:40:55.414559 master-0 kubenswrapper[18707]: I0320 08:40:55.414504 18707 generic.go:334] "Generic (PLEG): container finished" podID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerID="73a7f9993a52ad274232661b06d25f3b18e0675faed4b301aeb4072dcc7cfa79" exitCode=0 Mar 20 08:40:55.414559 master-0 kubenswrapper[18707]: I0320 08:40:55.414539 18707 generic.go:334] "Generic (PLEG): container finished" podID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerID="230d37232882904f1764e96ed6057bf568baed29ff892b892e215ce87e945710" exitCode=2 Mar 20 08:40:55.414559 master-0 kubenswrapper[18707]: I0320 08:40:55.414548 18707 generic.go:334] "Generic (PLEG): container finished" podID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerID="6e31068727643e077f1c9461b5883b919e163a79d9088735e4c5d39688c47867" exitCode=0 Mar 20 08:40:55.414559 master-0 kubenswrapper[18707]: I0320 08:40:55.414557 18707 generic.go:334] "Generic (PLEG): container finished" podID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerID="b43c5f4dbc5493b32b9934371a1875a8e1d7c69940c30587bfa291adee73b603" exitCode=0 Mar 20 08:40:55.417402 master-0 kubenswrapper[18707]: I0320 08:40:55.417374 18707 generic.go:334] "Generic (PLEG): container finished" podID="46de2acc-9f5d-4ecf-befe-a480f86466f5" containerID="a5663c5e028603466a885bf8e6c2930eae2c60da4e4cb920e82d9e29e7d29f42" exitCode=0 Mar 20 08:40:55.419539 master-0 kubenswrapper[18707]: I0320 08:40:55.419410 18707 generic.go:334] "Generic (PLEG): container finished" podID="d245e5b2-a30d-45c8-9b79-6e8096765c14" containerID="215282b36f0afcf690dcc7252b249b0c54821b459a2fe5a0ff25640fd36b6290" exitCode=0 Mar 20 08:40:55.420990 master-0 kubenswrapper[18707]: I0320 08:40:55.420957 18707 generic.go:334] "Generic (PLEG): container finished" 
podID="7e219558-98b7-4528-88cf-97b87cd1eb6c" containerID="7ea4527aa6e7513c4d82b891ac586f8993c11e7ba38c1d1a048fc3535809e191" exitCode=0 Mar 20 08:40:55.424549 master-0 kubenswrapper[18707]: I0320 08:40:55.424522 18707 generic.go:334] "Generic (PLEG): container finished" podID="8b1c7a56-5d00-468a-bb8d-dbaf8e854951" containerID="5b4a47b78349fa5185bcf45526d28c821dd34bc78966a86b575a5f0037835565" exitCode=0 Mar 20 08:40:55.429271 master-0 kubenswrapper[18707]: I0320 08:40:55.428972 18707 generic.go:334] "Generic (PLEG): container finished" podID="654b5b1c-2764-415c-bb13-aa06899f4076" containerID="40882309ca6cfeb4b89a668f255895a49cb18211d2d4c38846d98d1ae8591f1f" exitCode=0 Mar 20 08:40:55.429271 master-0 kubenswrapper[18707]: I0320 08:40:55.428996 18707 generic.go:334] "Generic (PLEG): container finished" podID="654b5b1c-2764-415c-bb13-aa06899f4076" containerID="ba2c385399d075f7d3be23f2cb6f802e608ef356862884d90b8c839e8667b5b3" exitCode=0 Mar 20 08:40:55.441108 master-0 kubenswrapper[18707]: I0320 08:40:55.441050 18707 generic.go:334] "Generic (PLEG): container finished" podID="f1468ec0-2aa4-461c-a62f-e9f067be490f" containerID="883d2d12dc7b471a5dda61efc08657fb43e4a9f74d94e048d8c741bca0b177ad" exitCode=0 Mar 20 08:40:55.458950 master-0 kubenswrapper[18707]: I0320 08:40:55.458861 18707 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="a2139218314ea5d5d1e04c37be758e7a9f90c106dd3c470737be6550fb6322a9" exitCode=0 Mar 20 08:40:55.460003 master-0 kubenswrapper[18707]: I0320 08:40:55.459137 18707 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="31b5815996c66a028a6e102943aed8dd0cbf1cb918ec3a5b728d9fb0cb098506" exitCode=0 Mar 20 08:40:55.460091 master-0 kubenswrapper[18707]: I0320 08:40:55.460076 18707 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" 
containerID="91045cb8c13e35ca1f0bfb21ba636da24cd41b91eea8db817a9a5a02317192b3" exitCode=0 Mar 20 08:40:55.483359 master-0 kubenswrapper[18707]: I0320 08:40:55.483315 18707 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="09df5a13ce7374304f28bc120919f2392b8b1eedb768ae74aa71f1f46b1260f3" exitCode=0 Mar 20 08:40:55.483359 master-0 kubenswrapper[18707]: I0320 08:40:55.483354 18707 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="0562124cc868051528c8c76baabb685e9f641cfd32418a6cbc0b305b7b8b1525" exitCode=0 Mar 20 08:40:55.483359 master-0 kubenswrapper[18707]: I0320 08:40:55.483363 18707 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="ca1d7ca00a56b55ea93c4440e9e959ff93d3c3b08431ba60809fba320b9496a7" exitCode=0 Mar 20 08:40:55.483569 master-0 kubenswrapper[18707]: I0320 08:40:55.483372 18707 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="d7a9f93548b4324f9218b5fb15026983da36f57336679426ecdeef802c274095" exitCode=0 Mar 20 08:40:55.483569 master-0 kubenswrapper[18707]: I0320 08:40:55.483380 18707 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="2f1eef10f4235bf6943bb1062fc964d69fc5c901795041a7ddca120ef33de66d" exitCode=0 Mar 20 08:40:55.483569 master-0 kubenswrapper[18707]: I0320 08:40:55.483391 18707 generic.go:334] "Generic (PLEG): container finished" podID="b98b4efc-6117-487f-9cfc-38ce66dd9570" containerID="5772594b3f3e6aae19a5e357ad1c9bc0dade5e494667c07e21d51c8697d24253" exitCode=0 Mar 20 08:40:55.486371 master-0 kubenswrapper[18707]: I0320 08:40:55.486321 18707 generic.go:334] "Generic (PLEG): container finished" podID="f046860d-2d54-4746-8ba2-f8e90fa55e38" containerID="0bd5b879aadefbf9a44c6c6dcf866736c73041c10aa61588b2570b45b21eb4d2" exitCode=0 Mar 20 
08:40:55.489274 master-0 kubenswrapper[18707]: I0320 08:40:55.489238 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7bd846bfc4-mt454_ad692349-5089-4afc-85b2-9b6e7997567c/network-operator/0.log" Mar 20 08:40:55.489341 master-0 kubenswrapper[18707]: I0320 08:40:55.489297 18707 generic.go:334] "Generic (PLEG): container finished" podID="ad692349-5089-4afc-85b2-9b6e7997567c" containerID="871b2216305e9883e9345351545dfa140768d0b1a9975612361ac0d841fda34c" exitCode=255 Mar 20 08:40:55.493780 master-0 kubenswrapper[18707]: I0320 08:40:55.493679 18707 generic.go:334] "Generic (PLEG): container finished" podID="c593e31d-82b5-4d42-992e-6b050ccf3019" containerID="06e5d2b0055041a2ae0e49aa4151d374fab625e94cc004550ff8ef85c3bfe80e" exitCode=0 Mar 20 08:40:55.493780 master-0 kubenswrapper[18707]: I0320 08:40:55.493719 18707 generic.go:334] "Generic (PLEG): container finished" podID="c593e31d-82b5-4d42-992e-6b050ccf3019" containerID="633b6e62526e4e0cbd07fd4d4b0af4afaceded4e0aa25ebac529d888061b8a40" exitCode=0 Mar 20 08:40:55.496623 master-0 kubenswrapper[18707]: I0320 08:40:55.496576 18707 generic.go:334] "Generic (PLEG): container finished" podID="fdfdabb8-83d6-4b38-a709-9e354062ba1a" containerID="ef7d3c19081b3942ae839231125bb3d9ed41e1148d63c694dd308a85f91f661c" exitCode=0 Mar 20 08:40:55.515206 master-0 kubenswrapper[18707]: I0320 08:40:55.515154 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-67dcd4998-c5742_86cb5d23-df7f-4f67-8086-1789d8e68544/cluster-olm-operator/0.log" Mar 20 08:40:55.517783 master-0 kubenswrapper[18707]: I0320 08:40:55.515988 18707 generic.go:334] "Generic (PLEG): container finished" podID="86cb5d23-df7f-4f67-8086-1789d8e68544" containerID="aba047bf62fdafe49170c75e4151358bb7421fff8f12793b7c505a061479cedc" exitCode=255 Mar 20 08:40:55.517783 master-0 kubenswrapper[18707]: I0320 08:40:55.516035 18707 generic.go:334] "Generic (PLEG): 
container finished" podID="86cb5d23-df7f-4f67-8086-1789d8e68544" containerID="e5ff2b8e0eaf0ec4027bec63b2223736cf7655c761ce84b8c07f00889c293502" exitCode=0 Mar 20 08:40:55.517783 master-0 kubenswrapper[18707]: I0320 08:40:55.516045 18707 generic.go:334] "Generic (PLEG): container finished" podID="86cb5d23-df7f-4f67-8086-1789d8e68544" containerID="eb336787da69f3659db83e9b59377f619a1bab475c9f6f4fe67e34d16e998717" exitCode=0 Mar 20 08:40:55.525838 master-0 kubenswrapper[18707]: I0320 08:40:55.525791 18707 generic.go:334] "Generic (PLEG): container finished" podID="b639e578-628e-404d-b759-8b6e84e771d9" containerID="70cbec433b4a5afb013d99a248cda222d66b2abdceddd1d72d46fce02f57b45d" exitCode=0 Mar 20 08:40:55.525838 master-0 kubenswrapper[18707]: I0320 08:40:55.525834 18707 generic.go:334] "Generic (PLEG): container finished" podID="b639e578-628e-404d-b759-8b6e84e771d9" containerID="1270d65f2b1bd2bb8e9f27e0d20a7b179ff2340a8f906b8f6439c6b5966d578b" exitCode=0 Mar 20 08:40:55.528932 master-0 kubenswrapper[18707]: I0320 08:40:55.528906 18707 generic.go:334] "Generic (PLEG): container finished" podID="fa759777-de22-4440-a3d3-ad429a3b8e7b" containerID="9fc8932c5b9da1ed7961a714ed5b905683dbfe51746690c485563d67ef3b0b29" exitCode=0 Mar 20 08:40:55.540095 master-0 kubenswrapper[18707]: I0320 08:40:55.540057 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_e1d21f11-7386-4a04-a82e-5a03f3602a3b/installer/0.log" Mar 20 08:40:55.540174 master-0 kubenswrapper[18707]: I0320 08:40:55.540118 18707 generic.go:334] "Generic (PLEG): container finished" podID="e1d21f11-7386-4a04-a82e-5a03f3602a3b" containerID="bf471cdb978763d680a893df02a2a47dbe930e97fc0ccb05e480229f6feda593" exitCode=1 Mar 20 08:40:55.546433 master-0 kubenswrapper[18707]: I0320 08:40:55.546374 18707 generic.go:334] "Generic (PLEG): container finished" podID="248a3d2f-3be4-46bf-959c-79d28736c0d6" 
containerID="f4a74ff585c6a7d1deca8c58f38e8ca10a816620bf09146c1f9ff9a31d89c1a7" exitCode=0 Mar 20 08:40:55.563974 master-0 kubenswrapper[18707]: I0320 08:40:55.563908 18707 generic.go:334] "Generic (PLEG): container finished" podID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerID="32a6a1f6b15d34dd1d0099bb33c1df2bcbe7798374f91764aaa7bb5b94a96471" exitCode=0 Mar 20 08:40:55.563974 master-0 kubenswrapper[18707]: I0320 08:40:55.563945 18707 generic.go:334] "Generic (PLEG): container finished" podID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerID="6fafac3f004d2f582e04bae9436b72da7fad6247504ddaf33a3c755f3641fa2c" exitCode=0 Mar 20 08:40:55.574937 master-0 kubenswrapper[18707]: I0320 08:40:55.574894 18707 generic.go:334] "Generic (PLEG): container finished" podID="c2a23d24-9e09-431e-8c3b-8456ff51a8d0" containerID="4f3597589c3cef89a127e7d3fec06145ebc7ceb92ec5a686f505064f491c35a3" exitCode=0 Mar 20 08:40:55.577630 master-0 kubenswrapper[18707]: I0320 08:40:55.577600 18707 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="8cdf7ffa9625537bd484b3cd72f3ca62a1fbd66303b800564461ec0e3e2735c7" exitCode=0 Mar 20 08:40:55.808776 master-0 kubenswrapper[18707]: E0320 08:40:55.808733 18707 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 20 08:40:55.812988 master-0 kubenswrapper[18707]: I0320 08:40:55.812964 18707 manager.go:324] Recovery completed Mar 20 08:40:55.951037 master-0 kubenswrapper[18707]: I0320 08:40:55.949474 18707 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 08:40:55.951037 master-0 kubenswrapper[18707]: I0320 08:40:55.949516 18707 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 08:40:55.951037 master-0 kubenswrapper[18707]: I0320 08:40:55.949567 18707 state_mem.go:36] "Initialized new in-memory state store" Mar 20 08:40:55.951037 master-0 kubenswrapper[18707]: I0320 08:40:55.949867 18707 state_mem.go:88] 
"Updated default CPUSet" cpuSet="" Mar 20 08:40:55.951037 master-0 kubenswrapper[18707]: I0320 08:40:55.949881 18707 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 20 08:40:55.951037 master-0 kubenswrapper[18707]: I0320 08:40:55.949912 18707 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Mar 20 08:40:55.951037 master-0 kubenswrapper[18707]: I0320 08:40:55.949919 18707 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Mar 20 08:40:55.951037 master-0 kubenswrapper[18707]: I0320 08:40:55.949927 18707 policy_none.go:49] "None policy: Start" Mar 20 08:40:55.956399 master-0 kubenswrapper[18707]: I0320 08:40:55.956361 18707 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 08:40:55.956399 master-0 kubenswrapper[18707]: I0320 08:40:55.956400 18707 state_mem.go:35] "Initializing new in-memory state store" Mar 20 08:40:55.956617 master-0 kubenswrapper[18707]: I0320 08:40:55.956606 18707 state_mem.go:75] "Updated machine memory state" Mar 20 08:40:55.956617 master-0 kubenswrapper[18707]: I0320 08:40:55.956619 18707 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 20 08:40:55.973074 master-0 kubenswrapper[18707]: I0320 08:40:55.972518 18707 manager.go:334] "Starting Device Plugin manager" Mar 20 08:40:55.973074 master-0 kubenswrapper[18707]: I0320 08:40:55.972630 18707 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 08:40:55.973074 master-0 kubenswrapper[18707]: I0320 08:40:55.972646 18707 server.go:79] "Starting device plugin registration server" Mar 20 08:40:55.978669 master-0 kubenswrapper[18707]: I0320 08:40:55.974506 18707 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 08:40:55.978669 master-0 kubenswrapper[18707]: I0320 08:40:55.974632 18707 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 
monitorPeriod="10s" Mar 20 08:40:55.978669 master-0 kubenswrapper[18707]: I0320 08:40:55.974771 18707 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 08:40:55.978669 master-0 kubenswrapper[18707]: I0320 08:40:55.974914 18707 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 08:40:55.978669 master-0 kubenswrapper[18707]: I0320 08:40:55.974923 18707 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 08:40:56.024622 master-0 kubenswrapper[18707]: I0320 08:40:56.022224 18707 apiserver.go:52] "Watching apiserver" Mar 20 08:40:56.056271 master-0 kubenswrapper[18707]: I0320 08:40:56.056198 18707 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 08:40:56.075097 master-0 kubenswrapper[18707]: I0320 08:40:56.074981 18707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:40:56.077945 master-0 kubenswrapper[18707]: I0320 08:40:56.077898 18707 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:40:56.078011 master-0 kubenswrapper[18707]: I0320 08:40:56.077951 18707 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:40:56.078011 master-0 kubenswrapper[18707]: I0320 08:40:56.077966 18707 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:40:56.078135 master-0 kubenswrapper[18707]: I0320 08:40:56.078100 18707 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 20 08:40:56.084986 master-0 kubenswrapper[18707]: E0320 08:40:56.082206 18707 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Mar 20 08:40:56.282706 
master-0 kubenswrapper[18707]: I0320 08:40:56.282643 18707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:40:56.285409 master-0 kubenswrapper[18707]: I0320 08:40:56.285350 18707 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:40:56.285493 master-0 kubenswrapper[18707]: I0320 08:40:56.285414 18707 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:40:56.285493 master-0 kubenswrapper[18707]: I0320 08:40:56.285429 18707 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:40:56.285561 master-0 kubenswrapper[18707]: I0320 08:40:56.285540 18707 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 20 08:40:56.288579 master-0 kubenswrapper[18707]: E0320 08:40:56.288539 18707 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Mar 20 08:40:56.609972 master-0 kubenswrapper[18707]: I0320 08:40:56.609862 18707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0"] Mar 20 08:40:56.610412 master-0 kubenswrapper[18707]: I0320 08:40:56.610337 18707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j","openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz","openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr","openshift-machine-config-operator/machine-config-server-gj4pm","openshift-monitoring/node-exporter-lb4t5","openshift-dns/node-resolver-qnp9w","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-multus/multus-2fp4b","openshift-apiserver/apiserver-779f85678d-lrzfz","openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr","openshift-kube-apiserver/kube-apiserver-master-0","openshift-marketplace/redhat-operators-jstrn","openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv","openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5","openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h","openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh","openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr","openshift-marketplace/certified-operators-cc955","openshift-marketplace/community-operators-dtqgc","openshift-ovn-kubernetes/ovnkube-node-rxdwp","openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq","openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m","openshift-kube-scheduler/installer-3-master-0","openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg","openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h","openshift-controller
-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6","openshift-kube-apiserver/installer-1-master-0","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5","openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j","assisted-installer/assisted-installer-controller-w2zwp","openshift-dns-operator/dns-operator-9c5679d8f-r6dm8","openshift-network-diagnostics/network-check-target-xnrw6","openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf","openshift-config-operator/openshift-config-operator-95bf4f4d-25cml","openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4","openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv","openshift-monitoring/metrics-server-64c67d44c4-s7vfs","openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg","openshift-monitoring/prometheus-operator-6c8df6d4b-5m857","openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh","openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx","openshift-dns/dns-default-v5h69","openshift-etcd/etcd-master-0","openshift-ingress/router-default-7dcf5569b5-xmvwz","openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4","openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-network-operator/iptables-alerter-dd9wv","openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l","openshift-controller-manager/controller-manager-fc56bb77c-qd4sn","openshift-kube-apiserver/installer-3-master-0","openshift-kube-scheduler/installer-2-master-0","openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-machine-config-operator/machine-config-daemon-9t8x6","openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77","openshift
-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj","openshift-service-ca/service-ca-79bc6b8d76-72j8t","openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742","openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7","openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd","openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj","openshift-network-operator/network-operator-7bd846bfc4-mt454","openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28","openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh","openshift-insights/insights-operator-68bf6ff9d6-mvfn5","openshift-marketplace/marketplace-operator-89ccd998f-mvn4t","openshift-cluster-node-tuning-operator/tuned-hb77b","openshift-kube-controller-manager/installer-2-master-0","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7","openshift-marketplace/redhat-marketplace-hqqrk","openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq","openshift-multus/network-metrics-daemon-srdjm","openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj","openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h","openshift-etcd/installer-1-master-0","openshift-multus/multus-additional-cni-plugins-rpbcn","openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6","openshift-network-node-identity/network-node-identity-6t5vb"] Mar 20 08:40:56.612887 master-0 kubenswrapper[18707]: I0320 08:40:56.612830 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-w2zwp" Mar 20 08:40:56.614068 master-0 kubenswrapper[18707]: I0320 08:40:56.614028 18707 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="c89340c6-97f7-4855-950d-1c17da08b16a" Mar 20 08:40:56.615258 master-0 kubenswrapper[18707]: I0320 08:40:56.615222 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 20 08:40:56.631236 master-0 kubenswrapper[18707]: I0320 08:40:56.630371 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 08:40:56.644570 master-0 kubenswrapper[18707]: I0320 08:40:56.644504 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 08:40:56.645266 master-0 kubenswrapper[18707]: I0320 08:40:56.645229 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 08:40:56.645326 master-0 kubenswrapper[18707]: I0320 08:40:56.645288 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 08:40:56.645429 master-0 kubenswrapper[18707]: I0320 08:40:56.645397 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 20 08:40:56.645584 master-0 kubenswrapper[18707]: I0320 08:40:56.645546 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 20 08:40:56.645664 master-0 kubenswrapper[18707]: I0320 08:40:56.645580 18707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 08:40:56.645664 master-0 kubenswrapper[18707]: I0320 08:40:56.645598 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 20 08:40:56.645664 master-0 kubenswrapper[18707]: I0320 08:40:56.645650 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 08:40:56.645813 master-0 kubenswrapper[18707]: I0320 08:40:56.645721 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 08:40:56.645813 master-0 kubenswrapper[18707]: E0320 08:40:56.645765 18707 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:56.645813 master-0 kubenswrapper[18707]: I0320 08:40:56.645789 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 08:40:56.645967 master-0 kubenswrapper[18707]: I0320 08:40:56.645842 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 08:40:56.645967 master-0 kubenswrapper[18707]: I0320 08:40:56.645872 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 08:40:56.645967 master-0 kubenswrapper[18707]: I0320 08:40:56.645549 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 08:40:56.645967 master-0 kubenswrapper[18707]: I0320 08:40:56.645935 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 
08:40:56.645967 master-0 kubenswrapper[18707]: I0320 08:40:56.645959 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 08:40:56.646170 master-0 kubenswrapper[18707]: I0320 08:40:56.646002 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 08:40:56.646170 master-0 kubenswrapper[18707]: I0320 08:40:56.645568 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 08:40:56.646170 master-0 kubenswrapper[18707]: I0320 08:40:56.646056 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 08:40:56.651784 master-0 kubenswrapper[18707]: I0320 08:40:56.651731 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 20 08:40:56.651967 master-0 kubenswrapper[18707]: I0320 08:40:56.651882 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 20 08:40:56.652230 master-0 kubenswrapper[18707]: I0320 08:40:56.652168 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 20 08:40:56.653986 master-0 kubenswrapper[18707]: I0320 08:40:56.653302 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 20 08:40:56.657362 master-0 kubenswrapper[18707]: I0320 08:40:56.657277 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 08:40:56.657486 master-0 kubenswrapper[18707]: I0320 08:40:56.657179 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 08:40:56.657539 master-0 kubenswrapper[18707]: I0320 08:40:56.657517 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 20 08:40:56.657701 master-0 kubenswrapper[18707]: I0320 08:40:56.657571 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 20 08:40:56.658084 master-0 kubenswrapper[18707]: I0320 08:40:56.658053 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 08:40:56.658134 master-0 kubenswrapper[18707]: I0320 08:40:56.658121 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 20 08:40:56.658361 master-0 kubenswrapper[18707]: I0320 08:40:56.658168 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 20 08:40:56.658361 master-0 kubenswrapper[18707]: I0320 08:40:56.658252 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 20 08:40:56.658361 master-0 kubenswrapper[18707]: I0320 08:40:56.658258 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 20 08:40:56.658361 master-0 kubenswrapper[18707]: I0320 08:40:56.658333 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 08:40:56.658873 master-0 kubenswrapper[18707]: I0320 08:40:56.658371 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 20 08:40:56.658873 master-0 kubenswrapper[18707]: I0320 08:40:56.658411 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 20 08:40:56.658873 master-0 kubenswrapper[18707]: I0320 08:40:56.658471 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 20 08:40:56.658873 master-0 kubenswrapper[18707]: I0320 08:40:56.658489 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 08:40:56.658873 master-0 kubenswrapper[18707]: I0320 08:40:56.658560 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 20 08:40:56.658873 master-0 kubenswrapper[18707]: I0320 08:40:56.658568 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 20 08:40:56.658873 master-0 kubenswrapper[18707]: I0320 08:40:56.658057 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 20 08:40:56.658873 master-0 kubenswrapper[18707]: I0320 08:40:56.658060 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 08:40:56.658873 master-0 kubenswrapper[18707]: I0320 08:40:56.658679 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 20 08:40:56.658873 master-0 kubenswrapper[18707]: I0320 08:40:56.658809 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 08:40:56.658873 master-0 kubenswrapper[18707]: I0320 08:40:56.658838 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 20 08:40:56.658873 master-0 kubenswrapper[18707]: I0320 08:40:56.658335 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 08:40:56.659489 master-0 kubenswrapper[18707]: I0320 08:40:56.658681 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 20 08:40:56.659489 master-0 kubenswrapper[18707]: E0320 08:40:56.659094 18707 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-startup-monitor-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:40:56.659489 master-0 kubenswrapper[18707]: I0320 08:40:56.659147 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 20 08:40:56.659489 master-0 kubenswrapper[18707]: I0320 08:40:56.659236 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 20 08:40:56.659772 master-0 kubenswrapper[18707]: I0320 08:40:56.659686 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 20 08:40:56.659772 master-0 kubenswrapper[18707]: I0320 08:40:56.659729 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 20 08:40:56.659932 master-0 kubenswrapper[18707]: I0320 08:40:56.659901 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 20 08:40:56.659974 master-0 kubenswrapper[18707]: I0320 08:40:56.659950 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 20 08:40:56.660137 master-0 kubenswrapper[18707]: I0320 08:40:56.660086 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 08:40:56.660328 master-0 kubenswrapper[18707]: I0320 08:40:56.660303 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 20 08:40:56.663119 master-0 kubenswrapper[18707]: I0320 08:40:56.663015 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" event={"ID":"3de37144-a9ab-45fb-a23f-2287a5198edf","Type":"ContainerStarted","Data":"02b8b46e9f6cf48ded279c24ec1e51a94bbe25b122e72584be4a8549a6a9d74b"}
Mar 20 08:40:56.663222 master-0 kubenswrapper[18707]: I0320 08:40:56.663120 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc955" event={"ID":"c0a17669-a122-44aa-bdda-581bf1fc4649","Type":"ContainerStarted","Data":"efcd91e8fee4e2c0e31de5da275313b21efe9ca0b897e5b0a39fdcdb9033ff18"}
Mar 20 08:40:56.663222 master-0 kubenswrapper[18707]: I0320 08:40:56.663138 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc955" event={"ID":"c0a17669-a122-44aa-bdda-581bf1fc4649","Type":"ContainerDied","Data":"4d95b114dff245825e6087b6ba414ae9712434ff235bee5e21733cbc9dc925e4"}
Mar 20 08:40:56.663222 master-0 kubenswrapper[18707]: I0320 08:40:56.663157 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc955" event={"ID":"c0a17669-a122-44aa-bdda-581bf1fc4649","Type":"ContainerDied","Data":"b03eca8e9b81865f87dea3515203478115ad1b39533d7a34515e851d32bd2010"}
Mar 20 08:40:56.663222 master-0 kubenswrapper[18707]: I0320 08:40:56.663170 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc955" event={"ID":"c0a17669-a122-44aa-bdda-581bf1fc4649","Type":"ContainerStarted","Data":"ec3f7a57e8d7aa7239f51fc0b75ccf091bb42e503457a1919c637dd65b9da53e"}
Mar 20 08:40:56.663222 master-0 kubenswrapper[18707]: I0320 08:40:56.663201 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr" event={"ID":"2bf90db0-f943-464c-8599-e36b4fc32e1c","Type":"ContainerStarted","Data":"f56cbcb6003f86134027b553b740ce400f8478f47bcb39227381ffc5427ea999"}
Mar 20 08:40:56.663222 master-0 kubenswrapper[18707]: I0320 08:40:56.663217 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr" event={"ID":"2bf90db0-f943-464c-8599-e36b4fc32e1c","Type":"ContainerStarted","Data":"205e4e19a489854c9e21dadf12222f24c4ca924c96c05925ba16193713f47edd"}
Mar 20 08:40:56.663409 master-0 kubenswrapper[18707]: I0320 08:40:56.663231 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr" event={"ID":"2bf90db0-f943-464c-8599-e36b4fc32e1c","Type":"ContainerStarted","Data":"3fd8857a2c2302ef6094cc86660ac7c76904e438bf13dc986cbdd24f1c5f29f3"}
Mar 20 08:40:56.663409 master-0 kubenswrapper[18707]: I0320 08:40:56.663243 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" event={"ID":"f5782718-9118-4682-a287-7998cd0304b3","Type":"ContainerStarted","Data":"b0cb9210b4b2bf3cfe59c44d3722ebab536e55adf6ae14c57d5a60f0e9fe993b"}
Mar 20 08:40:56.663409 master-0 kubenswrapper[18707]: I0320 08:40:56.663255 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" event={"ID":"f5782718-9118-4682-a287-7998cd0304b3","Type":"ContainerStarted","Data":"4100829f793cce2954b64f7463941089e6db3bf46fc83040646900405ad68496"}
Mar 20 08:40:56.663409 master-0 kubenswrapper[18707]: I0320 08:40:56.663266 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" event={"ID":"f5782718-9118-4682-a287-7998cd0304b3","Type":"ContainerStarted","Data":"48eeec4a0ddb4a6c0b04997cccaac081db5581507160a9804ec767ba66cbdb34"}
Mar 20 08:40:56.663409 master-0 kubenswrapper[18707]: I0320 08:40:56.663276 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2fp4b" event={"ID":"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7","Type":"ContainerStarted","Data":"2e3f8fb15f65cb56f636062e77511d2b7c7ac1c5b96ff94db9a664613cc3a72a"}
Mar 20 08:40:56.663409 master-0 kubenswrapper[18707]: I0320 08:40:56.663289 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2fp4b" event={"ID":"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7","Type":"ContainerStarted","Data":"1cdce851761fa32d1bf31f749f94743d1cc0bca1a250ad08c114ccc4c2f77b9b"}
Mar 20 08:40:56.663409 master-0 kubenswrapper[18707]: I0320 08:40:56.663299 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" event={"ID":"6a80bd6f-2263-4251-8197-5173193f8afc","Type":"ContainerStarted","Data":"d02eea2683ac6efae0f70fd8f717e97bc121a80e06da8071bf5f8c5bd619e9f0"}
Mar 20 08:40:56.663409 master-0 kubenswrapper[18707]: I0320 08:40:56.663297 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 08:40:56.663409 master-0 kubenswrapper[18707]: I0320 08:40:56.663366 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0"
Mar 20 08:40:56.663409 master-0 kubenswrapper[18707]: I0320 08:40:56.663310 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" event={"ID":"6a80bd6f-2263-4251-8197-5173193f8afc","Type":"ContainerStarted","Data":"199be5e4e7ebbf5d512a41b53ac83599e0d0e19351865228cfbac1cc0f5d6aa8"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663424 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" event={"ID":"6a80bd6f-2263-4251-8197-5173193f8afc","Type":"ContainerStarted","Data":"4d57d4740bcb6c82be46e50156208d72579c0c7e7b87226d99fe9e3e9e1ff1bb"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663436 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hb77b" event={"ID":"a18b9230-de78-41b8-a61e-361b8bb1fbb3","Type":"ContainerStarted","Data":"5a144517ab4145de856f7fc1dbaa9248dc8d50b14986e3a4c4c4e8525dca2fdd"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663447 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hb77b" event={"ID":"a18b9230-de78-41b8-a61e-361b8bb1fbb3","Type":"ContainerStarted","Data":"1e3599d0789edeb8eec1a5568ce03b8378093e7082af3b48ff3c7ee7e6273252"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663458 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-srdjm" event={"ID":"813f91c2-2b37-4681-968d-4217e286e22f","Type":"ContainerStarted","Data":"8b504057c998514e9a6f75544fd4b2e6f3e06b14334afca0cc280a0d4b21513a"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663469 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-srdjm" event={"ID":"813f91c2-2b37-4681-968d-4217e286e22f","Type":"ContainerStarted","Data":"0cdda3b89325b61f030fe62f7e1e40dae9fd6495c82df19791b581b4f2a2b2bd"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663478 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-srdjm" event={"ID":"813f91c2-2b37-4681-968d-4217e286e22f","Type":"ContainerStarted","Data":"48ea9b1e1ed051eaf5386ce4d24d2d55f57d357f51f1c79f94723fc2aed83c0f"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663489 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh" event={"ID":"7c4e7e57-43be-4d31-b523-f7e4d316dce3","Type":"ContainerStarted","Data":"68bec1ef3f4454b1453d2de2db069e48c08d8a5c1a267f409f8da798126b9d46"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663498 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh" event={"ID":"7c4e7e57-43be-4d31-b523-f7e4d316dce3","Type":"ContainerStarted","Data":"9a8426b4146cf2f000b30dafab20a28003c24ae65a16f62f890c575d2f9770d9"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663515 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb6de0a9ae1218db5c085d7b11d80bbaa7dc058173d109d69d1996de9307a7d5"
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663525 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa276789b940e2d115bb484a83b6dacabc102c332a9c60e259aaf653e5bc53d5"
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663534 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qnp9w" event={"ID":"70020125-af49-47d7-8853-fb951c561dc4","Type":"ContainerStarted","Data":"332fd47a2a02eef67172c0fb87a227f35ffda45b1b4e74194c1d5e85f7d71a60"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663544 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qnp9w" event={"ID":"70020125-af49-47d7-8853-fb951c561dc4","Type":"ContainerStarted","Data":"e78425b2d05a4bc50aa45d6948d8f1a6996398b351024eb248c91be99ea577d1"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663553 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"879adddb20c3ea4126b46482343a718dc4153b404d31f5e2d5d624d657e93169"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663565 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"96ff83659a940b2334d29c8de766dce2f56c0eddae2926cc1d3dd2e347430a94"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663577 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"9f89ba02fad31edac59462f307fd61765f540c624fde9c9ab1bb13b80a642b0c"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663590 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"91607e86857f069937668c70d7500f8b27325b07177f03b7a7ab831b3600ac2a"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663602 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" event={"ID":"1db4d695-5a6a-4fbe-b610-3777bfebed79","Type":"ContainerStarted","Data":"17b9005f6779cb1212a0b15002baf44d2f058535e99a27fb893b2250411c5f89"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663612 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" event={"ID":"1db4d695-5a6a-4fbe-b610-3777bfebed79","Type":"ContainerStarted","Data":"856f29ac023f36f98944c12ae603aac4a4e79b44143d8336c20eaae8f55415c9"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663623 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" event={"ID":"1db4d695-5a6a-4fbe-b610-3777bfebed79","Type":"ContainerStarted","Data":"3e60c5cfa98299dd39cc95de18ed36bba9874f58f0e45ca3537df89924dd66d3"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663633 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" event={"ID":"1db4d695-5a6a-4fbe-b610-3777bfebed79","Type":"ContainerStarted","Data":"a07ae992a49295676f3184ce503f903e0b4447cd57b0d7e0c91d07d9a0f3bc30"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663643 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" event={"ID":"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02","Type":"ContainerDied","Data":"32acfc021b8f8071fac0cc1a8b0129efcea8236c65c56620ec15567dda3b37db"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663657 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" event={"ID":"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02","Type":"ContainerStarted","Data":"afe5be40b772a3679b289a32d738409fbbd0267f6b546d1fa3b047d53cf456bb"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663669 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" event={"ID":"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02","Type":"ContainerDied","Data":"0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663680 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" event={"ID":"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02","Type":"ContainerStarted","Data":"0262d134f60647f6e04ff950df203ce5bc3f1656b20c1e15f442731269c3be76"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663498 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663704 18707 scope.go:117] "RemoveContainer" containerID="0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79"
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663690 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" event={"ID":"68252533-bd64-4fc5-838a-cc350cbe77f0","Type":"ContainerStarted","Data":"e90b46b2a24eed1acbde07d446b8c7de8acf8cbdfe00eeb63977c91e3cae9f34"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663774 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" event={"ID":"68252533-bd64-4fc5-838a-cc350cbe77f0","Type":"ContainerDied","Data":"e3f8332f8ebf3d1a4f060a0c3657a0fbf7dd52dab57065dbe98fe23495c66678"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663803 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" event={"ID":"68252533-bd64-4fc5-838a-cc350cbe77f0","Type":"ContainerStarted","Data":"9af8ad1671806bd505a390180b2388970cc534101cf6a5cf64e76e13311c0cb9"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663850 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xnrw6" event={"ID":"c0142d4e-9fd4-4375-a773-bb89b38af654","Type":"ContainerStarted","Data":"1179771f4bb82559252cc032e9b6d619a03143a4bf62b3be7c0a1d8b8023730c"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663864 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xnrw6" event={"ID":"c0142d4e-9fd4-4375-a773-bb89b38af654","Type":"ContainerStarted","Data":"5702154693e32d84807189cf18ed2f8ceb28029864edaaaff188dc529b9551c9"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663875 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" event={"ID":"325f0a83-d56d-4b62-977b-088a7d5f0e00","Type":"ContainerStarted","Data":"3fb2d44fc3d06ba7cfb01123d6eb14daa319280841df97c7fb0370eae6efe992"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663930 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" event={"ID":"325f0a83-d56d-4b62-977b-088a7d5f0e00","Type":"ContainerDied","Data":"d498d1943e73378963ddf1fcb87e4851f496c741db4b494c02cde7a06a7405b7"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663950 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" event={"ID":"325f0a83-d56d-4b62-977b-088a7d5f0e00","Type":"ContainerStarted","Data":"f826050a5c784de5b331265098c5bf62987ce32e6de55cf05cab0f3f76894ea8"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663962 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"fac672fa-7660-449e-a0d1-244dc6282d76","Type":"ContainerDied","Data":"aecbf33029725426faa2806ba773a548665753d84d9ec4f0ac83ae36cdffa3ce"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.663975 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"fac672fa-7660-449e-a0d1-244dc6282d76","Type":"ContainerDied","Data":"49b715d08715612464f503c3f66bf7c99b13a5e872383e023e27eea30084adb2"}
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.664021 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49b715d08715612464f503c3f66bf7c99b13a5e872383e023e27eea30084adb2"
Mar 20 08:40:56.663964 master-0 kubenswrapper[18707]: I0320 08:40:56.664038 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6" event={"ID":"e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97","Type":"ContainerStarted","Data":"17d9a381fe77c2a99690d4e954254b88e9da3b66911db388af1c343ca887780e"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664050 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6" event={"ID":"e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97","Type":"ContainerStarted","Data":"a56a69cfc23cf8add77dfc1a237e33143ff59495f1a2048a86a1759c1954faee"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664061 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" event={"ID":"b543f82e-683d-47c1-af73-4dcede4cf4df","Type":"ContainerStarted","Data":"f1451cb7c441d0c0436b2b43cfeec19d47072600b8f030c7cbe1b7c9914bab91"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664072 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" event={"ID":"b543f82e-683d-47c1-af73-4dcede4cf4df","Type":"ContainerStarted","Data":"285790bb4eeaea0e1399502a5e31c8d8bf1bd484bccae96128ad9795ef9ca21a"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664099 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" event={"ID":"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6","Type":"ContainerStarted","Data":"df7ec56bc0dc6a5103a746a24bbb9fc1482c902df08dcd67e4b6e70f5d055d5f"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664112 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" event={"ID":"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6","Type":"ContainerDied","Data":"8009103455f6e5b4ff5637d33a8290182816ff88ad007a0262e9e5800739bab4"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664127 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" event={"ID":"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6","Type":"ContainerStarted","Data":"51fe4cded0c2312a6b3bfd1f48ff9089513eaa05b60208bd1ced0f39d8ffe36e"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664166 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" event={"ID":"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466","Type":"ContainerStarted","Data":"3ed993094a62661a225afe193a23c8a2caab31f2640837dfe5f3c3a7f7e685b0"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664179 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" event={"ID":"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466","Type":"ContainerStarted","Data":"c7891d6c35a440903f4bcbd4a40f2e1fa48f1550526df732c517bd9b6c44c0c9"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664225 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" event={"ID":"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466","Type":"ContainerStarted","Data":"ead213f06d6e13b0b8afce02cff25edfe82c583b53f661ee9bdc498f394f53a9"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664238 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"4490a747-da2d-4f1a-8986-bc2c1c58424b","Type":"ContainerDied","Data":"0521d9515acccdbef13de273c2fd3fc8c0c08193b40755e745ddfeeb3789e32d"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664248 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"4490a747-da2d-4f1a-8986-bc2c1c58424b","Type":"ContainerDied","Data":"0aa1305a973a71f928c142131df579b42fa3e776fd7926a4aa71bddb2c85fcba"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664258 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aa1305a973a71f928c142131df579b42fa3e776fd7926a4aa71bddb2c85fcba"
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664269 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-6t5vb" event={"ID":"fb0fc10f-5796-4cd5-b8f5-72d678054c24","Type":"ContainerStarted","Data":"b38f5242ee8fec0a4fb77638e3088bf483c8bc44e65e9e4b954af76f0ae77a90"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664302 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-6t5vb" event={"ID":"fb0fc10f-5796-4cd5-b8f5-72d678054c24","Type":"ContainerDied","Data":"11fb9901a3c95280d4ada671324191ca15e473462e8fdb0e196a3a680e2d44a1"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664313 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-6t5vb" event={"ID":"fb0fc10f-5796-4cd5-b8f5-72d678054c24","Type":"ContainerStarted","Data":"2530070030abd272dac9151bcbdcdd74c4a2472ddf19cb97a57dadd8614ece94"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664321 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-6t5vb" event={"ID":"fb0fc10f-5796-4cd5-b8f5-72d678054c24","Type":"ContainerStarted","Data":"37ba8dc9671f8fe4b5333f2dc9ab294ad8d004712a5f4bddaf2f6742452a4b3c"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664331 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"665625ddaf4d7d5a13e6f9aa415e12a52677c52b3254cb6bcb690bbf3d2cdd27"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664346 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"cfd1a328fad8a1fbf228c2c6250b1d586490a1a703c23535a3eb7dcf1bbe5867"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664382 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"df731fba572a5e906cf5649aa4cc14b16e25717d1ba7fd8e7dc02e7ea0aa85be"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664394 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"fe1c54a0e0873f0780c7faf59fe96cab185ee928520426321b3ddc92f25a0204"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664403 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"760e18ae3700547257879a5def4bd8a3e6845f819368981331d76400fe97af9e"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664413 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" event={"ID":"f2217de0-7805-4f5f-8ea5-93b81b7e0236","Type":"ContainerStarted","Data":"54740cdff38742d3ffc49ed74ce0bcac4131631c1e86f7422b93ffc4f5462afe"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664424 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" event={"ID":"f2217de0-7805-4f5f-8ea5-93b81b7e0236","Type":"ContainerStarted","Data":"1038dded4ac6146a3ef7e05fc425b32ac120e0351ec2aaee7b8ebe45679034dd"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664433 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" event={"ID":"de6078d7-2aad-46fe-b17a-b6b38e4eaa41","Type":"ContainerStarted","Data":"614acf21995c6ef4e652413ccece98d1915da356d7813f8b0dcd90d12e6d4a8d"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664443 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" event={"ID":"de6078d7-2aad-46fe-b17a-b6b38e4eaa41","Type":"ContainerStarted","Data":"d47bba92b6fb8946edb6fa2f6a021436ea604b27f3b2a8581b9108a215eab3e8"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664453 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" event={"ID":"de6078d7-2aad-46fe-b17a-b6b38e4eaa41","Type":"ContainerStarted","Data":"8a073909baddafc91ef0c1a8a7d1dde84b7bce6d841c77590b71f4a6754c9117"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664462 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" event={"ID":"c1854ea4-c8e2-4289-84b6-1f18b2ac684f","Type":"ContainerStarted","Data":"eedbb1dfd13f24b92d1505673b2418928be1e1bfdd5eb59005a694a899688fee"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664472 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" event={"ID":"c1854ea4-c8e2-4289-84b6-1f18b2ac684f","Type":"ContainerDied","Data":"f44e15f6f5dbc67eb3bd400bcba69ac1adc2aa6d2546e0cbc2778889121d3bcf"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664483 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" event={"ID":"c1854ea4-c8e2-4289-84b6-1f18b2ac684f","Type":"ContainerStarted","Data":"b77a1ff885b1de50033d17a32311fabaefba7e39efce419b16febb35e5db2498"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664492 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"6ab22b3f2d37bf5775a5ff7b8bee43ed84a6b6e971e88c5aa5c7bea50f737d96"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664502 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"0f63c13521e5ca3a0dcdf96b62fa9e56532f32f43af31de53a30911e593c568a"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664511 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"3030fbaae53d50269bd5e2ef3f7474bc1b880c795eb0e6c078b59b5741beb732"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664545 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"f3b6bb748ad127ad783c317a9ec2edeaa6139af683fa7cda66b138823bc535c8"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664555 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"d0111435fee5f861beed3e54093288b80fe796da20879b74d333fc90fd130f88"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664565 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerDied","Data":"d11c59462bae131a903a2325334d63a1ab5222611dbbb14b008e128c7a2a34a9"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664576 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"fb1900c9365a08f03e6fa39a54bd69429d8cedb421e2b0d1e4f977c7a6faf417"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664586 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" event={"ID":"a57854ac-809a-4745-aaa1-774f0a08a560","Type":"ContainerStarted","Data":"4d0d97d44af51af5156c718231836b8527e98e8ee5a7d3079503faf5682e5428"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664596 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" event={"ID":"a57854ac-809a-4745-aaa1-774f0a08a560","Type":"ContainerStarted","Data":"4d1e15f7704367048583514d8253b668db81d2da2b51801dfb366fd0e7170679"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664606 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"3c189090fb625d43e4a0aad0248864aa122ec54e7ab96d232c95dbcba79fcc95"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664617 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"aa54b546db103668e772defaa4b64e4e2c01b2bc8d91706ab1484cd99b14f9d9"}
Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664625 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"5aef46f74824b7bd8319047c24bacbb9cbf6ac782ef810c2be78f3961d31d75e"}
Mar 20
08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664635 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerDied","Data":"2bb061585e269ad50f22944212861cbe8de65df048c827dffb60a910fb8f58b1"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664645 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"89e373acd0a6c2bca2cd563de1cef238723b46b416d83b3242f9ebb18d644754"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664654 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" event={"ID":"5e3ddf9e-eeb5-4266-b675-092fd4e27623","Type":"ContainerStarted","Data":"c4a834368b75816e5bf327a50499cbf160883d81fc9ea89519da8bf5870c95aa"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664666 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" event={"ID":"5e3ddf9e-eeb5-4266-b675-092fd4e27623","Type":"ContainerStarted","Data":"a134abc184f79415563956f2eeb439b259ce0571570a2fb953199c779754242d"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664676 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" event={"ID":"5e3ddf9e-eeb5-4266-b675-092fd4e27623","Type":"ContainerStarted","Data":"f63327bd315d50a90d1b9877abdb96105e01d6fdb0e17116424c13ecbe62dbc3"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664686 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" 
event={"ID":"91b2899e-8d24-41a0-bec8-d11c67b8f955","Type":"ContainerStarted","Data":"62850383ce84470064c579fd7119b27a28f493cfc80dbd0ce02b112368cba3fd"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664695 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" event={"ID":"91b2899e-8d24-41a0-bec8-d11c67b8f955","Type":"ContainerDied","Data":"239c73fe7acb21fae05105f39e9db825bd1c976521db41e975ce454317c12c28"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664705 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" event={"ID":"91b2899e-8d24-41a0-bec8-d11c67b8f955","Type":"ContainerStarted","Data":"630f3ef68fb2ab037a83499120027474c94dfe12bf91c1a5c52579bd6c878cbf"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664715 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" event={"ID":"f91d1788-027d-432b-be33-ca952a95046a","Type":"ContainerStarted","Data":"a66c13b8d3c8a27dd4cd87a525d5a24a89e0e07f3750199c7db475093b70bb91"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664726 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" event={"ID":"f91d1788-027d-432b-be33-ca952a95046a","Type":"ContainerStarted","Data":"b94c08b7e5587b5832ecd9153622e9d6c7645f7646fa587d7cb88f5fb9199df4"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664736 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" event={"ID":"f91d1788-027d-432b-be33-ca952a95046a","Type":"ContainerStarted","Data":"49245723e92395f35c0b36240f7d6fbf94cef777cdd886e69b77ece8cf42ea5f"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664746 18707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" event={"ID":"ae39c09b-7aef-4615-8ced-0dcad39f23a5","Type":"ContainerStarted","Data":"0e60f2693cbc96c33931a792326fb808ba028038939cac58b0b52b50bec85ee7"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664755 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" event={"ID":"ae39c09b-7aef-4615-8ced-0dcad39f23a5","Type":"ContainerStarted","Data":"62f67b84eaa6aca590ae16bc4212ddb118ad7d5cdbb373eca099fc3bf11c95b9"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664767 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" event={"ID":"ae39c09b-7aef-4615-8ced-0dcad39f23a5","Type":"ContainerStarted","Data":"93a839a770be652f7d12315bd6dada21638d2838dbf7e7ad327b9d696e2d3e07"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664777 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" event={"ID":"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e","Type":"ContainerStarted","Data":"322d73e8b151dad5452501bac1f7dfab899c0c317c5ec70fec1dfe654509113e"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664788 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" event={"ID":"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e","Type":"ContainerStarted","Data":"f8c21c05090492f9afafa02ead2a469af0d1260ed484823064a0610864bf15d8"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664798 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" 
event={"ID":"4fea9b05-222e-4b58-95c8-735fc1cf3a8b","Type":"ContainerStarted","Data":"92b79031f76eecd271206da72b0a3408ff8ea5659094905a6bd063d6847591cb"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664808 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" event={"ID":"4fea9b05-222e-4b58-95c8-735fc1cf3a8b","Type":"ContainerStarted","Data":"02aff2f4c34f3ffc5f01d06a5769735a5d3c6b81311638c6d9a8ab1333acabbf"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664818 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" event={"ID":"4fea9b05-222e-4b58-95c8-735fc1cf3a8b","Type":"ContainerStarted","Data":"6cc2d27a03b36826decc5cc4343612194df412f00fd1e83d62bd9da95cdaba5c"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664827 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" event={"ID":"f53bc282-5937-49ac-ac98-2ee37ccb268d","Type":"ContainerStarted","Data":"20027ed3d8ce2945b58e2ed2edbfa6fa2b33157326dd7e152264af390a255b26"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664838 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" event={"ID":"f53bc282-5937-49ac-ac98-2ee37ccb268d","Type":"ContainerStarted","Data":"84b569249d8110b2d1ae9f2ad68f8d09c1249f9b0523e1c444309cb25f725be7"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664847 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" event={"ID":"f53bc282-5937-49ac-ac98-2ee37ccb268d","Type":"ContainerStarted","Data":"c1a730f8ab11fc1de19f12e0bc73ab4daa97d0d1b64ae152032167057be533ed"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 
08:40:56.664856 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" event={"ID":"75e3e2cc-aa56-41f3-8859-1c086f419d05","Type":"ContainerStarted","Data":"9526eea2cea58cb9e28474105457b96211d2f64f5d2c17947ddff373db76ab0b"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664867 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" event={"ID":"75e3e2cc-aa56-41f3-8859-1c086f419d05","Type":"ContainerDied","Data":"adad78c0e9b98fa52036392a08a46d2f94455f9ec32c4bacb460a8042eb8ee5e"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664878 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" event={"ID":"75e3e2cc-aa56-41f3-8859-1c086f419d05","Type":"ContainerStarted","Data":"df43cdf08fb65d06b9db1ef59770a000ce2240e1f2e40f1d6a1619ba0e8b03df"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664889 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" event={"ID":"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97","Type":"ContainerStarted","Data":"ba0427f5fde4559006d8e5e0960edad997c51f4ea55994718ca7aa91f3b87a5b"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664901 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" event={"ID":"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97","Type":"ContainerStarted","Data":"becf6a9468ee5d2197c4916442372c9501293c27732b04e68b431411779a05c6"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664912 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" 
event={"ID":"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97","Type":"ContainerStarted","Data":"ac63789abc1163c5f9db4788ba7a388635d218ac761e65be8043709a0019d115"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664924 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" event={"ID":"aa16c3bf-2350-46d1-afa0-9477b3ec8877","Type":"ContainerStarted","Data":"222440b4a2f7299de95ce041a034d3160fcac83fac650064e342b5c86cfa35c1"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664934 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" event={"ID":"aa16c3bf-2350-46d1-afa0-9477b3ec8877","Type":"ContainerStarted","Data":"cadfc06b46a2370e89939aa270eb81d7b99f5548dca07c84a3c027dea72e913b"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664944 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" event={"ID":"a25248c0-8de7-4624-b785-f053665fcb23","Type":"ContainerStarted","Data":"6e8c691b31da6df37623c08d39c5a3d5d1885fb91c071985b479d2eb81e7db7d"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664953 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" event={"ID":"a25248c0-8de7-4624-b785-f053665fcb23","Type":"ContainerStarted","Data":"663a1ee240ddbbb57df122e471c26a4e956062c21904538d83d0ecc72e0d36d2"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664964 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" event={"ID":"a25248c0-8de7-4624-b785-f053665fcb23","Type":"ContainerStarted","Data":"de2b21b57e0c31844cd5e9cb7d4c3aefcf1e8f73d137e744bfa1142beaa27fcb"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 
08:40:56.664974 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" event={"ID":"a25248c0-8de7-4624-b785-f053665fcb23","Type":"ContainerStarted","Data":"ddcba86a9171baf183956cb4886e6b2242be15b32412346569edbcdcef731dfb"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664984 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" event={"ID":"469183dd-dc54-467d-82a1-611132ae8ec4","Type":"ContainerStarted","Data":"476d7d163398e477e8e01c588ecf93d6f7b1021117a57e97b3cab709add5591d"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.664995 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" event={"ID":"469183dd-dc54-467d-82a1-611132ae8ec4","Type":"ContainerStarted","Data":"47b82c3aabac1e522a2b9825a1bdcce46331b472c4fd92d80f06797cd3c1f73f"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665005 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" event={"ID":"469183dd-dc54-467d-82a1-611132ae8ec4","Type":"ContainerStarted","Data":"df1df4af888713c77332d729a24c1e1fdb472ce369b8165f8ad6dfbe7c60bbd6"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665023 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="854c2e8507c2801d7ffa3834f81534fd70cc4fe64f92e8f44b54916d68349b2e" Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665033 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-779f85678d-lrzfz" event={"ID":"46de2acc-9f5d-4ecf-befe-a480f86466f5","Type":"ContainerStarted","Data":"6cd834e174ccc442576b15f26c10970d3f2f599cb7a1f56492db3a3174d18af6"} Mar 20 08:40:56.665062 master-0 
kubenswrapper[18707]: I0320 08:40:56.665044 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-779f85678d-lrzfz" event={"ID":"46de2acc-9f5d-4ecf-befe-a480f86466f5","Type":"ContainerStarted","Data":"bcb35fb659786fc08cf75d55d32a2988d612f73857d18fd89b1d7870b73afb52"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665054 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-779f85678d-lrzfz" event={"ID":"46de2acc-9f5d-4ecf-befe-a480f86466f5","Type":"ContainerDied","Data":"a5663c5e028603466a885bf8e6c2930eae2c60da4e4cb920e82d9e29e7d29f42"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665064 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-779f85678d-lrzfz" event={"ID":"46de2acc-9f5d-4ecf-befe-a480f86466f5","Type":"ContainerStarted","Data":"bc3668412459475b58df22c5952b6fe210803ae27cac46ab11b8236701860e95"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665075 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"d245e5b2-a30d-45c8-9b79-6e8096765c14","Type":"ContainerDied","Data":"215282b36f0afcf690dcc7252b249b0c54821b459a2fe5a0ff25640fd36b6290"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665088 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"d245e5b2-a30d-45c8-9b79-6e8096765c14","Type":"ContainerDied","Data":"20814875b6be1da3d4e673bd4cab493f3904bdc8689013ab1822b3b670087e6a"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665096 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20814875b6be1da3d4e673bd4cab493f3904bdc8689013ab1822b3b670087e6a" Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665106 18707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"7e219558-98b7-4528-88cf-97b87cd1eb6c","Type":"ContainerDied","Data":"7ea4527aa6e7513c4d82b891ac586f8993c11e7ba38c1d1a048fc3535809e191"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665115 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"7e219558-98b7-4528-88cf-97b87cd1eb6c","Type":"ContainerDied","Data":"2d6d575a8acffb3103f6760ea1452c58c9caadc565e202f8ee52c999c161d223"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665123 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d6d575a8acffb3103f6760ea1452c58c9caadc565e202f8ee52c999c161d223" Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665132 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" event={"ID":"2f844652-225b-4713-a9ad-cf9bcc348f47","Type":"ContainerStarted","Data":"cdc09fd6c3bb18aaf3523f814928e0e85e0c65581ea0a2f8e18d09f87a8cff20"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665145 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" event={"ID":"2f844652-225b-4713-a9ad-cf9bcc348f47","Type":"ContainerStarted","Data":"fee33178d398a85728734b8702eecb787d89c780d680fd9fa904a7591c14e420"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665155 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"8b1c7a56-5d00-468a-bb8d-dbaf8e854951","Type":"ContainerDied","Data":"5b4a47b78349fa5185bcf45526d28c821dd34bc78966a86b575a5f0037835565"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665167 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" 
event={"ID":"8b1c7a56-5d00-468a-bb8d-dbaf8e854951","Type":"ContainerDied","Data":"ff84ae96bfd96fbd48f779854c556cddaa04ed8b8b78d40e920af9da64968681"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665176 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff84ae96bfd96fbd48f779854c556cddaa04ed8b8b78d40e920af9da64968681" Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665200 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" event={"ID":"1375da42-ecaf-4d86-b554-25fd1c3d00bd","Type":"ContainerStarted","Data":"ef48bc7a298f21dc7e1c4f0e8ec7b05b2de65f0d7e2d6a14897ed741dcf440bd"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665213 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" event={"ID":"1375da42-ecaf-4d86-b554-25fd1c3d00bd","Type":"ContainerStarted","Data":"d3032285e1cfcfd919da168e10b18ee5ee2720e85e2457d64bfd97de17bf8050"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665225 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqqrk" event={"ID":"654b5b1c-2764-415c-bb13-aa06899f4076","Type":"ContainerStarted","Data":"3067a13f78d11d596f1c77026b3bb1da6fa4f2d79a95bf33c69377015a27bf8d"} Mar 20 08:40:56.665062 master-0 kubenswrapper[18707]: I0320 08:40:56.665237 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqqrk" event={"ID":"654b5b1c-2764-415c-bb13-aa06899f4076","Type":"ContainerDied","Data":"40882309ca6cfeb4b89a668f255895a49cb18211d2d4c38846d98d1ae8591f1f"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665248 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqqrk" 
event={"ID":"654b5b1c-2764-415c-bb13-aa06899f4076","Type":"ContainerDied","Data":"ba2c385399d075f7d3be23f2cb6f802e608ef356862884d90b8c839e8667b5b3"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665258 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hqqrk" event={"ID":"654b5b1c-2764-415c-bb13-aa06899f4076","Type":"ContainerStarted","Data":"4308310cb66871b8d038228451f6cb347ae9fe6f0a69349b4baf59dc1b20775d"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665268 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" event={"ID":"acb704a9-6c8d-4378-ae93-e7095b1fce85","Type":"ContainerStarted","Data":"4edf8229d20f78af8d6c9bd895781409ea67ff831e7f63eef258c5dd26b7d38d"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665279 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" event={"ID":"acb704a9-6c8d-4378-ae93-e7095b1fce85","Type":"ContainerStarted","Data":"d1f4c3462eb562d7885b549a3182d1636527f9d646efb4fbbe9ff562004c787d"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665289 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lb4t5" event={"ID":"f1468ec0-2aa4-461c-a62f-e9f067be490f","Type":"ContainerStarted","Data":"b0c63e9d7c1f9be7381bf4be717b03c8b5ca9ba05360c41198144679850f6e32"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665300 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lb4t5" event={"ID":"f1468ec0-2aa4-461c-a62f-e9f067be490f","Type":"ContainerStarted","Data":"8a0af0baf6c97cf8c67073408806404b1f7a015e994a90e5da1cb7cb116ae5cd"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665309 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-lb4t5" event={"ID":"f1468ec0-2aa4-461c-a62f-e9f067be490f","Type":"ContainerDied","Data":"883d2d12dc7b471a5dda61efc08657fb43e4a9f74d94e048d8c741bca0b177ad"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665320 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lb4t5" event={"ID":"f1468ec0-2aa4-461c-a62f-e9f067be490f","Type":"ContainerStarted","Data":"89ef9ca9615c3facb086051e9be1597bf7ab0b883ad4dc7992533c4b3196f4e1"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665330 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j" event={"ID":"47eadda0-35a6-4b5c-a96c-24854be15098","Type":"ContainerStarted","Data":"1dad796a4b96686dc1ca4a32fa60300c9992326d4cb1ceaa47be4941e0d0b81b"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665340 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j" event={"ID":"47eadda0-35a6-4b5c-a96c-24854be15098","Type":"ContainerStarted","Data":"5a9e26d5feffb5d032a36611c5da4c454dba6454147ff1cf841070d4b4feeb67"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665350 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dd9wv" event={"ID":"ee3cc021-67d8-4b7f-b443-16f18228712e","Type":"ContainerStarted","Data":"89d9294562c55f84d7f5035d5fc91869611db748859a24623525e7ba4ce8193e"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665360 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dd9wv" event={"ID":"ee3cc021-67d8-4b7f-b443-16f18228712e","Type":"ContainerStarted","Data":"49cf279e25f73f2651832183fcd1e966f6a074b295d2f5e34a59a5be738d2377"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 
08:40:56.665370 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" event={"ID":"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7","Type":"ContainerStarted","Data":"68df10b3a72fc3b0c353b5fc70a166a2be68d78636e2ecc68d4b89aecbe60781"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665380 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" event={"ID":"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7","Type":"ContainerStarted","Data":"b8d362586f6fb451267fc09fb3ebee6bcd03ae4f33abd963a5893c4d677a7fc5"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665391 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" event={"ID":"df428d5a-c722-4536-8e7f-cdd85c560481","Type":"ContainerStarted","Data":"514dbe166dec5b1f878e0f4a5bf082ca6bf2afdda7b02979cf775c8d27e07456"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665407 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" event={"ID":"df428d5a-c722-4536-8e7f-cdd85c560481","Type":"ContainerStarted","Data":"edc62dc83d0212adeb196aa9fb63d28b17a6054a019750eef25f143d8b2816f1"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665418 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"336ee2eca239c702b23f8eadc224486f445c6fd4853f373a73d423d9f64cfcac"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665429 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"267f2d3e5624276bc815692d6f63750c35fab88bf4fad9637c60210f294ab470"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665438 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"5de11809fbb3db5b6981fddb634a5dbf7f162fcbe9eede8cb63026b2ff7e2a3e"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665446 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"91928dd4bf037a74fc3110c950269e5b4ae8998e3616107aa1170ce1d3fede55"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665454 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"c0c0eafff8c825fc9c4a32593e8d54d61ac68f27a7fde59d8dfb857aeb1580f0"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665463 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"a2139218314ea5d5d1e04c37be758e7a9f90c106dd3c470737be6550fb6322a9"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665474 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"31b5815996c66a028a6e102943aed8dd0cbf1cb918ec3a5b728d9fb0cb098506"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665483 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"91045cb8c13e35ca1f0bfb21ba636da24cd41b91eea8db817a9a5a02317192b3"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665493 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"6860ec0c6307c0854099262d2b68eee9cef0172599ec80b28a89c6d016fb4071"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665503 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" event={"ID":"7e451189-850e-4d19-a40c-40f642d08511","Type":"ContainerStarted","Data":"546c7b24cfc5ec09edc9a677851e1c9898e06c55218cbc617714bc25cf6c07e6"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665517 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" event={"ID":"7e451189-850e-4d19-a40c-40f642d08511","Type":"ContainerStarted","Data":"270e7ed792fece0ff9d9a6dbda1ff1ab238d9c5aab177de687ac26e9f4d69fcc"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665528 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" event={"ID":"7e451189-850e-4d19-a40c-40f642d08511","Type":"ContainerStarted","Data":"fd67ef0d5263721af2c9563f40007bfa8118e9b8e56c96c99f48f239bd0b7044"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665540 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" event={"ID":"a69e8d3a-a0b1-4688-8631-d9f265aa4c69","Type":"ContainerStarted","Data":"3fbc2ee15fb564cff30cd6bea616239c5ddcc5c59f79a881b3626f244e98915b"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 
08:40:56.665552 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" event={"ID":"a69e8d3a-a0b1-4688-8631-d9f265aa4c69","Type":"ContainerStarted","Data":"9786d6c6139bd8ce0f2d0b4a8dd76068f202fd2531fb86bd7b58ce95ed53f64f"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665562 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" event={"ID":"d88ba8e1-ee42-423f-9839-e71cb0265c6c","Type":"ContainerStarted","Data":"402362a050fc8c12c159a90a9bcc448b79348e6b94cef2fffbaa9a0b475c7274"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665573 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" event={"ID":"d88ba8e1-ee42-423f-9839-e71cb0265c6c","Type":"ContainerStarted","Data":"8849c0e374773ff413e6a07005d70c646b3dbfad2bb39cd593ab7f09dab9e689"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665586 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" event={"ID":"d88ba8e1-ee42-423f-9839-e71cb0265c6c","Type":"ContainerStarted","Data":"40d3c58441549fd94b4fd06f62f9b9e1bdfe941a93f1f046de6cd048124dc220"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665597 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" event={"ID":"d88ba8e1-ee42-423f-9839-e71cb0265c6c","Type":"ContainerStarted","Data":"49f836c59184940ba77efca8e250a75cfca3ff592389502fe02c69990736b85f"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665630 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" event={"ID":"bbc0b783-28d5-4554-b49d-c66082546f44","Type":"ContainerStarted","Data":"7ee99faecdaa8ce9ade5aaa3b49dd8416a312e96db798b1de9fced997f6fd077"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665642 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" event={"ID":"bbc0b783-28d5-4554-b49d-c66082546f44","Type":"ContainerStarted","Data":"86d25298a72dddc328867a1ea8164a3314dce7f2eff3eb07267e45c914c0415e"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665652 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" event={"ID":"bbc0b783-28d5-4554-b49d-c66082546f44","Type":"ContainerStarted","Data":"21791b9b344da8b052097bc3f6be11ec8238d51625fab3e6901854f679a950ba"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665664 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerStarted","Data":"e0bba107e6b49f693f3963a5c0c601999ff2c2d961d6645822b30cd922e252a1"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665677 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerDied","Data":"09df5a13ce7374304f28bc120919f2392b8b1eedb768ae74aa71f1f46b1260f3"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665693 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerDied","Data":"0562124cc868051528c8c76baabb685e9f641cfd32418a6cbc0b305b7b8b1525"} Mar 20 
08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665707 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerDied","Data":"ca1d7ca00a56b55ea93c4440e9e959ff93d3c3b08431ba60809fba320b9496a7"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665720 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerDied","Data":"d7a9f93548b4324f9218b5fb15026983da36f57336679426ecdeef802c274095"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665732 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerDied","Data":"2f1eef10f4235bf6943bb1062fc964d69fc5c901795041a7ddca120ef33de66d"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665744 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerDied","Data":"5772594b3f3e6aae19a5e357ad1c9bc0dade5e494667c07e21d51c8697d24253"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665755 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rpbcn" event={"ID":"b98b4efc-6117-487f-9cfc-38ce66dd9570","Type":"ContainerStarted","Data":"a49d9b3d48753f2f078ac0130c6af5ca5c251ec6fd9f68a980d833a929f8987e"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665768 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" 
event={"ID":"f046860d-2d54-4746-8ba2-f8e90fa55e38","Type":"ContainerStarted","Data":"c3742feb1f4aa394282e45f9e7e1ad5a78209b23e0c120a4f3b31f9fa95097bc"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665779 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" event={"ID":"f046860d-2d54-4746-8ba2-f8e90fa55e38","Type":"ContainerDied","Data":"0bd5b879aadefbf9a44c6c6dcf866736c73041c10aa61588b2570b45b21eb4d2"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665792 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" event={"ID":"f046860d-2d54-4746-8ba2-f8e90fa55e38","Type":"ContainerStarted","Data":"16961d83ade56434cc5d6847f42954f60a19529cc2fd922d6eec9e1e749ea461"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665804 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" event={"ID":"ad692349-5089-4afc-85b2-9b6e7997567c","Type":"ContainerStarted","Data":"b10e547edcdc3314e5e478ee6b910f608083e53cf3a4277550ec5bfade59f20f"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665817 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" event={"ID":"ad692349-5089-4afc-85b2-9b6e7997567c","Type":"ContainerDied","Data":"871b2216305e9883e9345351545dfa140768d0b1a9975612361ac0d841fda34c"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665832 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" event={"ID":"ad692349-5089-4afc-85b2-9b6e7997567c","Type":"ContainerStarted","Data":"972f78b16f9b47d7c2f794578bdc02049da4958a2a754da51e80ca31806a5573"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665848 18707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jstrn" event={"ID":"c593e31d-82b5-4d42-992e-6b050ccf3019","Type":"ContainerStarted","Data":"f7a9dfc612086fe204a444301a032e9febcd36b7ff057eed4b49245b1a8cb51b"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665861 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jstrn" event={"ID":"c593e31d-82b5-4d42-992e-6b050ccf3019","Type":"ContainerDied","Data":"06e5d2b0055041a2ae0e49aa4151d374fab625e94cc004550ff8ef85c3bfe80e"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665872 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jstrn" event={"ID":"c593e31d-82b5-4d42-992e-6b050ccf3019","Type":"ContainerDied","Data":"633b6e62526e4e0cbd07fd4d4b0af4afaceded4e0aa25ebac529d888061b8a40"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665883 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jstrn" event={"ID":"c593e31d-82b5-4d42-992e-6b050ccf3019","Type":"ContainerStarted","Data":"ef4e9117db9997ed5777e4eb0ac97e214bfa4a8d0f2ec4d63af464bd290bf782"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665894 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-w2zwp" event={"ID":"fdfdabb8-83d6-4b38-a709-9e354062ba1a","Type":"ContainerDied","Data":"ef7d3c19081b3942ae839231125bb3d9ed41e1148d63c694dd308a85f91f661c"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665907 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-w2zwp" event={"ID":"fdfdabb8-83d6-4b38-a709-9e354062ba1a","Type":"ContainerDied","Data":"1ffb3901c3fd86821b7075a50ea55db7d24b5dcb1f13fa6241a8b0a2754ace0c"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665917 18707 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ffb3901c3fd86821b7075a50ea55db7d24b5dcb1f13fa6241a8b0a2754ace0c" Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665927 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"2f574795ee9d934844b92324a83362cb7abdf8cc28431e8355456d552139443f"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665937 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"9da40744da0c1f755b7ca8d13405871816427a42b29bf11d678dd70f488e5c6a"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665948 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" event={"ID":"86cb5d23-df7f-4f67-8086-1789d8e68544","Type":"ContainerStarted","Data":"517434b092860d80f200ad453a8ab960ca389e8d7a3ffc04820cc51b48ee30fe"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665964 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" event={"ID":"86cb5d23-df7f-4f67-8086-1789d8e68544","Type":"ContainerDied","Data":"aba047bf62fdafe49170c75e4151358bb7421fff8f12793b7c505a061479cedc"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665977 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" event={"ID":"86cb5d23-df7f-4f67-8086-1789d8e68544","Type":"ContainerDied","Data":"e5ff2b8e0eaf0ec4027bec63b2223736cf7655c761ce84b8c07f00889c293502"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.665988 
18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" event={"ID":"86cb5d23-df7f-4f67-8086-1789d8e68544","Type":"ContainerDied","Data":"eb336787da69f3659db83e9b59377f619a1bab475c9f6f4fe67e34d16e998717"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666002 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" event={"ID":"86cb5d23-df7f-4f67-8086-1789d8e68544","Type":"ContainerStarted","Data":"dc885deb2f8a42b2a9647ca9a7a52b0dd31aa93757cc5e47ba821f5cab2f46b8"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666013 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" event={"ID":"3f471ecc-922c-4cb1-9bdd-fdb5da08c592","Type":"ContainerStarted","Data":"b7f59c60792cc7ff5e71be447612403a3bb4cc5643976d4b99c5a00201eb0b72"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666024 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" event={"ID":"3f471ecc-922c-4cb1-9bdd-fdb5da08c592","Type":"ContainerStarted","Data":"cebc013f8d48a631981ef28b781b77dd8f3f5a8d8bf87e9f117a4185f222c73e"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666038 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" event={"ID":"3f471ecc-922c-4cb1-9bdd-fdb5da08c592","Type":"ContainerStarted","Data":"3092eb7a16220393b74c3ca8c6aedf7058f62f9313af91e571c5d2e31d050e35"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666051 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtqgc" 
event={"ID":"b639e578-628e-404d-b759-8b6e84e771d9","Type":"ContainerStarted","Data":"b8bdf077983bcc6ca23493b1788cffc2d1c4bb1c6018cd76d67efa10f3e3c4d0"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666063 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtqgc" event={"ID":"b639e578-628e-404d-b759-8b6e84e771d9","Type":"ContainerDied","Data":"70cbec433b4a5afb013d99a248cda222d66b2abdceddd1d72d46fce02f57b45d"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666077 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtqgc" event={"ID":"b639e578-628e-404d-b759-8b6e84e771d9","Type":"ContainerDied","Data":"1270d65f2b1bd2bb8e9f27e0d20a7b179ff2340a8f906b8f6439c6b5966d578b"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666092 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dtqgc" event={"ID":"b639e578-628e-404d-b759-8b6e84e771d9","Type":"ContainerStarted","Data":"d48534fe1c98270494577c8d49aed8602c14ccc175395517708a7b89389db471"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666102 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" event={"ID":"fa759777-de22-4440-a3d3-ad429a3b8e7b","Type":"ContainerStarted","Data":"ef88f829645fabb894212937333305cd3d87e5b021830721d4e5e9b594609690"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666112 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" event={"ID":"fa759777-de22-4440-a3d3-ad429a3b8e7b","Type":"ContainerDied","Data":"9fc8932c5b9da1ed7961a714ed5b905683dbfe51746690c485563d67ef3b0b29"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666122 
18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" event={"ID":"fa759777-de22-4440-a3d3-ad429a3b8e7b","Type":"ContainerStarted","Data":"46a12190f11c7c4d27d18246d17e718f0fd8c23ea7d374844186aa5295484abf"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666132 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"e1d21f11-7386-4a04-a82e-5a03f3602a3b","Type":"ContainerDied","Data":"bf471cdb978763d680a893df02a2a47dbe930e97fc0ccb05e480229f6feda593"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666142 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"e1d21f11-7386-4a04-a82e-5a03f3602a3b","Type":"ContainerDied","Data":"fc9fcf2245b5e00e0473ecdf9c16e18d2e148c7fa6e4f86bf8df81bc8b274006"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666150 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc9fcf2245b5e00e0473ecdf9c16e18d2e148c7fa6e4f86bf8df81bc8b274006" Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666159 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"5bde0c42e4478c0c2b1f9cfccbfe9763429b578e44acd46f68165c02cc1775d9"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666214 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"97323a42971f498c1d4a021f8b56f02bad8b5f835d83cca8811f50e11754376d"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666225 18707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"c852ab295eb4b8ecb5f260bea21018b1e68a8d8bb5cfc3cbdbcada9f4439cadb"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666235 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"dae890376a07a525dd48bc3450179c32c083e3bb463f45b933386a53e1383fa6"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666244 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"346250461efb9157130994da8b48e7d0351db81bbd48a603a6aff3e21924579d"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666253 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"64ff076f67738b1cdfe3015490df851855bf3f60f28119fcbb4633f4e27fd2e7"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666264 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"7169d72dd34d5edaa756497d6149ce488f989f970423b41a486bae8df6c73e89"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666276 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"f48fa507ff5a87b0b98dbe8df038ac6da6336decb46491d119e2c7e9b5563a25"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666288 18707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerDied","Data":"f4a74ff585c6a7d1deca8c58f38e8ca10a816620bf09146c1f9ff9a31d89c1a7"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666301 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" event={"ID":"248a3d2f-3be4-46bf-959c-79d28736c0d6","Type":"ContainerStarted","Data":"ef2664ab7688b2c180ba8801467801f25c2ef381f1cb268907fa44549876a809"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666311 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" event={"ID":"a9a9ecf2-c476-4962-8333-21f242dbcb89","Type":"ContainerStarted","Data":"c11d7390f44d6be5bc3fcba45513f31241f9e0af79908ed280e1f11a5db8a9db"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666321 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" event={"ID":"a9a9ecf2-c476-4962-8333-21f242dbcb89","Type":"ContainerStarted","Data":"0885038f20b8301926b3fbf1704c402c0cf75d3a9c64af977d4466bc2f1fe1b6"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666332 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" event={"ID":"b4291bfd-53d9-4c78-b7cb-d7eb46560528","Type":"ContainerStarted","Data":"7d9ef09c05c17f91e19a7e2b31b502d477af56141dfbd1c2fd48a2cadd1f3194"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666341 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" event={"ID":"b4291bfd-53d9-4c78-b7cb-d7eb46560528","Type":"ContainerStarted","Data":"3f9c2dbd6bdf8182b597345f8c7fea11c09d5e650fe0f55bf00a3c9f8887aa52"} Mar 20 
08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666351 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" event={"ID":"21bebade-17fa-444e-92a9-eea53d6cd673","Type":"ContainerStarted","Data":"8b198f10122a271a46d1da5f2f799d55468d2123b4b2ad74d6f0cb05641e6136"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666361 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" event={"ID":"21bebade-17fa-444e-92a9-eea53d6cd673","Type":"ContainerStarted","Data":"a74f65fd79254bc2069d4d58186204f182bbdda409cb8c9a6055b2b1614423bb"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666372 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" event={"ID":"21bebade-17fa-444e-92a9-eea53d6cd673","Type":"ContainerStarted","Data":"0e8cf476b590f62cf998ca26bf5d07d7ec24559896e92dec128d4a67e132990a"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666382 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" event={"ID":"c7f5e6cd-e093-409a-8758-d3db7a7eb32c","Type":"ContainerStarted","Data":"43664e36cb7b60519ab710dfbfcb9bd2c63951d962e394659ce8bb21e98ebbb9"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666393 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" event={"ID":"c7f5e6cd-e093-409a-8758-d3db7a7eb32c","Type":"ContainerStarted","Data":"b78496ecf995c35d24dfd3908193418c538ce4e684ecf45bdc674a187caf26f7"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666401 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" 
event={"ID":"c7f5e6cd-e093-409a-8758-d3db7a7eb32c","Type":"ContainerStarted","Data":"60bbe3130a4d3341abf02d519702749eab204c658db33e134e22b746126c5516"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666409 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" event={"ID":"ab175f7e-a5e8-4fda-98c9-6d052a221a83","Type":"ContainerStarted","Data":"c27171e28b2189b6b8d129f565f467a424f94c7f7677d037a89da262ce118c31"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666419 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" event={"ID":"ab175f7e-a5e8-4fda-98c9-6d052a221a83","Type":"ContainerDied","Data":"32a6a1f6b15d34dd1d0099bb33c1df2bcbe7798374f91764aaa7bb5b94a96471"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666430 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" event={"ID":"ab175f7e-a5e8-4fda-98c9-6d052a221a83","Type":"ContainerDied","Data":"6fafac3f004d2f582e04bae9436b72da7fad6247504ddaf33a3c755f3641fa2c"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666439 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" event={"ID":"ab175f7e-a5e8-4fda-98c9-6d052a221a83","Type":"ContainerStarted","Data":"5d21afac0935094080df835d139dd20efdf51fad3502782c60bb38ee7294f13b"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666447 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v5h69" event={"ID":"12e1d9e5-96b5-4367-81a5-d87b3f93d8da","Type":"ContainerStarted","Data":"2cc9c806c560b46d56c6690ddba6f9750c6827399375c05fca98bbd78728f7b8"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666457 18707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v5h69" event={"ID":"12e1d9e5-96b5-4367-81a5-d87b3f93d8da","Type":"ContainerStarted","Data":"4d3b35b91bdc99e207b007c75a667fa870de699431a48d9ff6499d3e08d2063c"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666465 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v5h69" event={"ID":"12e1d9e5-96b5-4367-81a5-d87b3f93d8da","Type":"ContainerStarted","Data":"989d132822ac99b97c52492bc7539dcc4d25a3a8fbced6fed73e66c9b3f74f8d"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666472 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-gj4pm" event={"ID":"4ddac301-a604-4f07-8849-5928befd336e","Type":"ContainerStarted","Data":"1ceab06d6d63f112c8a02eecb5b4790818231a1b5a5eaba30ed3842eed1cfc03"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666484 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-gj4pm" event={"ID":"4ddac301-a604-4f07-8849-5928befd336e","Type":"ContainerStarted","Data":"e9d5f349b622bea576ae3dd04cdf2c2da1c82af6b9e42a0b5011a9e0e2cc47e6"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666494 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" event={"ID":"45e8b72b-564c-4bb1-b911-baff2d6c87ad","Type":"ContainerStarted","Data":"6af5a2f2427206da008543d3c2e9de1d09b1789d70a831c73c81aa5b4993f15e"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666503 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" event={"ID":"45e8b72b-564c-4bb1-b911-baff2d6c87ad","Type":"ContainerStarted","Data":"948f733f9e7fc399ff3028ac75f39dbd9ac2f6622b269cc750e23eb9c88dedb1"} Mar 20 08:40:56.670480 master-0 
kubenswrapper[18707]: I0320 08:40:56.666513 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" event={"ID":"a638c468-010c-4da3-ad62-26f5f2bbdbb9","Type":"ContainerStarted","Data":"ba86f095c655a63b2bbaa35e0d6ce4a38b529d13f07b295310237aeaa7d4bc4f"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666522 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" event={"ID":"a638c468-010c-4da3-ad62-26f5f2bbdbb9","Type":"ContainerStarted","Data":"e9d1c009ab8bfc1b5d9e5ffc8e765a713719b93c5edffb019bac7a72776dcb9e"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666532 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" event={"ID":"c2a23d24-9e09-431e-8c3b-8456ff51a8d0","Type":"ContainerStarted","Data":"bf7423bac144bcaaf3719ed8e76389e5f2ec9717aa4868ad4761ed7cc6782d76"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666544 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" event={"ID":"c2a23d24-9e09-431e-8c3b-8456ff51a8d0","Type":"ContainerDied","Data":"4f3597589c3cef89a127e7d3fec06145ebc7ceb92ec5a686f505064f491c35a3"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666555 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" event={"ID":"c2a23d24-9e09-431e-8c3b-8456ff51a8d0","Type":"ContainerStarted","Data":"919c09e620d76c77a9425f79c1d3a73bcd978ef6fd46e99de901376458ec1273"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666568 18707 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="24e53548727bf62b2d0124119a6e8fe69dde1c3031b4a91a063f8dd58773fece" Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666577 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" event={"ID":"96de6024-e20f-4b52-9294-b330d65e4153","Type":"ContainerStarted","Data":"08ddad0132f1249a0926348017f97a2ace0dd09f6ef4f6ff1405a91cd7359f8f"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666586 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" event={"ID":"96de6024-e20f-4b52-9294-b330d65e4153","Type":"ContainerStarted","Data":"07c77bfc7ae9afd423082de8b582bc56bb7322d1ee441fcf749b3f543d9311b5"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666594 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" event={"ID":"7b489385-2c96-4a97-8b31-362162de020e","Type":"ContainerStarted","Data":"efd6836c5e507ec16e6e082bc5946a6c45ff929a136363cbf3994fcefbdc7906"} Mar 20 08:40:56.670480 master-0 kubenswrapper[18707]: I0320 08:40:56.666604 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" event={"ID":"7b489385-2c96-4a97-8b31-362162de020e","Type":"ContainerStarted","Data":"8a61c21711f690cdda83fe881555e8ad64b01a2f6d1c312d8da79d83d36082f5"} Mar 20 08:40:56.693332 master-0 kubenswrapper[18707]: I0320 08:40:56.687414 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 08:40:56.694614 master-0 kubenswrapper[18707]: I0320 08:40:56.694326 18707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:40:56.697257 master-0 kubenswrapper[18707]: I0320 
08:40:56.697208 18707 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 20 08:40:56.697257 master-0 kubenswrapper[18707]: I0320 08:40:56.697245 18707 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 20 08:40:56.697257 master-0 kubenswrapper[18707]: I0320 08:40:56.697256 18707 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 20 08:40:56.697987 master-0 kubenswrapper[18707]: I0320 08:40:56.697766 18707 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.710711 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd6vv\" (UniqueName: \"kubernetes.io/projected/5e3ddf9e-eeb5-4266-b675-092fd4e27623-kube-api-access-xd6vv\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.710794 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5782718-9118-4682-a287-7998cd0304b3-proxy-tls\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.710829 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-ovn\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 
08:40:56.710853 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqdlf\" (UniqueName: \"kubernetes.io/projected/325f0a83-d56d-4b62-977b-088a7d5f0e00-kube-api-access-lqdlf\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.710880 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.710904 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/47eadda0-35a6-4b5c-a96c-24854be15098-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-tkc2j\" (UID: \"47eadda0-35a6-4b5c-a96c-24854be15098\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.710927 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-script-lib\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.710954 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.710980 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-etc-kubernetes\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711021 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27qvw\" (UniqueName: \"kubernetes.io/projected/bbc0b783-28d5-4554-b49d-c66082546f44-kube-api-access-27qvw\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711044 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqd2v\" (UniqueName: \"kubernetes.io/projected/6a80bd6f-2263-4251-8197-5173193f8afc-kube-api-access-tqd2v\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711067 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711085 18707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-bin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711107 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qstvb\" (UniqueName: \"kubernetes.io/projected/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-kube-api-access-qstvb\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711128 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysctl-conf\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711149 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711169 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/7e451189-850e-4d19-a40c-40f642d08511-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711209 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711230 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711248 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-bin\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711272 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmrxh\" (UniqueName: \"kubernetes.io/projected/ab175f7e-a5e8-4fda-98c9-6d052a221a83-kube-api-access-zmrxh\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711300 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711324 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cni-binary-copy\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711348 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb0fc10f-5796-4cd5-b8f5-72d678054c24-webhook-cert\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711368 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711390 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8bxz\" (UniqueName: \"kubernetes.io/projected/96de6024-e20f-4b52-9294-b330d65e4153-kube-api-access-z8bxz\") pod \"csi-snapshot-controller-64854d9cff-f44gr\" (UID: \"96de6024-e20f-4b52-9294-b330d65e4153\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" Mar 20 
08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711409 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711429 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtcnq\" (UniqueName: \"kubernetes.io/projected/df428d5a-c722-4536-8e7f-cdd85c560481-kube-api-access-dtcnq\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711448 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-metrics-tls\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711469 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68252533-bd64-4fc5-838a-cc350cbe77f0-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711486 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-config-volume\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711505 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711526 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/86cb5d23-df7f-4f67-8086-1789d8e68544-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711548 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhkh7\" (UniqueName: \"kubernetes.io/projected/f046860d-2d54-4746-8ba2-f8e90fa55e38-kube-api-access-xhkh7\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711583 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-etcd-serving-ca\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:56.717022 master-0 
kubenswrapper[18707]: I0320 08:40:56.711606 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr8br\" (UniqueName: \"kubernetes.io/projected/3de37144-a9ab-45fb-a23f-2287a5198edf-kube-api-access-zr8br\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711621 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2qf7\" (UniqueName: \"kubernetes.io/projected/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-kube-api-access-g2qf7\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711641 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-kubelet\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711658 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-var-lib-kubelet\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711682 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-env-overrides\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711702 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-etcd-client\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711723 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-run\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711742 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711763 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa759777-de22-4440-a3d3-ad429a3b8e7b-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711784 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-conf-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711802 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5v9f\" (UniqueName: \"kubernetes.io/projected/a9a9ecf2-c476-4962-8333-21f242dbcb89-kube-api-access-d5v9f\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711824 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-cache\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711843 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711862 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68252533-bd64-4fc5-838a-cc350cbe77f0-config\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711914 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2d4r\" (UniqueName: \"kubernetes.io/projected/f53bc282-5937-49ac-ac98-2ee37ccb268d-kube-api-access-b2d4r\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711935 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711954 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-ovnkube-identity-cm\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.711986 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-default-certificate\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712005 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9a9ecf2-c476-4962-8333-21f242dbcb89-serving-cert\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712027 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46de2acc-9f5d-4ecf-befe-a480f86466f5-audit-dir\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712069 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f9vt\" (UniqueName: \"kubernetes.io/projected/46de2acc-9f5d-4ecf-befe-a480f86466f5-kube-api-access-4f9vt\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712101 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7e451189-850e-4d19-a40c-40f642d08511-cache\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712120 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysconfig\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 
08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712151 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/45e8b72b-564c-4bb1-b911-baff2d6c87ad-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712215 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d57k\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-kube-api-access-8d57k\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712242 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-key\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712286 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njjkt\" (UniqueName: \"kubernetes.io/projected/813f91c2-2b37-4681-968d-4217e286e22f-kube-api-access-njjkt\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712319 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ad692349-5089-4afc-85b2-9b6e7997567c-metrics-tls\") pod 
\"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712363 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clz6w\" (UniqueName: \"kubernetes.io/projected/75e3e2cc-aa56-41f3-8859-1c086f419d05-kube-api-access-clz6w\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712397 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s69rd\" (UniqueName: \"kubernetes.io/projected/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-kube-api-access-s69rd\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712497 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-metrics-certs\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712644 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-system-cni-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 
08:40:56.712673 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66kz7\" (UniqueName: \"kubernetes.io/projected/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-kube-api-access-66kz7\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712738 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712774 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-env-overrides\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712841 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8crkc\" (UniqueName: \"kubernetes.io/projected/a18b9230-de78-41b8-a61e-361b8bb1fbb3-kube-api-access-8crkc\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712872 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qns9g\" (UniqueName: \"kubernetes.io/projected/2bf90db0-f943-464c-8599-e36b4fc32e1c-kube-api-access-qns9g\") pod \"migrator-8487694857-w5tlr\" (UID: 
\"2bf90db0-f943-464c-8599-e36b4fc32e1c\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712894 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2028761b8522f874dcebf13c4683d033\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712923 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/7e451189-850e-4d19-a40c-40f642d08511-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.712963 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-7t5qv\" (UID: \"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713008 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-env-overrides\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:40:56.717022 master-0 
kubenswrapper[18707]: I0320 08:40:56.713030 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-serving-cert\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713082 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-client-ca\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713130 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-cnibin\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713170 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713209 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-apiservice-cert\") pod 
\"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713308 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46de2acc-9f5d-4ecf-befe-a480f86466f5-node-pullsecrets\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713344 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e3e2cc-aa56-41f3-8859-1c086f419d05-config\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713372 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-kubernetes\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713397 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvjkr\" (UniqueName: \"kubernetes.io/projected/7c4e7e57-43be-4d31-b523-f7e4d316dce3-kube-api-access-bvjkr\") pod \"csi-snapshot-controller-operator-5f5d689c6b-4dqrh\" (UID: \"7c4e7e57-43be-4d31-b523-f7e4d316dce3\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713426 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-trusted-ca-bundle\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713452 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4291bfd-53d9-4c78-b7cb-d7eb46560528-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713474 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa759777-de22-4440-a3d3-ad429a3b8e7b-config\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713505 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713525 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-audit-policies\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: 
\"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713547 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3de37144-a9ab-45fb-a23f-2287a5198edf-audit-dir\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713567 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqtvp\" (UniqueName: \"kubernetes.io/projected/91b2899e-8d24-41a0-bec8-d11c67b8f955-kube-api-access-hqtvp\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713585 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-systemd-units\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713620 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtc2p\" (UniqueName: \"kubernetes.io/projected/248a3d2f-3be4-46bf-959c-79d28736c0d6-kube-api-access-mtc2p\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713639 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713666 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snmpq\" (UniqueName: \"kubernetes.io/projected/7e451189-850e-4d19-a40c-40f642d08511-kube-api-access-snmpq\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713698 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713771 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1375da42-ecaf-4d86-b554-25fd1c3d00bd-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713798 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nvl4\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-kube-api-access-9nvl4\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " 
pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713837 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/7e451189-850e-4d19-a40c-40f642d08511-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713862 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-audit\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713894 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-encryption-config\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713928 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-netns\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713947 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: 
\"kubernetes.io/host-path/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.713980 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-client\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714060 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714098 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh47j\" (UniqueName: \"kubernetes.io/projected/fb0fc10f-5796-4cd5-b8f5-72d678054c24-kube-api-access-lh47j\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714137 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-trusted-ca-bundle\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 
08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714179 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cnibin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714214 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ee3cc021-67d8-4b7f-b443-16f18228712e-iptables-alerter-script\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714262 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-modprobe-d\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714281 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714318 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75r7k\" (UniqueName: \"kubernetes.io/projected/68252533-bd64-4fc5-838a-cc350cbe77f0-kube-api-access-75r7k\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: 
\"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714425 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-netns\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714453 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvx6w\" (UniqueName: \"kubernetes.io/projected/acb704a9-6c8d-4378-ae93-e7095b1fce85-kube-api-access-xvx6w\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714484 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-tuned\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714516 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-config\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714546 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-os-release\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714606 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ad692349-5089-4afc-85b2-9b6e7997567c-host-etc-kube\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714633 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-multus\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714650 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-etc-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714672 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-config\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714694 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714723 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-systemd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714782 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a57854ac-809a-4745-aaa1-774f0a08a560-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714801 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-host\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714824 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") 
" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714870 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-config\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:40:56.717022 master-0 kubenswrapper[18707]: I0320 08:40:56.714904 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.715007 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-netd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.715034 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57854ac-809a-4745-aaa1-774f0a08a560-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.715081 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-224dg\" (UniqueName: 
\"kubernetes.io/projected/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-kube-api-access-224dg\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.715123 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-serving-cert\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.715150 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.718888 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.719441 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.719996 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.720204 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.720313 18707 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.720409 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.720659 18707 scope.go:117] "RemoveContainer" containerID="0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.720927 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.721085 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: E0320 08:40:56.721352 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79\": container with ID starting with 0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79 not found: ID does not exist" containerID="0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.721744 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79"} err="failed to get container status \"0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79\": rpc error: code = NotFound desc = could not find container \"0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79\": container with ID starting with 0c7a85a881c1e6ccd13a87741cbb2be0fd8a4f88ff19bd0cb0941e6a43061f79 not found: ID does not 
exist" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.722147 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.722166 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.722349 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-env-overrides\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.722649 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-env-overrides\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.723049 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/813f91c2-2b37-4681-968d-4217e286e22f-metrics-certs\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.723120 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/86cb5d23-df7f-4f67-8086-1789d8e68544-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.723279 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-env-overrides\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.723656 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa759777-de22-4440-a3d3-ad429a3b8e7b-config\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.723711 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.723823 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e3e2cc-aa56-41f3-8859-1c086f419d05-config\") pod 
\"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.723948 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" Mar 20 08:40:56.724397 master-0 kubenswrapper[18707]: I0320 08:40:56.724050 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb0fc10f-5796-4cd5-b8f5-72d678054c24-webhook-cert\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:40:56.725311 master-0 kubenswrapper[18707]: I0320 08:40:56.724696 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 08:40:56.725311 master-0 kubenswrapper[18707]: I0320 08:40:56.725006 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 20 08:40:56.725998 master-0 kubenswrapper[18707]: I0320 08:40:56.725929 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 08:40:56.726597 master-0 kubenswrapper[18707]: E0320 08:40:56.726092 18707 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-master-0\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:40:56.726597 master-0 kubenswrapper[18707]: I0320 08:40:56.726288 18707 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 08:40:56.726597 master-0 kubenswrapper[18707]: I0320 08:40:56.726452 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 08:40:56.726597 master-0 kubenswrapper[18707]: I0320 08:40:56.726554 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 20 08:40:56.726597 master-0 kubenswrapper[18707]: I0320 08:40:56.726588 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 20 08:40:56.726749 master-0 kubenswrapper[18707]: I0320 08:40:56.726679 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 20 08:40:56.726749 master-0 kubenswrapper[18707]: I0320 08:40:56.726695 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 20 08:40:56.726845 master-0 kubenswrapper[18707]: I0320 08:40:56.726779 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 08:40:56.726845 master-0 kubenswrapper[18707]: I0320 08:40:56.726790 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 08:40:56.727180 master-0 kubenswrapper[18707]: I0320 08:40:56.727105 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 08:40:56.727180 master-0 kubenswrapper[18707]: I0320 08:40:56.727150 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 08:40:56.727304 master-0 kubenswrapper[18707]: I0320 
08:40:56.727253 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 08:40:56.727517 master-0 kubenswrapper[18707]: I0320 08:40:56.727365 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 08:40:56.727517 master-0 kubenswrapper[18707]: I0320 08:40:56.727399 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 08:40:56.727517 master-0 kubenswrapper[18707]: I0320 08:40:56.727111 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 08:40:56.727625 master-0 kubenswrapper[18707]: I0320 08:40:56.727536 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 08:40:56.727625 master-0 kubenswrapper[18707]: I0320 08:40:56.727616 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 20 08:40:56.727688 master-0 kubenswrapper[18707]: I0320 08:40:56.727643 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 08:40:56.733401 master-0 kubenswrapper[18707]: I0320 08:40:56.733333 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-node-bootstrap-token\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm" Mar 20 08:40:56.735991 master-0 kubenswrapper[18707]: I0320 08:40:56.735925 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:40:56.736605 master-0 kubenswrapper[18707]: I0320 08:40:56.736572 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-tuned\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.737642 master-0 kubenswrapper[18707]: I0320 08:40:56.737588 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68252533-bd64-4fc5-838a-cc350cbe77f0-config\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:40:56.738129 master-0 kubenswrapper[18707]: I0320 08:40:56.738110 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fb0fc10f-5796-4cd5-b8f5-72d678054c24-ovnkube-identity-cm\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:40:56.738675 master-0 kubenswrapper[18707]: I0320 08:40:56.738659 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7e451189-850e-4d19-a40c-40f642d08511-cache\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:40:56.738989 master-0 kubenswrapper[18707]: I0320 08:40:56.738975 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/45e8b72b-564c-4bb1-b911-baff2d6c87ad-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:40:56.739314 master-0 kubenswrapper[18707]: I0320 08:40:56.739283 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-cache\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:40:56.739448 master-0 kubenswrapper[18707]: I0320 08:40:56.739434 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ad692349-5089-4afc-85b2-9b6e7997567c-metrics-tls\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" Mar 20 08:40:56.740048 master-0 kubenswrapper[18707]: I0320 08:40:56.740021 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-config\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.740227 master-0 kubenswrapper[18707]: I0320 08:40:56.740212 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-client\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:40:56.740611 master-0 kubenswrapper[18707]: I0320 08:40:56.740596 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-config\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:40:56.740888 master-0 kubenswrapper[18707]: I0320 08:40:56.740871 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:40:56.741261 master-0 kubenswrapper[18707]: I0320 08:40:56.741240 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/df428d5a-c722-4536-8e7f-cdd85c560481-srv-cert\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:40:56.741569 master-0 kubenswrapper[18707]: I0320 08:40:56.741551 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-config\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:40:56.742123 master-0 kubenswrapper[18707]: I0320 08:40:56.741733 18707 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 08:40:56.742326 master-0 kubenswrapper[18707]: I0320 08:40:56.742254 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 08:40:56.742326 master-0 kubenswrapper[18707]: I0320 08:40:56.741777 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 08:40:56.742469 master-0 kubenswrapper[18707]: I0320 08:40:56.742447 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 08:40:56.742525 master-0 kubenswrapper[18707]: I0320 08:40:56.741807 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 08:40:56.742640 master-0 kubenswrapper[18707]: I0320 08:40:56.741852 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 08:40:56.742770 master-0 kubenswrapper[18707]: I0320 08:40:56.741865 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 20 08:40:56.742840 master-0 kubenswrapper[18707]: I0320 08:40:56.733446 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5782718-9118-4682-a287-7998cd0304b3-mcd-auth-proxy-config\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:40:56.742917 master-0 kubenswrapper[18707]: I0320 08:40:56.742894 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovn-node-metrics-cert\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.742955 master-0 kubenswrapper[18707]: I0320 08:40:56.742932 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:40:56.742986 master-0 kubenswrapper[18707]: I0320 08:40:56.742969 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:40:56.743029 master-0 kubenswrapper[18707]: I0320 08:40:56.743007 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:40:56.743069 master-0 kubenswrapper[18707]: I0320 08:40:56.743042 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-systemd\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 
08:40:56.743100 master-0 kubenswrapper[18707]: I0320 08:40:56.743070 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-daemon-config\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.743464 master-0 kubenswrapper[18707]: I0320 08:40:56.743102 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.743464 master-0 kubenswrapper[18707]: I0320 08:40:56.741877 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 08:40:56.743464 master-0 kubenswrapper[18707]: I0320 08:40:56.743129 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-os-release\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:40:56.743464 master-0 kubenswrapper[18707]: I0320 08:40:56.743155 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-socket-dir-parent\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.743464 master-0 kubenswrapper[18707]: I0320 08:40:56.743406 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/325f0a83-d56d-4b62-977b-088a7d5f0e00-config\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" Mar 20 08:40:56.743464 master-0 kubenswrapper[18707]: I0320 08:40:56.743441 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysctl-d\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.743656 master-0 kubenswrapper[18707]: I0320 08:40:56.743470 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a18b9230-de78-41b8-a61e-361b8bb1fbb3-tmp\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.743656 master-0 kubenswrapper[18707]: I0320 08:40:56.743496 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b543f82e-683d-47c1-af73-4dcede4cf4df-tmpfs\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:40:56.743656 master-0 kubenswrapper[18707]: I0320 08:40:56.743522 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" Mar 20 08:40:56.743656 master-0 kubenswrapper[18707]: I0320 08:40:56.743546 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:40:56.743656 master-0 kubenswrapper[18707]: I0320 08:40:56.743579 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdwvw\" (UniqueName: \"kubernetes.io/projected/2f844652-225b-4713-a9ad-cf9bcc348f47-kube-api-access-jdwvw\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:40:56.743656 master-0 kubenswrapper[18707]: I0320 08:40:56.743610 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmndg\" (UniqueName: \"kubernetes.io/projected/aa16c3bf-2350-46d1-afa0-9477b3ec8877-kube-api-access-qmndg\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" Mar 20 08:40:56.743656 master-0 kubenswrapper[18707]: I0320 08:40:56.743636 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-images\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:40:56.743847 master-0 kubenswrapper[18707]: I0320 08:40:56.743664 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-log-socket\") pod \"ovnkube-node-rxdwp\" (UID: 
\"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.743847 master-0 kubenswrapper[18707]: I0320 08:40:56.743692 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:40:56.743847 master-0 kubenswrapper[18707]: I0320 08:40:56.743716 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:56.743847 master-0 kubenswrapper[18707]: I0320 08:40:56.743745 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:40:56.743847 master-0 kubenswrapper[18707]: I0320 08:40:56.743771 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-system-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.743847 master-0 kubenswrapper[18707]: I0320 08:40:56.743797 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6wl7f\" (UniqueName: \"kubernetes.io/projected/e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97-kube-api-access-6wl7f\") pod \"network-check-source-b4bf74f6-fhvg6\" (UID: \"e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6" Mar 20 08:40:56.743847 master-0 kubenswrapper[18707]: I0320 08:40:56.743820 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-var-lib-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.744045 master-0 kubenswrapper[18707]: I0320 08:40:56.743905 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a18b9230-de78-41b8-a61e-361b8bb1fbb3-tmp\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.744045 master-0 kubenswrapper[18707]: I0320 08:40:56.743957 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.744045 master-0 kubenswrapper[18707]: I0320 08:40:56.743975 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b543f82e-683d-47c1-af73-4dcede4cf4df-tmpfs\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:40:56.744045 master-0 kubenswrapper[18707]: I0320 08:40:56.743995 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:40:56.744045 master-0 kubenswrapper[18707]: I0320 08:40:56.744022 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-image-import-ca\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:56.744238 master-0 kubenswrapper[18707]: I0320 08:40:56.744070 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:40:56.744238 master-0 kubenswrapper[18707]: I0320 08:40:56.744096 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-node-log\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.744238 master-0 kubenswrapper[18707]: I0320 08:40:56.744116 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: 
\"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:40:56.744238 master-0 kubenswrapper[18707]: I0320 08:40:56.744139 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:40:56.744238 master-0 kubenswrapper[18707]: I0320 08:40:56.744158 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-serving-cert\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:56.744238 master-0 kubenswrapper[18707]: I0320 08:40:56.744177 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:56.744238 master-0 kubenswrapper[18707]: I0320 08:40:56.744217 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91b2899e-8d24-41a0-bec8-d11c67b8f955-service-ca-bundle\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:40:56.744238 master-0 kubenswrapper[18707]: I0320 08:40:56.744235 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-binary-copy\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:40:56.744474 master-0 kubenswrapper[18707]: I0320 08:40:56.744258 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:40:56.744474 master-0 kubenswrapper[18707]: I0320 08:40:56.744283 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:40:56.744474 master-0 kubenswrapper[18707]: I0320 08:40:56.744294 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f53bc282-5937-49ac-ac98-2ee37ccb268d-cert\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:40:56.744474 master-0 kubenswrapper[18707]: I0320 08:40:56.744301 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.744474 master-0 kubenswrapper[18707]: I0320 08:40:56.744342 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:40:56.744474 master-0 kubenswrapper[18707]: I0320 08:40:56.744364 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ab175f7e-a5e8-4fda-98c9-6d052a221a83-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:40:56.744474 master-0 kubenswrapper[18707]: I0320 08:40:56.744396 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpdk5\" (UniqueName: \"kubernetes.io/projected/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-kube-api-access-lpdk5\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:40:56.744474 master-0 kubenswrapper[18707]: I0320 08:40:56.744417 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s2tb\" (UniqueName: \"kubernetes.io/projected/70020125-af49-47d7-8853-fb951c561dc4-kube-api-access-9s2tb\") pod \"node-resolver-qnp9w\" (UID: \"70020125-af49-47d7-8853-fb951c561dc4\") " pod="openshift-dns/node-resolver-qnp9w" Mar 20 08:40:56.744474 master-0 kubenswrapper[18707]: I0320 08:40:56.744435 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 20 08:40:56.744474 master-0 kubenswrapper[18707]: I0320 08:40:56.744454 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-serving-cert\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744477 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab175f7e-a5e8-4fda-98c9-6d052a221a83-serving-cert\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744509 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57854ac-809a-4745-aaa1-774f0a08a560-config\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744529 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btdjt\" (UniqueName: \"kubernetes.io/projected/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-kube-api-access-btdjt\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744556 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-encryption-config\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744578 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744599 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-config\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744618 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-config\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744635 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70020125-af49-47d7-8853-fb951c561dc4-hosts-file\") pod \"node-resolver-qnp9w\" (UID: \"70020125-af49-47d7-8853-fb951c561dc4\") " pod="openshift-dns/node-resolver-qnp9w"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744637 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-images\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744654 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-lib-modules\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744676 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744699 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1375da42-ecaf-4d86-b554-25fd1c3d00bd-serving-cert\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744729 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744749 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-certs\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744770 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744790 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c5ch\" (UniqueName: \"kubernetes.io/projected/b98b4efc-6117-487f-9cfc-38ce66dd9570-kube-api-access-6c5ch\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744809 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-config\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744846 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.744855 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-metrics-tls\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.743098 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5782718-9118-4682-a287-7998cd0304b3-mcd-auth-proxy-config\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6"
Mar 20 08:40:56.745112 master-0 kubenswrapper[18707]: I0320 08:40:56.745125 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovn-node-metrics-cert\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:40:56.745839 master-0 kubenswrapper[18707]: I0320 08:40:56.744864 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-kubelet\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.745839 master-0 kubenswrapper[18707]: I0320 08:40:56.745177 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee3cc021-67d8-4b7f-b443-16f18228712e-host-slash\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv"
Mar 20 08:40:56.745839 master-0 kubenswrapper[18707]: I0320 08:40:56.745210 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-sys\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:40:56.745839 master-0 kubenswrapper[18707]: I0320 08:40:56.745232 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:40:56.745839 master-0 kubenswrapper[18707]: I0320 08:40:56.745253 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-proxy-ca-bundles\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:40:56.745839 master-0 kubenswrapper[18707]: I0320 08:40:56.745460 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f046860d-2d54-4746-8ba2-f8e90fa55e38-serving-cert\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:40:56.745839 master-0 kubenswrapper[18707]: I0320 08:40:56.745819 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"
Mar 20 08:40:56.747812 master-0 kubenswrapper[18707]: I0320 08:40:56.746258 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 08:40:56.747812 master-0 kubenswrapper[18707]: I0320 08:40:56.746478 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 20 08:40:56.747812 master-0 kubenswrapper[18707]: I0320 08:40:56.746681 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 20 08:40:56.747812 master-0 kubenswrapper[18707]: I0320 08:40:56.746847 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 08:40:56.747812 master-0 kubenswrapper[18707]: I0320 08:40:56.746968 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/325f0a83-d56d-4b62-977b-088a7d5f0e00-config\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj"
Mar 20 08:40:56.747812 master-0 kubenswrapper[18707]: I0320 08:40:56.747167 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:40:56.747812 master-0 kubenswrapper[18707]: I0320 08:40:56.747268 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ab175f7e-a5e8-4fda-98c9-6d052a221a83-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:40:56.747812 master-0 kubenswrapper[18707]: I0320 08:40:56.747340 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:40:56.747812 master-0 kubenswrapper[18707]: I0320 08:40:56.747445 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:40:56.747812 master-0 kubenswrapper[18707]: I0320 08:40:56.747552 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b4291bfd-53d9-4c78-b7cb-d7eb46560528-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:40:56.748164 master-0 kubenswrapper[18707]: I0320 08:40:56.747746 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"
Mar 20 08:40:56.748237 master-0 kubenswrapper[18707]: I0320 08:40:56.748171 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:40:56.748347 master-0 kubenswrapper[18707]: I0320 08:40:56.748294 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acb704a9-6c8d-4378-ae93-e7095b1fce85-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:40:56.750488 master-0 kubenswrapper[18707]: I0320 08:40:56.748317 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"
Mar 20 08:40:56.750558 master-0 kubenswrapper[18707]: I0320 08:40:56.748457 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 08:40:56.750834 master-0 kubenswrapper[18707]: I0320 08:40:56.750800 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-serving-cert\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:40:56.750884 master-0 kubenswrapper[18707]: I0320 08:40:56.750162 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68252533-bd64-4fc5-838a-cc350cbe77f0-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6"
Mar 20 08:40:56.750884 master-0 kubenswrapper[18707]: I0320 08:40:56.750313 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ee3cc021-67d8-4b7f-b443-16f18228712e-iptables-alerter-script\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv"
Mar 20 08:40:56.752115 master-0 kubenswrapper[18707]: I0320 08:40:56.752083 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 08:40:56.752287 master-0 kubenswrapper[18707]: I0320 08:40:56.745272 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27j9q\" (UniqueName: \"kubernetes.io/projected/4ddac301-a604-4f07-8849-5928befd336e-kube-api-access-27j9q\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm"
Mar 20 08:40:56.752287 master-0 kubenswrapper[18707]: I0320 08:40:56.752261 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-ca\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:40:56.752287 master-0 kubenswrapper[18707]: I0320 08:40:56.752284 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-config\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:40:56.752392 master-0 kubenswrapper[18707]: I0320 08:40:56.752308 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-cabundle\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t"
Mar 20 08:40:56.752392 master-0 kubenswrapper[18707]: I0320 08:40:56.752333 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 20 08:40:56.752392 master-0 kubenswrapper[18707]: I0320 08:40:56.752355 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:40:56.752392 master-0 kubenswrapper[18707]: I0320 08:40:56.752375 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-slash\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:40:56.752520 master-0 kubenswrapper[18707]: I0320 08:40:56.752397 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-webhook-cert\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"
Mar 20 08:40:56.752520 master-0 kubenswrapper[18707]: I0320 08:40:56.752421 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2zzd\" (UniqueName: \"kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd\") pod \"network-check-target-xnrw6\" (UID: \"c0142d4e-9fd4-4375-a773-bb89b38af654\") " pod="openshift-network-diagnostics/network-check-target-xnrw6"
Mar 20 08:40:56.752520 master-0 kubenswrapper[18707]: I0320 08:40:56.752444 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-images\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:40:56.759336 master-0 kubenswrapper[18707]: I0320 08:40:56.759254 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-hostroot\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.759336 master-0 kubenswrapper[18707]: I0320 08:40:56.759315 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrws5\" (UniqueName: \"kubernetes.io/projected/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-kube-api-access-wrws5\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:40:56.759336 master-0 kubenswrapper[18707]: I0320 08:40:56.759337 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj"
Mar 20 08:40:56.759488 master-0 kubenswrapper[18707]: I0320 08:40:56.759362 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6mb5\" (UniqueName: \"kubernetes.io/projected/ad692349-5089-4afc-85b2-9b6e7997567c-kube-api-access-h6mb5\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:40:56.759488 master-0 kubenswrapper[18707]: I0320 08:40:56.759383 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zpkn\" (UniqueName: \"kubernetes.io/projected/45e8b72b-564c-4bb1-b911-baff2d6c87ad-kube-api-access-9zpkn\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"
Mar 20 08:40:56.759488 master-0 kubenswrapper[18707]: I0320 08:40:56.759404 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-multus-certs\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.759579 master-0 kubenswrapper[18707]: I0320 08:40:56.759545 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr"
Mar 20 08:40:56.759579 master-0 kubenswrapper[18707]: I0320 08:40:56.759568 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1375da42-ecaf-4d86-b554-25fd1c3d00bd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h"
Mar 20 08:40:56.759653 master-0 kubenswrapper[18707]: I0320 08:40:56.759590 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7k8m\" (UniqueName: \"kubernetes.io/projected/86cb5d23-df7f-4f67-8086-1789d8e68544-kube-api-access-j7k8m\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742"
Mar 20 08:40:56.759653 master-0 kubenswrapper[18707]: I0320 08:40:56.759611 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-k8s-cni-cncf-io\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.759653 master-0 kubenswrapper[18707]: I0320 08:40:56.759634 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-etcd-serving-ca\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2"
Mar 20 08:40:56.759741 master-0 kubenswrapper[18707]: I0320 08:40:56.759655 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77"
Mar 20 08:40:56.759741 master-0 kubenswrapper[18707]: I0320 08:40:56.759677 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2028761b8522f874dcebf13c4683d033\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:40:56.759741 master-0 kubenswrapper[18707]: I0320 08:40:56.759698 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbvtp\" (UniqueName: \"kubernetes.io/projected/f5782718-9118-4682-a287-7998cd0304b3-kube-api-access-bbvtp\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6"
Mar 20 08:40:56.759741 master-0 kubenswrapper[18707]: I0320 08:40:56.759719 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1375da42-ecaf-4d86-b554-25fd1c3d00bd-service-ca\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h"
Mar 20 08:40:56.759741 master-0 kubenswrapper[18707]: I0320 08:40:56.759742 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:56.760025 master-0 kubenswrapper[18707]: I0320 08:40:56.759768 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-trusted-ca\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:40:56.760025 master-0 kubenswrapper[18707]: I0320 08:40:56.759790 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-bound-sa-token\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m"
Mar 20 08:40:56.760025 master-0 kubenswrapper[18707]: I0320 08:40:56.759807 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:40:56.760120 master-0 kubenswrapper[18707]: I0320 08:40:56.758546 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg"
Mar 20 08:40:56.760156 master-0 kubenswrapper[18707]: I0320 08:40:56.753721 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-config\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:40:56.760203 master-0 kubenswrapper[18707]: I0320 08:40:56.749172 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 20 08:40:56.760327 master-0 kubenswrapper[18707]: I0320 08:40:56.760307 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4291bfd-53d9-4c78-b7cb-d7eb46560528-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq"
Mar 20 08:40:56.760439 master-0 kubenswrapper[18707]: I0320 08:40:56.758781 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx"
Mar 20 08:40:56.760582 master-0 kubenswrapper[18707]: I0320 08:40:56.758933 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f53bc282-5937-49ac-ac98-2ee37ccb268d-images\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: \"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd"
Mar 20 08:40:56.760622 master-0 kubenswrapper[18707]: I0320 08:40:56.757898 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-config\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz"
Mar 20 08:40:56.760690 master-0 kubenswrapper[18707]: I0320 08:40:56.756177 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f046860d-2d54-4746-8ba2-f8e90fa55e38-etcd-ca\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4"
Mar 20 08:40:56.760862 master-0 kubenswrapper[18707]: I0320 08:40:56.760843 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj"
Mar 20 08:40:56.760906 master-0 kubenswrapper[18707]: I0320 08:40:56.757542 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-proxy-tls\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr"
Mar 20 08:40:56.761067 master-0 kubenswrapper[18707]: I0320 08:40:56.758148 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/248a3d2f-3be4-46bf-959c-79d28736c0d6-ovnkube-script-lib\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:40:56.761816 master-0 kubenswrapper[18707]: I0320 08:40:56.761743 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa16c3bf-2350-46d1-afa0-9477b3ec8877-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h"
Mar 20 08:40:56.761816 master-0 kubenswrapper[18707]: I0320 08:40:56.758324 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab175f7e-a5e8-4fda-98c9-6d052a221a83-serving-cert\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:40:56.761997 master-0 kubenswrapper[18707]: I0320 08:40:56.755950 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 08:40:56.762109 master-0 kubenswrapper[18707]: I0320 08:40:56.762062 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 08:40:56.762200 master-0 kubenswrapper[18707]: I0320 08:40:56.757151 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 20 08:40:56.762420 master-0 kubenswrapper[18707]: I0320 08:40:56.762394 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 20 08:40:56.762551 master-0 kubenswrapper[18707]: I0320 08:40:56.762525 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htv9s\" (UniqueName: \"kubernetes.io/projected/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-kube-api-access-htv9s\") pod \"control-plane-machine-set-operator-6f97756bc8-7t5qv\" (UID: \"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv"
Mar 20 08:40:56.762605 master-0 kubenswrapper[18707]: I0320 08:40:56.762563 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-etcd-client\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2"
Mar 20 08:40:56.762605 master-0 kubenswrapper[18707]: I0320 08:40:56.762586 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg6th\" (UniqueName: \"kubernetes.io/projected/7b489385-2c96-4a97-8b31-362162de020e-kube-api-access-pg6th\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l"
Mar 20 08:40:56.762671 master-0 kubenswrapper[18707]: I0320 08:40:56.762608 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l4b7\" (UniqueName: \"kubernetes.io/projected/ee3cc021-67d8-4b7f-b443-16f18228712e-kube-api-access-7l4b7\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv"
Mar 20 08:40:56.762671 master-0 kubenswrapper[18707]: I0320 08:40:56.762632 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa759777-de22-4440-a3d3-ad429a3b8e7b-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4"
Mar 20 08:40:56.762671 master-0 kubenswrapper[18707]: I0320 08:40:56.762655 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-stats-auth\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz"
Mar 20 08:40:56.762766 master-0 kubenswrapper[18707]: I0320 08:40:56.762679 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/86cb5d23-df7f-4f67-8086-1789d8e68544-operand-assets\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742"
Mar 20 
08:40:56.762766 master-0 kubenswrapper[18707]: I0320 08:40:56.762701 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.762766 master-0 kubenswrapper[18707]: I0320 08:40:56.762721 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/325f0a83-d56d-4b62-977b-088a7d5f0e00-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" Mar 20 08:40:56.762766 master-0 kubenswrapper[18707]: I0320 08:40:56.762742 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c2rq\" (UniqueName: \"kubernetes.io/projected/b543f82e-683d-47c1-af73-4dcede4cf4df-kube-api-access-4c2rq\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:40:56.762766 master-0 kubenswrapper[18707]: I0320 08:40:56.762761 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1375da42-ecaf-4d86-b554-25fd1c3d00bd-kube-api-access\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:40:56.762893 master-0 kubenswrapper[18707]: I0320 08:40:56.762784 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxg4z\" (UniqueName: 
\"kubernetes.io/projected/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-kube-api-access-fxg4z\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.762893 master-0 kubenswrapper[18707]: I0320 08:40:56.762803 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75e3e2cc-aa56-41f3-8859-1c086f419d05-serving-cert\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:40:56.762893 master-0 kubenswrapper[18707]: I0320 08:40:56.762823 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f5782718-9118-4682-a287-7998cd0304b3-rootfs\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:40:56.762893 master-0 kubenswrapper[18707]: I0320 08:40:56.762865 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa16c3bf-2350-46d1-afa0-9477b3ec8877-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" Mar 20 08:40:56.763119 master-0 kubenswrapper[18707]: I0320 08:40:56.763090 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/45e8b72b-564c-4bb1-b911-baff2d6c87ad-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " 
pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:40:56.763211 master-0 kubenswrapper[18707]: I0320 08:40:56.763165 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa759777-de22-4440-a3d3-ad429a3b8e7b-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:40:56.763501 master-0 kubenswrapper[18707]: I0320 08:40:56.763421 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/325f0a83-d56d-4b62-977b-088a7d5f0e00-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" Mar 20 08:40:56.763501 master-0 kubenswrapper[18707]: I0320 08:40:56.763497 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75e3e2cc-aa56-41f3-8859-1c086f419d05-serving-cert\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:40:56.763634 master-0 kubenswrapper[18707]: I0320 08:40:56.763587 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/86cb5d23-df7f-4f67-8086-1789d8e68544-operand-assets\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: \"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:40:56.766950 master-0 kubenswrapper[18707]: I0320 08:40:56.766905 18707 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" 
Mar 20 08:40:56.769032 master-0 kubenswrapper[18707]: E0320 08:40:56.768971 18707 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.787170 master-0 kubenswrapper[18707]: I0320 08:40:56.786891 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 08:40:56.797573 master-0 kubenswrapper[18707]: I0320 08:40:56.797512 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 08:40:56.807647 master-0 kubenswrapper[18707]: I0320 08:40:56.807594 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-daemon-config\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.815159 master-0 kubenswrapper[18707]: I0320 08:40:56.815110 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 08:40:56.824205 master-0 kubenswrapper[18707]: I0320 08:40:56.822662 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a57854ac-809a-4745-aaa1-774f0a08a560-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:40:56.838452 master-0 kubenswrapper[18707]: I0320 08:40:56.837632 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 08:40:56.846612 master-0 kubenswrapper[18707]: I0320 08:40:56.846551 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a57854ac-809a-4745-aaa1-774f0a08a560-config\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:40:56.857529 master-0 kubenswrapper[18707]: I0320 08:40:56.857483 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 08:40:56.864356 master-0 kubenswrapper[18707]: I0320 08:40:56.864258 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-root\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:56.864356 master-0 kubenswrapper[18707]: I0320 08:40:56.864302 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-var-lock\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:40:56.864356 master-0 kubenswrapper[18707]: I0320 08:40:56.864342 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:40:56.864491 master-0 kubenswrapper[18707]: I0320 08:40:56.864362 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-systemd\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.864491 master-0 kubenswrapper[18707]: I0320 08:40:56.864383 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a25248c0-8de7-4624-b785-f053665fcb23-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:40:56.864491 master-0 kubenswrapper[18707]: I0320 08:40:56.864403 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b639e578-628e-404d-b759-8b6e84e771d9-catalog-content\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc" Mar 20 08:40:56.864491 master-0 kubenswrapper[18707]: I0320 08:40:56.864427 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/d88ba8e1-ee42-423f-9839-e71cb0265c6c-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:40:56.864491 master-0 kubenswrapper[18707]: I0320 08:40:56.864447 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-os-release\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " 
pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:40:56.864491 master-0 kubenswrapper[18707]: I0320 08:40:56.864483 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.864651 master-0 kubenswrapper[18707]: I0320 08:40:56.864504 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysctl-d\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.864651 master-0 kubenswrapper[18707]: I0320 08:40:56.864524 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:40:56.864651 master-0 kubenswrapper[18707]: I0320 08:40:56.864581 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-server-tls\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:40:56.864651 master-0 kubenswrapper[18707]: I0320 08:40:56.864604 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-socket-dir-parent\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.864651 master-0 kubenswrapper[18707]: I0320 08:40:56.864637 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-log-socket\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.864799 master-0 kubenswrapper[18707]: I0320 08:40:56.864660 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnm6c\" (UniqueName: \"kubernetes.io/projected/f91d1788-027d-432b-be33-ca952a95046a-kube-api-access-lnm6c\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:40:56.864799 master-0 kubenswrapper[18707]: I0320 08:40:56.864688 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-system-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.864799 master-0 kubenswrapper[18707]: I0320 08:40:56.864712 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-var-lib-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.864799 master-0 kubenswrapper[18707]: I0320 08:40:56.864730 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" 
(UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.864799 master-0 kubenswrapper[18707]: I0320 08:40:56.864760 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:56.864799 master-0 kubenswrapper[18707]: I0320 08:40:56.864779 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:40:56.864799 master-0 kubenswrapper[18707]: I0320 08:40:56.864795 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-tls\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:56.865009 master-0 kubenswrapper[18707]: I0320 08:40:56.864816 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a638c468-010c-4da3-ad62-26f5f2bbdbb9-serving-cert\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:40:56.865009 master-0 kubenswrapper[18707]: I0320 08:40:56.864844 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-node-log\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.865009 master-0 kubenswrapper[18707]: I0320 08:40:56.864890 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:40:56.865009 master-0 kubenswrapper[18707]: I0320 08:40:56.864908 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:56.865009 master-0 kubenswrapper[18707]: I0320 08:40:56.864927 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.865009 master-0 kubenswrapper[18707]: I0320 08:40:56.864964 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " 
pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.865009 master-0 kubenswrapper[18707]: I0320 08:40:56.864982 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4np\" (UniqueName: \"kubernetes.io/projected/d88ba8e1-ee42-423f-9839-e71cb0265c6c-kube-api-access-lw4np\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:40:56.865009 master-0 kubenswrapper[18707]: I0320 08:40:56.865007 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqmv5\" (UniqueName: \"kubernetes.io/projected/f1468ec0-2aa4-461c-a62f-e9f067be490f-kube-api-access-bqmv5\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:56.865264 master-0 kubenswrapper[18707]: I0320 08:40:56.865033 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70020125-af49-47d7-8853-fb951c561dc4-hosts-file\") pod \"node-resolver-qnp9w\" (UID: \"70020125-af49-47d7-8853-fb951c561dc4\") " pod="openshift-dns/node-resolver-qnp9w" Mar 20 08:40:56.865264 master-0 kubenswrapper[18707]: I0320 08:40:56.865053 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-lib-modules\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.865264 master-0 kubenswrapper[18707]: I0320 08:40:56.865072 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/654b5b1c-2764-415c-bb13-aa06899f4076-catalog-content\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:40:56.865264 master-0 kubenswrapper[18707]: I0320 08:40:56.865093 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zp8f\" (UniqueName: \"kubernetes.io/projected/b639e578-628e-404d-b759-8b6e84e771d9-kube-api-access-9zp8f\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc" Mar 20 08:40:56.865264 master-0 kubenswrapper[18707]: I0320 08:40:56.865154 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:40:56.865264 master-0 kubenswrapper[18707]: I0320 08:40:56.865236 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:40:56.865459 master-0 kubenswrapper[18707]: I0320 08:40:56.865271 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " 
pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:56.865459 master-0 kubenswrapper[18707]: I0320 08:40:56.865293 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/469183dd-dc54-467d-82a1-611132ae8ec4-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:40:56.865459 master-0 kubenswrapper[18707]: I0320 08:40:56.865312 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-kubelet\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.865459 master-0 kubenswrapper[18707]: I0320 08:40:56.865333 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.865459 master-0 kubenswrapper[18707]: I0320 08:40:56.865353 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-sys\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.865459 master-0 kubenswrapper[18707]: I0320 08:40:56.865379 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee3cc021-67d8-4b7f-b443-16f18228712e-host-slash\") pod 
\"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:40:56.865459 master-0 kubenswrapper[18707]: I0320 08:40:56.865401 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f2217de0-7805-4f5f-8ea5-93b81b7e0236-snapshots\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:40:56.865459 master-0 kubenswrapper[18707]: I0320 08:40:56.865433 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a17669-a122-44aa-bdda-581bf1fc4649-utilities\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955" Mar 20 08:40:56.865459 master-0 kubenswrapper[18707]: I0320 08:40:56.865450 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 20 08:40:56.865792 master-0 kubenswrapper[18707]: I0320 08:40:56.865470 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-slash\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.865792 master-0 kubenswrapper[18707]: I0320 08:40:56.865501 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1db4d695-5a6a-4fbe-b610-3777bfebed79-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:40:56.865792 master-0 kubenswrapper[18707]: I0320 08:40:56.865519 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whmmk\" (UniqueName: \"kubernetes.io/projected/1db4d695-5a6a-4fbe-b610-3777bfebed79-kube-api-access-whmmk\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:40:56.865792 master-0 kubenswrapper[18707]: I0320 08:40:56.865537 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82x7p\" (UniqueName: \"kubernetes.io/projected/f2217de0-7805-4f5f-8ea5-93b81b7e0236-kube-api-access-82x7p\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:40:56.865792 master-0 kubenswrapper[18707]: I0320 08:40:56.865568 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-hostroot\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.865792 master-0 kubenswrapper[18707]: I0320 08:40:56.865617 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-multus-certs\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.865792 master-0 kubenswrapper[18707]: I0320 08:40:56.865635 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:40:56.865792 master-0 kubenswrapper[18707]: I0320 08:40:56.865667 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-client-ca\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:40:56.865792 master-0 kubenswrapper[18707]: I0320 08:40:56.865683 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-k8s-cni-cncf-io\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.865792 master-0 kubenswrapper[18707]: I0320 08:40:56.865703 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1375da42-ecaf-4d86-b554-25fd1c3d00bd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:40:56.865792 master-0 kubenswrapper[18707]: I0320 08:40:56.865730 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2028761b8522f874dcebf13c4683d033\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:40:56.866086 master-0 kubenswrapper[18707]: I0320 08:40:56.865928 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.866086 master-0 kubenswrapper[18707]: I0320 08:40:56.865912 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-config\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:40:56.866086 master-0 kubenswrapper[18707]: I0320 08:40:56.865960 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ae39c09b-7aef-4615-8ced-0dcad39f23a5-machine-approver-tls\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:40:56.866086 master-0 kubenswrapper[18707]: I0320 08:40:56.865992 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:40:56.866086 master-0 kubenswrapper[18707]: I0320 08:40:56.866024 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod 
\"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:56.866086 master-0 kubenswrapper[18707]: I0320 08:40:56.866045 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:40:56.866263 master-0 kubenswrapper[18707]: I0320 08:40:56.866092 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:40:56.866263 master-0 kubenswrapper[18707]: I0320 08:40:56.866093 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:40:56.866263 master-0 kubenswrapper[18707]: I0320 08:40:56.866116 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsht7\" (UniqueName: \"kubernetes.io/projected/21bebade-17fa-444e-92a9-eea53d6cd673-kube-api-access-zsht7\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" Mar 20 08:40:56.866263 master-0 kubenswrapper[18707]: I0320 08:40:56.866165 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcp8t\" (UniqueName: \"kubernetes.io/projected/654b5b1c-2764-415c-bb13-aa06899f4076-kube-api-access-xcp8t\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:40:56.866263 master-0 kubenswrapper[18707]: I0320 08:40:56.866240 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.866263 master-0 kubenswrapper[18707]: I0320 08:40:56.866254 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70020125-af49-47d7-8853-fb951c561dc4-hosts-file\") pod \"node-resolver-qnp9w\" (UID: \"70020125-af49-47d7-8853-fb951c561dc4\") " pod="openshift-dns/node-resolver-qnp9w" Mar 20 08:40:56.866430 master-0 kubenswrapper[18707]: I0320 08:40:56.866276 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a17669-a122-44aa-bdda-581bf1fc4649-catalog-content\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955" Mar 20 08:40:56.866430 master-0 kubenswrapper[18707]: I0320 08:40:56.866301 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b639e578-628e-404d-b759-8b6e84e771d9-utilities\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc" Mar 20 08:40:56.866430 master-0 kubenswrapper[18707]: I0320 
08:40:56.866341 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c593e31d-82b5-4d42-992e-6b050ccf3019-catalog-content\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:40:56.866430 master-0 kubenswrapper[18707]: I0320 08:40:56.866365 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f5782718-9118-4682-a287-7998cd0304b3-rootfs\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:40:56.866430 master-0 kubenswrapper[18707]: I0320 08:40:56.866408 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-lib-modules\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.866430 master-0 kubenswrapper[18707]: I0320 08:40:56.866413 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-textfile\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:56.866592 master-0 kubenswrapper[18707]: I0320 08:40:56.866466 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-ovn\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.866592 master-0 kubenswrapper[18707]: I0320 
08:40:56.866487 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.866592 master-0 kubenswrapper[18707]: I0320 08:40:56.866503 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-etc-kubernetes\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.866592 master-0 kubenswrapper[18707]: I0320 08:40:56.866503 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-textfile\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:56.866592 master-0 kubenswrapper[18707]: I0320 08:40:56.866556 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-systemd\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.866724 master-0 kubenswrapper[18707]: I0320 08:40:56.866633 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/654b5b1c-2764-415c-bb13-aa06899f4076-catalog-content\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:40:56.866724 master-0 kubenswrapper[18707]: I0320 08:40:56.866716 18707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-kubelet\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.866781 master-0 kubenswrapper[18707]: I0320 08:40:56.866737 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a25248c0-8de7-4624-b785-f053665fcb23-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.866796 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b639e578-628e-404d-b759-8b6e84e771d9-catalog-content\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.866863 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-os-release\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.866916 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.867520 master-0 
kubenswrapper[18707]: I0320 08:40:56.867035 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a17669-a122-44aa-bdda-581bf1fc4649-catalog-content\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867064 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-log-socket\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867129 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-multus-certs\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867244 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c593e31d-82b5-4d42-992e-6b050ccf3019-catalog-content\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867248 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysctl-d\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867286 
18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-bin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867324 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-bin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867312 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867330 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-etc-kubernetes\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867363 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a17669-a122-44aa-bdda-581bf1fc4649-utilities\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867360 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysctl-conf\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867409 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867414 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1375da42-ecaf-4d86-b554-25fd1c3d00bd-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867432 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-k8s-cni-cncf-io\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867433 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/654b5b1c-2764-415c-bb13-aa06899f4076-utilities\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867472 18707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867476 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2028761b8522f874dcebf13c4683d033\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:40:56.867520 master-0 kubenswrapper[18707]: I0320 08:40:56.867477 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/7e451189-850e-4d19-a40c-40f642d08511-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.867572 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-sys\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.867600 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.867629 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-slash\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.867673 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-var-lib-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.867755 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-socket-dir-parent\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.867789 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.867867 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/7e451189-850e-4d19-a40c-40f642d08511-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.867898 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.867918 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-bin\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.867944 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.867959 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.867969 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.868033 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/654b5b1c-2764-415c-bb13-aa06899f4076-utilities\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.868002 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.868069 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.868090 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-node-log\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.868122 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.868155 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.868167 master-0 kubenswrapper[18707]: I0320 08:40:56.867989 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868235 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-ovn\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868239 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysctl-conf\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868259 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868301 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/b639e578-628e-404d-b759-8b6e84e771d9-utilities\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868317 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf485\" (UniqueName: \"kubernetes.io/projected/c0a17669-a122-44aa-bdda-581bf1fc4649-kube-api-access-xf485\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868352 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-system-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868377 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-client-certs\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868403 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f2217de0-7805-4f5f-8ea5-93b81b7e0236-snapshots\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868417 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21bebade-17fa-444e-92a9-eea53d6cd673-cert\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868448 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee3cc021-67d8-4b7f-b443-16f18228712e-host-slash\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868487 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c593e31d-82b5-4d42-992e-6b050ccf3019-utilities\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868511 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-hostroot\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868528 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868577 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-cni-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868580 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-kubelet\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868594 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-bin\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868592 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868320 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f5782718-9118-4682-a287-7998cd0304b3-rootfs\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868614 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-var-lib-kubelet\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868632 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868642 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-kubelet\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868648 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2217de0-7805-4f5f-8ea5-93b81b7e0236-serving-cert\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5"
Mar 20 08:40:56.868706 master-0 kubenswrapper[18707]: I0320 08:40:56.868674 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c593e31d-82b5-4d42-992e-6b050ccf3019-utilities\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn"
Mar 20 08:40:56.869419 master-0 kubenswrapper[18707]: I0320 08:40:56.868742 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d88ba8e1-ee42-423f-9839-e71cb0265c6c-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr"
Mar 20 08:40:56.869419 master-0 kubenswrapper[18707]: I0320 08:40:56.868783 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxmkh\" (UniqueName: \"kubernetes.io/projected/c593e31d-82b5-4d42-992e-6b050ccf3019-kube-api-access-gxmkh\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn"
Mar 20 08:40:56.869419 master-0 kubenswrapper[18707]: I0320 08:40:56.868835 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:40:56.869419 master-0 kubenswrapper[18707]: I0320 08:40:56.868876 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-var-lib-kubelet\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:40:56.869419 master-0 kubenswrapper[18707]: I0320 08:40:56.868885 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857"
Mar 20 08:40:56.869419 master-0 kubenswrapper[18707]: I0320 08:40:56.868922 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-run\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:40:56.869709 master-0 kubenswrapper[18707]: I0320 08:40:56.869619 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-conf-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.869709 master-0 kubenswrapper[18707]: I0320 08:40:56.869652 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-run\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:40:56.869782 master-0 kubenswrapper[18707]: I0320 08:40:56.869709 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-multus-conf-dir\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.869782 master-0 kubenswrapper[18707]: I0320 08:40:56.869761 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1468ec0-2aa4-461c-a62f-e9f067be490f-metrics-client-ca\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5"
Mar 20 08:40:56.869841 master-0 kubenswrapper[18707]: I0320 08:40:56.869803 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-metrics-server-audit-profiles\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:40:56.869884 master-0 kubenswrapper[18707]: I0320 08:40:56.869863 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6p8v\" (UniqueName: \"kubernetes.io/projected/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-kube-api-access-c6p8v\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:40:56.869920 master-0 kubenswrapper[18707]: I0320 08:40:56.869912 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46de2acc-9f5d-4ecf-befe-a480f86466f5-audit-dir\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:40:56.869999 master-0 kubenswrapper[18707]: I0320 08:40:56.869961 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysconfig\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:40:56.870033 master-0 kubenswrapper[18707]: I0320 08:40:56.870000 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46de2acc-9f5d-4ecf-befe-a480f86466f5-audit-dir\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:40:56.870067 master-0 kubenswrapper[18707]: I0320 08:40:56.870057 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-sysconfig\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:40:56.870107 master-0 kubenswrapper[18707]: I0320 08:40:56.870091 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f91d1788-027d-432b-be33-ca952a95046a-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857"
Mar 20 08:40:56.870146 master-0 kubenswrapper[18707]: I0320 08:40:56.870127 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxn2f\" (UniqueName: \"kubernetes.io/projected/ae39c09b-7aef-4615-8ced-0dcad39f23a5-kube-api-access-rxn2f\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j"
Mar 20 08:40:56.870179 master-0 kubenswrapper[18707]: I0320 08:40:56.870149 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dtbl\" (UniqueName: \"kubernetes.io/projected/469183dd-dc54-467d-82a1-611132ae8ec4-kube-api-access-8dtbl\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh"
Mar 20 08:40:56.870179 master-0 kubenswrapper[18707]: I0320 08:40:56.870171 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr"
Mar 20 08:40:56.870344 master-0 kubenswrapper[18707]: I0320 08:40:56.870312 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 20 08:40:56.870386 master-0 kubenswrapper[18707]: I0320 08:40:56.870369 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-audit-log\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:40:56.870433 master-0 kubenswrapper[18707]: I0320 08:40:56.870414 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-system-cni-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:40:56.870433 master-0 kubenswrapper[18707]: I0320 08:40:56.870423 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-audit-log\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:40:56.870487 master-0 kubenswrapper[18707]: I0320 08:40:56.870468 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-system-cni-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:40:56.870515 master-0 kubenswrapper[18707]: I0320 08:40:56.870459 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-mwfgx\" (UID: \"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx"
Mar 20 08:40:56.870549 master-0 kubenswrapper[18707]: I0320 08:40:56.870528 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"
Mar 20 08:40:56.870578 master-0 kubenswrapper[18707]: I0320 08:40:56.870559 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:40:56.870653 master-0 kubenswrapper[18707]: I0320 08:40:56.870633 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2028761b8522f874dcebf13c4683d033\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:40:56.870684 master-0 kubenswrapper[18707]: I0320 08:40:56.870655 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:40:56.870718 master-0 kubenswrapper[18707]: I0320 08:40:56.870678 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-cnibin\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:40:56.870718 master-0 kubenswrapper[18707]: I0320 08:40:56.870691 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2028761b8522f874dcebf13c4683d033\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:40:56.870718 master-0 kubenswrapper[18707]: I0320 08:40:56.870707 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:40:56.870802 master-0 kubenswrapper[18707]: I0320 08:40:56.870722 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-cnibin\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:40:56.870802 master-0 kubenswrapper[18707]: I0320 08:40:56.870745 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46de2acc-9f5d-4ecf-befe-a480f86466f5-node-pullsecrets\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:40:56.870802 master-0 kubenswrapper[18707]: I0320 08:40:56.870771 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/469183dd-dc54-467d-82a1-611132ae8ec4-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh"
Mar 20 08:40:56.870802 master-0 kubenswrapper[18707]: I0320 08:40:56.870790 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-kubernetes\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:40:56.870802 master-0 kubenswrapper[18707]: I0320 08:40:56.870799 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b98b4efc-6117-487f-9cfc-38ce66dd9570-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn"
Mar 20 08:40:56.870933 master-0 kubenswrapper[18707]: I0320 08:40:56.870810 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"
Mar 20 08:40:56.870933 master-0 kubenswrapper[18707]: I0320 08:40:56.870877 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-kubernetes\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:40:56.870933 master-0 kubenswrapper[18707]: I0320 08:40:56.870912 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-images\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn"
Mar 20 08:40:56.871016 master-0 kubenswrapper[18707]: I0320 08:40:56.870949 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:56.871016 master-0 kubenswrapper[18707]: I0320 08:40:56.870962 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46de2acc-9f5d-4ecf-befe-a480f86466f5-node-pullsecrets\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:40:56.871016 master-0 kubenswrapper[18707]: I0320 08:40:56.870981 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3de37144-a9ab-45fb-a23f-2287a5198edf-audit-dir\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2"
Mar 20 08:40:56.871096 master-0 kubenswrapper[18707]: I0320 08:40:56.871025 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8dd3d3608fe9c86b0f65904ec2353df4-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8dd3d3608fe9c86b0f65904ec2353df4\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 20 08:40:56.871096 master-0 kubenswrapper[18707]: I0320 08:40:56.871031 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3de37144-a9ab-45fb-a23f-2287a5198edf-audit-dir\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2"
Mar 20 08:40:56.871096 master-0 kubenswrapper[18707]: I0320 08:40:56.871051 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-systemd-units\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:40:56.871096 master-0 kubenswrapper[18707]: I0320 08:40:56.871067 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-systemd-units\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:40:56.871096 master-0 kubenswrapper[18707]: I0320 08:40:56.871094 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 20 08:40:56.871254 master-0 kubenswrapper[18707]: I0320 08:40:56.871167 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0"
Mar 20 08:40:56.871254 master-0 kubenswrapper[18707]: I0320 08:40:56.871173 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1375da42-ecaf-4d86-b554-25fd1c3d00bd-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h"
Mar 20 08:40:56.871254 master-0 kubenswrapper[18707]: I0320 08:40:56.871229 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-wtmp\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5"
Mar 20 08:40:56.871344 master-0 kubenswrapper[18707]: I0320 08:40:56.871257 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/7e451189-850e-4d19-a40c-40f642d08511-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf"
Mar 20 08:40:56.871344 master-0 kubenswrapper[18707]: I0320 08:40:56.871297 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1375da42-ecaf-4d86-b554-25fd1c3d00bd-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h"
Mar 20 08:40:56.871402 master-0 kubenswrapper[18707]: I0320 08:40:56.871342 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-netns\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:40:56.871402 master-0 kubenswrapper[18707]: I0320 08:40:56.871367 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:40:56.871402 master-0 kubenswrapper[18707]: I0320 08:40:56.871391 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-client-ca-bundle\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:40:56.871545 master-0 kubenswrapper[18707]: I0320 08:40:56.871429 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cnibin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.871545 master-0 kubenswrapper[18707]: I0320 08:40:56.871444 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/7e451189-850e-4d19-a40c-40f642d08511-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf"
Mar 20 08:40:56.871545 master-0 kubenswrapper[18707]: I0320 08:40:56.871499 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-modprobe-d\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:40:56.871545 master-0 kubenswrapper[18707]: I0320 08:40:56.871514 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:40:56.871545 master-0 kubenswrapper[18707]: I0320 08:40:56.871532 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cnibin\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.871545 master-0 kubenswrapper[18707]: I0320 08:40:56.871538 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5"
Mar 20 08:40:56.871704 master-0 kubenswrapper[18707]: I0320 08:40:56.871565 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-run-netns\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:40:56.871704 master-0 kubenswrapper[18707]: I0320 08:40:56.871568 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 20 08:40:56.871704 master-0 kubenswrapper[18707]: I0320 08:40:56.871614 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21bebade-17fa-444e-92a9-eea53d6cd673-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7"
Mar 20 08:40:56.871704 master-0 kubenswrapper[18707]: I0320 08:40:56.871618 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-etc-modprobe-d\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b"
Mar 20 08:40:56.871704 master-0 kubenswrapper[18707]: I0320 08:40:56.871630 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 20 08:40:56.871704 master-0 kubenswrapper[18707]: I0320 08:40:56.871662 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-config\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn"
Mar 20 08:40:56.871704 master-0 kubenswrapper[18707]: I0320 08:40:56.871683 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkccn\" (UniqueName: \"kubernetes.io/projected/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-kube-api-access-gkccn\") pod \"cluster-samples-operator-85f7577d78-mwfgx\" (UID: \"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx"
Mar 20 08:40:56.871886 master-0 kubenswrapper[18707]: I0320 08:40:56.871732 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-sys\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5"
Mar 20 08:40:56.871886 master-0 kubenswrapper[18707]: I0320 08:40:56.871756 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-netns\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.871886 master-0 kubenswrapper[18707]: I0320 08:40:56.871817 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plc2q\" (UniqueName: \"kubernetes.io/projected/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-kube-api-access-plc2q\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn"
Mar 20 08:40:56.871886 master-0 kubenswrapper[18707]: I0320 08:40:56.871838 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-config\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j"
Mar 20 08:40:56.871886 master-0 kubenswrapper[18707]: I0320 08:40:56.871880 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:40:56.872021 master-0 kubenswrapper[18707]: I0320 08:40:56.871911 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-auth-proxy-config\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j"
Mar 20 08:40:56.872021 master-0 kubenswrapper[18707]: I0320 08:40:56.871935 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ad692349-5089-4afc-85b2-9b6e7997567c-host-etc-kube\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454"
Mar 20 08:40:56.872021 master-0 kubenswrapper[18707]: I0320 08:40:56.871817 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-run-netns\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.872021 master-0 kubenswrapper[18707]: I0320 08:40:56.871966 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-os-release\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.872021 master-0 kubenswrapper[18707]: I0320 08:40:56.871993 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-etc-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:40:56.872150 master-0 kubenswrapper[18707]: I0320 08:40:56.872043 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-os-release\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b"
Mar 20 08:40:56.872150 master-0 kubenswrapper[18707]: I0320 08:40:56.872074 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName:
\"kubernetes.io/host-path/ad692349-5089-4afc-85b2-9b6e7997567c-host-etc-kube\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" Mar 20 08:40:56.872150 master-0 kubenswrapper[18707]: I0320 08:40:56.872094 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-etc-openvswitch\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.872150 master-0 kubenswrapper[18707]: I0320 08:40:56.872116 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdbds\" (UniqueName: \"kubernetes.io/projected/a638c468-010c-4da3-ad62-26f5f2bbdbb9-kube-api-access-sdbds\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:40:56.872150 master-0 kubenswrapper[18707]: I0320 08:40:56.872140 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:40:56.872389 master-0 kubenswrapper[18707]: I0320 08:40:56.872158 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-multus\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.872389 master-0 kubenswrapper[18707]: I0320 
08:40:56.872207 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prcgg\" (UniqueName: \"kubernetes.io/projected/a25248c0-8de7-4624-b785-f053665fcb23-kube-api-access-prcgg\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:40:56.872389 master-0 kubenswrapper[18707]: I0320 08:40:56.872260 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-host-var-lib-cni-multus\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:56.872389 master-0 kubenswrapper[18707]: I0320 08:40:56.872301 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-host\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.872389 master-0 kubenswrapper[18707]: I0320 08:40:56.872320 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-systemd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.872389 master-0 kubenswrapper[18707]: I0320 08:40:56.872339 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:56.872389 master-0 kubenswrapper[18707]: I0320 
08:40:56.872369 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-run-systemd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.872389 master-0 kubenswrapper[18707]: I0320 08:40:56.872391 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:40:56.872671 master-0 kubenswrapper[18707]: I0320 08:40:56.872421 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a18b9230-de78-41b8-a61e-361b8bb1fbb3-host\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:56.872671 master-0 kubenswrapper[18707]: I0320 08:40:56.872443 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-netd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.872671 master-0 kubenswrapper[18707]: I0320 08:40:56.872528 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/248a3d2f-3be4-46bf-959c-79d28736c0d6-host-cni-netd\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:56.896262 master-0 kubenswrapper[18707]: E0320 08:40:56.894258 18707 kubelet.go:1929] "Failed creating a mirror pod for" 
err="pods \"openshift-kube-scheduler-master-0\" already exists" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:40:56.896262 master-0 kubenswrapper[18707]: I0320 08:40:56.895673 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 08:40:56.914586 master-0 kubenswrapper[18707]: I0320 08:40:56.914519 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 08:40:56.924259 master-0 kubenswrapper[18707]: I0320 08:40:56.924174 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-etcd-client\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:40:56.935281 master-0 kubenswrapper[18707]: I0320 08:40:56.935227 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 08:40:56.942685 master-0 kubenswrapper[18707]: I0320 08:40:56.942633 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-serving-cert\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:40:56.955179 master-0 kubenswrapper[18707]: I0320 08:40:56.955133 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 08:40:56.960198 master-0 kubenswrapper[18707]: I0320 08:40:56.960136 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3de37144-a9ab-45fb-a23f-2287a5198edf-encryption-config\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: 
\"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:40:56.974870 master-0 kubenswrapper[18707]: I0320 08:40:56.974820 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d88ba8e1-ee42-423f-9839-e71cb0265c6c-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:40:56.975069 master-0 kubenswrapper[18707]: I0320 08:40:56.975020 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d88ba8e1-ee42-423f-9839-e71cb0265c6c-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:40:56.975143 master-0 kubenswrapper[18707]: I0320 08:40:56.975091 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 08:40:56.975276 master-0 kubenswrapper[18707]: I0320 08:40:56.975245 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:40:56.975521 master-0 kubenswrapper[18707]: I0320 08:40:56.975461 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-kubelet-dir\") pod \"installer-3-master-0\" (UID: 
\"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:40:56.975717 master-0 kubenswrapper[18707]: I0320 08:40:56.975684 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-wtmp\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:56.975806 master-0 kubenswrapper[18707]: I0320 08:40:56.975779 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-wtmp\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:56.976001 master-0 kubenswrapper[18707]: I0320 08:40:56.975973 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-sys\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:56.976050 master-0 kubenswrapper[18707]: I0320 08:40:56.976022 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-sys\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:56.976137 master-0 kubenswrapper[18707]: I0320 08:40:56.976115 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-var-lock\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " 
pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:40:56.976180 master-0 kubenswrapper[18707]: I0320 08:40:56.976162 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-root\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:56.976251 master-0 kubenswrapper[18707]: I0320 08:40:56.976225 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f1468ec0-2aa4-461c-a62f-e9f067be490f-root\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:56.976289 master-0 kubenswrapper[18707]: I0320 08:40:56.976162 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-var-lock\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:40:56.976331 master-0 kubenswrapper[18707]: I0320 08:40:56.976311 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:40:56.984143 master-0 kubenswrapper[18707]: I0320 08:40:56.984104 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bbc0b783-28d5-4554-b49d-c66082546f44-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:40:56.995871 master-0 kubenswrapper[18707]: I0320 08:40:56.995819 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 08:40:57.016492 master-0 kubenswrapper[18707]: I0320 08:40:57.016433 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 08:40:57.020482 master-0 kubenswrapper[18707]: I0320 08:40:57.020426 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-audit-policies\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:40:57.035000 master-0 kubenswrapper[18707]: I0320 08:40:57.034952 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 08:40:57.038836 master-0 kubenswrapper[18707]: I0320 08:40:57.038794 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-binary-copy\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:40:57.041201 master-0 kubenswrapper[18707]: I0320 08:40:57.041159 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-cni-binary-copy\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:40:57.054938 master-0 kubenswrapper[18707]: I0320 08:40:57.054874 18707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 08:40:57.061489 master-0 kubenswrapper[18707]: I0320 08:40:57.061405 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-etcd-serving-ca\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:40:57.074468 master-0 kubenswrapper[18707]: I0320 08:40:57.074415 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 08:40:57.080722 master-0 kubenswrapper[18707]: I0320 08:40:57.080676 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3de37144-a9ab-45fb-a23f-2287a5198edf-trusted-ca-bundle\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:40:57.095053 master-0 kubenswrapper[18707]: I0320 08:40:57.095004 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 08:40:57.106276 master-0 kubenswrapper[18707]: I0320 08:40:57.104902 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-etcd-client\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:57.108468 master-0 kubenswrapper[18707]: I0320 08:40:57.108225 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 20 08:40:57.115080 master-0 kubenswrapper[18707]: I0320 08:40:57.114947 18707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"serving-cert" Mar 20 08:40:57.119011 master-0 kubenswrapper[18707]: I0320 08:40:57.118898 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-serving-cert\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:57.138344 master-0 kubenswrapper[18707]: I0320 08:40:57.138083 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 08:40:57.146329 master-0 kubenswrapper[18707]: I0320 08:40:57.146272 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/46de2acc-9f5d-4ecf-befe-a480f86466f5-encryption-config\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:57.156385 master-0 kubenswrapper[18707]: I0320 08:40:57.155802 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 08:40:57.156385 master-0 kubenswrapper[18707]: I0320 08:40:57.156318 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-config\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:57.177147 master-0 kubenswrapper[18707]: I0320 08:40:57.176945 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 08:40:57.181268 master-0 kubenswrapper[18707]: I0320 08:40:57.180334 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-audit\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:57.196205 master-0 kubenswrapper[18707]: I0320 08:40:57.196125 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 08:40:57.200804 master-0 kubenswrapper[18707]: I0320 08:40:57.200732 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-etcd-serving-ca\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:57.214362 master-0 kubenswrapper[18707]: I0320 08:40:57.214302 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 08:40:57.218678 master-0 kubenswrapper[18707]: I0320 08:40:57.218652 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b489385-2c96-4a97-8b31-362162de020e-srv-cert\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:40:57.235049 master-0 kubenswrapper[18707]: I0320 08:40:57.234976 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 08:40:57.237115 master-0 kubenswrapper[18707]: I0320 08:40:57.237067 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-image-import-ca\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " 
pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:57.261229 master-0 kubenswrapper[18707]: I0320 08:40:57.261159 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 08:40:57.271034 master-0 kubenswrapper[18707]: I0320 08:40:57.270966 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46de2acc-9f5d-4ecf-befe-a480f86466f5-trusted-ca-bundle\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:57.280097 master-0 kubenswrapper[18707]: I0320 08:40:57.280038 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 08:40:57.294579 master-0 kubenswrapper[18707]: I0320 08:40:57.294523 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 08:40:57.315372 master-0 kubenswrapper[18707]: I0320 08:40:57.315312 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 08:40:57.334567 master-0 kubenswrapper[18707]: I0320 08:40:57.334508 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 08:40:57.354944 master-0 kubenswrapper[18707]: I0320 08:40:57.354903 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config" Mar 20 08:40:57.358446 master-0 kubenswrapper[18707]: I0320 08:40:57.358397 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " 
pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:40:57.375006 master-0 kubenswrapper[18707]: I0320 08:40:57.374873 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 08:40:57.378998 master-0 kubenswrapper[18707]: I0320 08:40:57.378758 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-metrics-tls\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:40:57.400564 master-0 kubenswrapper[18707]: I0320 08:40:57.400500 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 08:40:57.404728 master-0 kubenswrapper[18707]: I0320 08:40:57.404673 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-trusted-ca\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:40:57.414750 master-0 kubenswrapper[18707]: I0320 08:40:57.414706 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 08:40:57.435756 master-0 kubenswrapper[18707]: I0320 08:40:57.435710 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 08:40:57.436928 master-0 kubenswrapper[18707]: I0320 08:40:57.436903 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5e3ddf9e-eeb5-4266-b675-092fd4e27623-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" 
(UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:40:57.454889 master-0 kubenswrapper[18707]: I0320 08:40:57.454834 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 08:40:57.463130 master-0 kubenswrapper[18707]: I0320 08:40:57.463069 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b98b4efc-6117-487f-9cfc-38ce66dd9570-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:40:57.505416 master-0 kubenswrapper[18707]: I0320 08:40:57.502850 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 20 08:40:57.505416 master-0 kubenswrapper[18707]: I0320 08:40:57.503325 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 08:40:57.515898 master-0 kubenswrapper[18707]: I0320 08:40:57.515861 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 08:40:57.518096 master-0 kubenswrapper[18707]: I0320 08:40:57.518033 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91b2899e-8d24-41a0-bec8-d11c67b8f955-service-ca-bundle\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:40:57.542255 master-0 kubenswrapper[18707]: I0320 08:40:57.541429 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 20 08:40:57.555532 master-0 
kubenswrapper[18707]: I0320 08:40:57.555457 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 20 08:40:57.563561 master-0 kubenswrapper[18707]: I0320 08:40:57.563497 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/7e451189-850e-4d19-a40c-40f642d08511-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:40:57.577726 master-0 kubenswrapper[18707]: I0320 08:40:57.576604 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 20 08:40:57.593640 master-0 kubenswrapper[18707]: I0320 08:40:57.593581 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-gzg9m_ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/ingress-operator/1.log" Mar 20 08:40:57.595853 master-0 kubenswrapper[18707]: I0320 08:40:57.595495 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:40:57.597284 master-0 kubenswrapper[18707]: I0320 08:40:57.597252 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 20 08:40:57.598952 master-0 kubenswrapper[18707]: I0320 08:40:57.598879 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:40:57.608130 master-0 kubenswrapper[18707]: I0320 08:40:57.608108 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:40:57.620044 master-0 kubenswrapper[18707]: I0320 08:40:57.619991 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 20 08:40:57.636709 master-0 kubenswrapper[18707]: I0320 08:40:57.636586 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 20 08:40:57.639707 master-0 kubenswrapper[18707]: I0320 08:40:57.639659 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:40:57.640941 master-0 kubenswrapper[18707]: E0320 08:40:57.640853 18707 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
Mar 20 08:40:57.653362 master-0 kubenswrapper[18707]: I0320 08:40:57.653313 18707 request.go:700] Waited for 1.001871198s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-dns/configmaps?fieldSelector=metadata.name%3Ddns-default&limit=500&resourceVersion=0 Mar 20 08:40:57.654705 master-0 kubenswrapper[18707]: I0320 08:40:57.654672 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 08:40:57.660802 master-0 kubenswrapper[18707]: I0320 08:40:57.660771 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-config-volume\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69" Mar 20 08:40:57.675001 master-0 kubenswrapper[18707]: I0320 08:40:57.674922 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 08:40:57.679741 master-0 kubenswrapper[18707]: I0320 08:40:57.679702 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-default-certificate\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:40:57.694931 master-0 kubenswrapper[18707]: I0320 08:40:57.694879 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 08:40:57.705128 master-0 kubenswrapper[18707]: I0320 08:40:57.704478 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-stats-auth\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: 
\"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:40:57.705128 master-0 kubenswrapper[18707]: I0320 08:40:57.704519 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-kubelet-dir\") pod \"d245e5b2-a30d-45c8-9b79-6e8096765c14\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " Mar 20 08:40:57.705128 master-0 kubenswrapper[18707]: I0320 08:40:57.704641 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-var-lock\") pod \"d245e5b2-a30d-45c8-9b79-6e8096765c14\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " Mar 20 08:40:57.705128 master-0 kubenswrapper[18707]: I0320 08:40:57.704638 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d245e5b2-a30d-45c8-9b79-6e8096765c14" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:40:57.705128 master-0 kubenswrapper[18707]: I0320 08:40:57.704725 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-var-lock" (OuterVolumeSpecName: "var-lock") pod "d245e5b2-a30d-45c8-9b79-6e8096765c14" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:40:57.706024 master-0 kubenswrapper[18707]: I0320 08:40:57.705987 18707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 20 08:40:57.706024 master-0 kubenswrapper[18707]: I0320 08:40:57.706016 18707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d245e5b2-a30d-45c8-9b79-6e8096765c14-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:40:57.714703 master-0 kubenswrapper[18707]: I0320 08:40:57.714672 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 08:40:57.721619 master-0 kubenswrapper[18707]: I0320 08:40:57.721580 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91b2899e-8d24-41a0-bec8-d11c67b8f955-metrics-certs\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:40:57.722563 master-0 kubenswrapper[18707]: E0320 08:40:57.722528 18707 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.722631 master-0 kubenswrapper[18707]: E0320 08:40:57.722621 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5782718-9118-4682-a287-7998cd0304b3-proxy-tls podName:f5782718-9118-4682-a287-7998cd0304b3 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.222594355 +0000 UTC m=+3.378774711 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f5782718-9118-4682-a287-7998cd0304b3-proxy-tls") pod "machine-config-daemon-9t8x6" (UID: "f5782718-9118-4682-a287-7998cd0304b3") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.734544 master-0 kubenswrapper[18707]: I0320 08:40:57.734493 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 08:40:57.737901 master-0 kubenswrapper[18707]: E0320 08:40:57.737864 18707 secret.go:189] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.738036 master-0 kubenswrapper[18707]: E0320 08:40:57.737984 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-proxy-tls podName:de6078d7-2aad-46fe-b17a-b6b38e4eaa41 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.237955724 +0000 UTC m=+3.394136080 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-proxy-tls") pod "machine-config-controller-b4f87c5b9-pj7rj" (UID: "de6078d7-2aad-46fe-b17a-b6b38e4eaa41") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.739292 master-0 kubenswrapper[18707]: E0320 08:40:57.739263 18707 secret.go:189] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.739352 master-0 kubenswrapper[18707]: E0320 08:40:57.739324 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-key podName:2f844652-225b-4713-a9ad-cf9bcc348f47 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.239310383 +0000 UTC m=+3.395490929 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-key") pod "service-ca-79bc6b8d76-72j8t" (UID: "2f844652-225b-4713-a9ad-cf9bcc348f47") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.739352 master-0 kubenswrapper[18707]: E0320 08:40:57.739349 18707 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.739445 master-0 kubenswrapper[18707]: E0320 08:40:57.739388 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a9ecf2-c476-4962-8333-21f242dbcb89-serving-cert podName:a9a9ecf2-c476-4962-8333-21f242dbcb89 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.239378835 +0000 UTC m=+3.395559401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a9a9ecf2-c476-4962-8333-21f242dbcb89-serving-cert") pod "controller-manager-fc56bb77c-qd4sn" (UID: "a9a9ecf2-c476-4962-8333-21f242dbcb89") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.739881 master-0 kubenswrapper[18707]: E0320 08:40:57.739852 18707 secret.go:189] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.739939 master-0 kubenswrapper[18707]: E0320 08:40:57.739902 18707 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.739939 master-0 kubenswrapper[18707]: E0320 08:40:57.739912 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-control-plane-machine-set-operator-tls podName:e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e nodeName:}" failed. 
No retries permitted until 2026-03-20 08:40:58.23989329 +0000 UTC m=+3.396073846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-6f97756bc8-7t5qv" (UID: "e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.740022 master-0 kubenswrapper[18707]: E0320 08:40:57.739942 18707 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.740022 master-0 kubenswrapper[18707]: E0320 08:40:57.739949 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47eadda0-35a6-4b5c-a96c-24854be15098-tls-certificates podName:47eadda0-35a6-4b5c-a96c-24854be15098 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.239938402 +0000 UTC m=+3.396118758 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/47eadda0-35a6-4b5c-a96c-24854be15098-tls-certificates") pod "prometheus-operator-admission-webhook-69c6b55594-tkc2j" (UID: "47eadda0-35a6-4b5c-a96c-24854be15098") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.740022 master-0 kubenswrapper[18707]: E0320 08:40:57.739980 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-metrics-tls podName:12e1d9e5-96b5-4367-81a5-d87b3f93d8da nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.239968932 +0000 UTC m=+3.396149528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-metrics-tls") pod "dns-default-v5h69" (UID: "12e1d9e5-96b5-4367-81a5-d87b3f93d8da") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.741401 master-0 kubenswrapper[18707]: E0320 08:40:57.741373 18707 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.741488 master-0 kubenswrapper[18707]: E0320 08:40:57.741429 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-client-ca podName:a9a9ecf2-c476-4962-8333-21f242dbcb89 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.241418025 +0000 UTC m=+3.397598381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-client-ca") pod "controller-manager-fc56bb77c-qd4sn" (UID: "a9a9ecf2-c476-4962-8333-21f242dbcb89") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.741488 master-0 kubenswrapper[18707]: E0320 08:40:57.741454 18707 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.741583 master-0 kubenswrapper[18707]: E0320 08:40:57.741513 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-apiservice-cert podName:b543f82e-683d-47c1-af73-4dcede4cf4df nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.241498867 +0000 UTC m=+3.397679463 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-apiservice-cert") pod "packageserver-6c85f64bb9-fmpsg" (UID: "b543f82e-683d-47c1-af73-4dcede4cf4df") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.744399 master-0 kubenswrapper[18707]: E0320 08:40:57.743837 18707 secret.go:189] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.744399 master-0 kubenswrapper[18707]: E0320 08:40:57.743922 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-node-bootstrap-token podName:4ddac301-a604-4f07-8849-5928befd336e nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.243909027 +0000 UTC m=+3.400089583 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-node-bootstrap-token") pod "machine-config-server-gj4pm" (UID: "4ddac301-a604-4f07-8849-5928befd336e") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.746573 master-0 kubenswrapper[18707]: E0320 08:40:57.746312 18707 secret.go:189] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.746573 master-0 kubenswrapper[18707]: E0320 08:40:57.746346 18707 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.746573 master-0 kubenswrapper[18707]: E0320 08:40:57.746332 18707 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.746573 master-0 
kubenswrapper[18707]: E0320 08:40:57.746434 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-certs podName:4ddac301-a604-4f07-8849-5928befd336e nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.24640936 +0000 UTC m=+3.402589716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-certs") pod "machine-config-server-gj4pm" (UID: "4ddac301-a604-4f07-8849-5928befd336e") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.746573 master-0 kubenswrapper[18707]: E0320 08:40:57.746482 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1375da42-ecaf-4d86-b554-25fd1c3d00bd-serving-cert podName:1375da42-ecaf-4d86-b554-25fd1c3d00bd nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.246470892 +0000 UTC m=+3.402651248 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/1375da42-ecaf-4d86-b554-25fd1c3d00bd-serving-cert") pod "cluster-version-operator-7d58488df-qmm8h" (UID: "1375da42-ecaf-4d86-b554-25fd1c3d00bd") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.746573 master-0 kubenswrapper[18707]: E0320 08:40:57.746503 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-config podName:a9a9ecf2-c476-4962-8333-21f242dbcb89 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.246495323 +0000 UTC m=+3.402675679 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-config") pod "controller-manager-fc56bb77c-qd4sn" (UID: "a9a9ecf2-c476-4962-8333-21f242dbcb89") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.748115 master-0 kubenswrapper[18707]: E0320 08:40:57.748087 18707 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.748195 master-0 kubenswrapper[18707]: E0320 08:40:57.748159 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-proxy-ca-bundles podName:a9a9ecf2-c476-4962-8333-21f242dbcb89 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.248143071 +0000 UTC m=+3.404323617 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-proxy-ca-bundles") pod "controller-manager-fc56bb77c-qd4sn" (UID: "a9a9ecf2-c476-4962-8333-21f242dbcb89") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.754653 master-0 kubenswrapper[18707]: I0320 08:40:57.754619 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 08:40:57.757700 master-0 kubenswrapper[18707]: E0320 08:40:57.757662 18707 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.758588 master-0 kubenswrapper[18707]: E0320 08:40:57.757746 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-webhook-cert podName:b543f82e-683d-47c1-af73-4dcede4cf4df nodeName:}" failed. 
No retries permitted until 2026-03-20 08:40:58.257726771 +0000 UTC m=+3.413907127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-webhook-cert") pod "packageserver-6c85f64bb9-fmpsg" (UID: "b543f82e-683d-47c1-af73-4dcede4cf4df") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.758588 master-0 kubenswrapper[18707]: E0320 08:40:57.757670 18707 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.758588 master-0 kubenswrapper[18707]: E0320 08:40:57.757802 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-cabundle podName:2f844652-225b-4713-a9ad-cf9bcc348f47 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.257792493 +0000 UTC m=+3.413972849 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-cabundle") pod "service-ca-79bc6b8d76-72j8t" (UID: "2f844652-225b-4713-a9ad-cf9bcc348f47") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.761381 master-0 kubenswrapper[18707]: E0320 08:40:57.761353 18707 configmap.go:193] Couldn't get configMap openshift-cluster-version/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.761527 master-0 kubenswrapper[18707]: E0320 08:40:57.761428 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1375da42-ecaf-4d86-b554-25fd1c3d00bd-service-ca podName:1375da42-ecaf-4d86-b554-25fd1c3d00bd nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.261411188 +0000 UTC m=+3.417591544 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/1375da42-ecaf-4d86-b554-25fd1c3d00bd-service-ca") pod "cluster-version-operator-7d58488df-qmm8h" (UID: "1375da42-ecaf-4d86-b554-25fd1c3d00bd") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.774865 master-0 kubenswrapper[18707]: I0320 08:40:57.774821 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 08:40:57.795066 master-0 kubenswrapper[18707]: I0320 08:40:57.794993 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 08:40:57.814696 master-0 kubenswrapper[18707]: I0320 08:40:57.814427 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 20 08:40:57.835733 master-0 kubenswrapper[18707]: I0320 08:40:57.835653 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 08:40:57.854911 master-0 kubenswrapper[18707]: I0320 08:40:57.854863 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 08:40:57.866317 master-0 kubenswrapper[18707]: E0320 08:40:57.866276 18707 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.866396 master-0 kubenswrapper[18707]: E0320 08:40:57.866368 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-config podName:a638c468-010c-4da3-ad62-26f5f2bbdbb9 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.366348451 +0000 UTC m=+3.522528807 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-config") pod "route-controller-manager-56f686584b-fdcx5" (UID: "a638c468-010c-4da3-ad62-26f5f2bbdbb9") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.866760 master-0 kubenswrapper[18707]: E0320 08:40:57.866733 18707 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.866806 master-0 kubenswrapper[18707]: E0320 08:40:57.866755 18707 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.866874 master-0 kubenswrapper[18707]: E0320 08:40:57.866784 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-kube-rbac-proxy-config podName:a25248c0-8de7-4624-b785-f053665fcb23 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.366773803 +0000 UTC m=+3.522954159 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-kube-rbac-proxy-config") pod "kube-state-metrics-7bbc969446-qh6vq" (UID: "a25248c0-8de7-4624-b785-f053665fcb23") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.866947 master-0 kubenswrapper[18707]: E0320 08:40:57.866884 18707 secret.go:189] Couldn't get secret openshift-cloud-credential-operator/cloud-credential-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.866947 master-0 kubenswrapper[18707]: E0320 08:40:57.866889 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-images podName:d88ba8e1-ee42-423f-9839-e71cb0265c6c nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.366866706 +0000 UTC m=+3.523047072 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-images") pod "cluster-cloud-controller-manager-operator-7dff898856-7vxxr" (UID: "d88ba8e1-ee42-423f-9839-e71cb0265c6c") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.866947 master-0 kubenswrapper[18707]: E0320 08:40:57.866926 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/469183dd-dc54-467d-82a1-611132ae8ec4-cloud-credential-operator-serving-cert podName:469183dd-dc54-467d-82a1-611132ae8ec4 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.366917697 +0000 UTC m=+3.523098243 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cloud-credential-operator-serving-cert" (UniqueName: "kubernetes.io/secret/469183dd-dc54-467d-82a1-611132ae8ec4-cloud-credential-operator-serving-cert") pod "cloud-credential-operator-744f9dbf77-r4qvh" (UID: "469183dd-dc54-467d-82a1-611132ae8ec4") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.866947 master-0 kubenswrapper[18707]: E0320 08:40:57.866931 18707 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.867069 master-0 kubenswrapper[18707]: E0320 08:40:57.866971 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-kube-rbac-proxy-config podName:f1468ec0-2aa4-461c-a62f-e9f067be490f nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.366961828 +0000 UTC m=+3.523142384 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-kube-rbac-proxy-config") pod "node-exporter-lb4t5" (UID: "f1468ec0-2aa4-461c-a62f-e9f067be490f") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.867439 master-0 kubenswrapper[18707]: E0320 08:40:57.867389 18707 configmap.go:193] Couldn't get configMap openshift-insights/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.867439 master-0 kubenswrapper[18707]: E0320 08:40:57.867421 18707 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.867536 master-0 kubenswrapper[18707]: E0320 08:40:57.867428 18707 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.867536 master-0 kubenswrapper[18707]: E0320 08:40:57.867459 18707 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.867536 master-0 kubenswrapper[18707]: E0320 08:40:57.867387 18707 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.867536 master-0 kubenswrapper[18707]: E0320 08:40:57.867503 18707 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.867536 master-0 kubenswrapper[18707]: E0320 08:40:57.867532 18707 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 20 
08:40:57.867717 master-0 kubenswrapper[18707]: E0320 08:40:57.867468 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-service-ca-bundle podName:f2217de0-7805-4f5f-8ea5-93b81b7e0236 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.367452803 +0000 UTC m=+3.523633159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-service-ca-bundle") pod "insights-operator-68bf6ff9d6-mvfn5" (UID: "f2217de0-7805-4f5f-8ea5-93b81b7e0236") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.867717 master-0 kubenswrapper[18707]: E0320 08:40:57.867613 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1db4d695-5a6a-4fbe-b610-3777bfebed79-metrics-client-ca podName:1db4d695-5a6a-4fbe-b610-3777bfebed79 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.367599777 +0000 UTC m=+3.523780183 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/1db4d695-5a6a-4fbe-b610-3777bfebed79-metrics-client-ca") pod "openshift-state-metrics-5dc6c74576-j8tmv" (UID: "1db4d695-5a6a-4fbe-b610-3777bfebed79") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.867717 master-0 kubenswrapper[18707]: E0320 08:40:57.867629 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d88ba8e1-ee42-423f-9839-e71cb0265c6c-cloud-controller-manager-operator-tls podName:d88ba8e1-ee42-423f-9839-e71cb0265c6c nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.367623628 +0000 UTC m=+3.523803984 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/d88ba8e1-ee42-423f-9839-e71cb0265c6c-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-7dff898856-7vxxr" (UID: "d88ba8e1-ee42-423f-9839-e71cb0265c6c") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.867717 master-0 kubenswrapper[18707]: E0320 08:40:57.867644 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-server-tls podName:a69e8d3a-a0b1-4688-8631-d9f265aa4c69 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.367637338 +0000 UTC m=+3.523817694 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-server-tls") pod "metrics-server-64c67d44c4-s7vfs" (UID: "a69e8d3a-a0b1-4688-8631-d9f265aa4c69") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.867717 master-0 kubenswrapper[18707]: E0320 08:40:57.867658 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-tls podName:a25248c0-8de7-4624-b785-f053665fcb23 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.367651759 +0000 UTC m=+3.523832115 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-tls") pod "kube-state-metrics-7bbc969446-qh6vq" (UID: "a25248c0-8de7-4624-b785-f053665fcb23") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.867717 master-0 kubenswrapper[18707]: E0320 08:40:57.867675 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a638c468-010c-4da3-ad62-26f5f2bbdbb9-serving-cert podName:a638c468-010c-4da3-ad62-26f5f2bbdbb9 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.367667059 +0000 UTC m=+3.523847415 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a638c468-010c-4da3-ad62-26f5f2bbdbb9-serving-cert") pod "route-controller-manager-56f686584b-fdcx5" (UID: "a638c468-010c-4da3-ad62-26f5f2bbdbb9") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.867717 master-0 kubenswrapper[18707]: E0320 08:40:57.867688 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-kube-rbac-proxy-config podName:1db4d695-5a6a-4fbe-b610-3777bfebed79 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.367680879 +0000 UTC m=+3.523861235 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-kube-rbac-proxy-config") pod "openshift-state-metrics-5dc6c74576-j8tmv" (UID: "1db4d695-5a6a-4fbe-b610-3777bfebed79") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.867717 master-0 kubenswrapper[18707]: E0320 08:40:57.867723 18707 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.867717 master-0 kubenswrapper[18707]: E0320 08:40:57.867728 18707 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.868004 master-0 kubenswrapper[18707]: E0320 08:40:57.867756 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-tls podName:1db4d695-5a6a-4fbe-b610-3777bfebed79 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.367748891 +0000 UTC m=+3.523929247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-tls") pod "openshift-state-metrics-5dc6c74576-j8tmv" (UID: "1db4d695-5a6a-4fbe-b610-3777bfebed79") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.868004 master-0 kubenswrapper[18707]: E0320 08:40:57.867789 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-machine-api-operator-tls podName:c7f5e6cd-e093-409a-8758-d3db7a7eb32c nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.367783962 +0000 UTC m=+3.523964318 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-machine-api-operator-tls") pod "machine-api-operator-6fbb6cf6f9-n8tnn" (UID: "c7f5e6cd-e093-409a-8758-d3db7a7eb32c") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.868875 master-0 kubenswrapper[18707]: E0320 08:40:57.868841 18707 secret.go:189] Couldn't get secret openshift-insights/openshift-insights-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.868875 master-0 kubenswrapper[18707]: E0320 08:40:57.868870 18707 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.868996 master-0 kubenswrapper[18707]: E0320 08:40:57.868915 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2217de0-7805-4f5f-8ea5-93b81b7e0236-serving-cert podName:f2217de0-7805-4f5f-8ea5-93b81b7e0236 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.368883985 +0000 UTC m=+3.525064531 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f2217de0-7805-4f5f-8ea5-93b81b7e0236-serving-cert") pod "insights-operator-68bf6ff9d6-mvfn5" (UID: "f2217de0-7805-4f5f-8ea5-93b81b7e0236") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.868996 master-0 kubenswrapper[18707]: E0320 08:40:57.868952 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-tls podName:f1468ec0-2aa4-461c-a62f-e9f067be490f nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.368936476 +0000 UTC m=+3.525117122 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-tls") pod "node-exporter-lb4t5" (UID: "f1468ec0-2aa4-461c-a62f-e9f067be490f") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.868996 master-0 kubenswrapper[18707]: E0320 08:40:57.868957 18707 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.868996 master-0 kubenswrapper[18707]: E0320 08:40:57.868982 18707 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.869165 master-0 kubenswrapper[18707]: E0320 08:40:57.868999 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-client-certs podName:a69e8d3a-a0b1-4688-8631-d9f265aa4c69 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.368991098 +0000 UTC m=+3.525171664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-client-certs") pod "metrics-server-64c67d44c4-s7vfs" (UID: "a69e8d3a-a0b1-4688-8631-d9f265aa4c69") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.869165 master-0 kubenswrapper[18707]: E0320 08:40:57.869031 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae39c09b-7aef-4615-8ced-0dcad39f23a5-machine-approver-tls podName:ae39c09b-7aef-4615-8ced-0dcad39f23a5 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.369012258 +0000 UTC m=+3.525192754 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/ae39c09b-7aef-4615-8ced-0dcad39f23a5-machine-approver-tls") pod "machine-approver-5c6485487f-qb94j" (UID: "ae39c09b-7aef-4615-8ced-0dcad39f23a5") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.869165 master-0 kubenswrapper[18707]: E0320 08:40:57.869054 18707 secret.go:189] Couldn't get secret openshift-machine-api/cluster-autoscaler-operator-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.869165 master-0 kubenswrapper[18707]: E0320 08:40:57.869079 18707 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.869165 master-0 kubenswrapper[18707]: E0320 08:40:57.869083 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21bebade-17fa-444e-92a9-eea53d6cd673-cert podName:21bebade-17fa-444e-92a9-eea53d6cd673 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.36907434 +0000 UTC m=+3.525254686 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21bebade-17fa-444e-92a9-eea53d6cd673-cert") pod "cluster-autoscaler-operator-866dc4744-xwxg7" (UID: "21bebade-17fa-444e-92a9-eea53d6cd673") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.869165 master-0 kubenswrapper[18707]: E0320 08:40:57.869133 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-client-ca podName:a638c468-010c-4da3-ad62-26f5f2bbdbb9 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.369124102 +0000 UTC m=+3.525304648 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-client-ca") pod "route-controller-manager-56f686584b-fdcx5" (UID: "a638c468-010c-4da3-ad62-26f5f2bbdbb9") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.870519 master-0 kubenswrapper[18707]: E0320 08:40:57.870490 18707 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.870519 master-0 kubenswrapper[18707]: E0320 08:40:57.870506 18707 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.870633 master-0 kubenswrapper[18707]: E0320 08:40:57.870503 18707 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.870633 master-0 kubenswrapper[18707]: E0320 08:40:57.870554 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-kube-rbac-proxy-config podName:f91d1788-027d-432b-be33-ca952a95046a nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.370538733 +0000 UTC m=+3.526719249 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-kube-rbac-proxy-config") pod "prometheus-operator-6c8df6d4b-5m857" (UID: "f91d1788-027d-432b-be33-ca952a95046a") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.870633 master-0 kubenswrapper[18707]: E0320 08:40:57.870516 18707 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.870633 master-0 kubenswrapper[18707]: E0320 08:40:57.870529 18707 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.870773 master-0 kubenswrapper[18707]: E0320 08:40:57.870586 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f1468ec0-2aa4-461c-a62f-e9f067be490f-metrics-client-ca podName:f1468ec0-2aa4-461c-a62f-e9f067be490f nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.370574374 +0000 UTC m=+3.526754930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/f1468ec0-2aa4-461c-a62f-e9f067be490f-metrics-client-ca") pod "node-exporter-lb4t5" (UID: "f1468ec0-2aa4-461c-a62f-e9f067be490f") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.870773 master-0 kubenswrapper[18707]: E0320 08:40:57.870681 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f91d1788-027d-432b-be33-ca952a95046a-metrics-client-ca podName:f91d1788-027d-432b-be33-ca952a95046a nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.370670577 +0000 UTC m=+3.526850923 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/f91d1788-027d-432b-be33-ca952a95046a-metrics-client-ca") pod "prometheus-operator-6c8df6d4b-5m857" (UID: "f91d1788-027d-432b-be33-ca952a95046a") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.870773 master-0 kubenswrapper[18707]: E0320 08:40:57.870703 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-auth-proxy-config podName:d88ba8e1-ee42-423f-9839-e71cb0265c6c nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.370693747 +0000 UTC m=+3.526874103 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-7dff898856-7vxxr" (UID: "d88ba8e1-ee42-423f-9839-e71cb0265c6c") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.870773 master-0 kubenswrapper[18707]: E0320 08:40:57.870716 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-metrics-server-audit-profiles podName:a69e8d3a-a0b1-4688-8631-d9f265aa4c69 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.370709118 +0000 UTC m=+3.526889474 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-metrics-server-audit-profiles") pod "metrics-server-64c67d44c4-s7vfs" (UID: "a69e8d3a-a0b1-4688-8631-d9f265aa4c69") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.870773 master-0 kubenswrapper[18707]: E0320 08:40:57.870721 18707 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.870773 master-0 kubenswrapper[18707]: E0320 08:40:57.870726 18707 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.870773 master-0 kubenswrapper[18707]: E0320 08:40:57.870760 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-metrics-client-ca podName:a25248c0-8de7-4624-b785-f053665fcb23 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.370750629 +0000 UTC m=+3.526930985 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-metrics-client-ca") pod "kube-state-metrics-7bbc969446-qh6vq" (UID: "a25248c0-8de7-4624-b785-f053665fcb23") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.870773 master-0 kubenswrapper[18707]: E0320 08:40:57.870773 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-samples-operator-tls podName:e3bf8eaf-5f6c-41a6-aaeb-6c921d789466 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.37076789 +0000 UTC m=+3.526948246 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-samples-operator-tls") pod "cluster-samples-operator-85f7577d78-mwfgx" (UID: "e3bf8eaf-5f6c-41a6-aaeb-6c921d789466") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.871855 master-0 kubenswrapper[18707]: E0320 08:40:57.871818 18707 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy-cluster-autoscaler-operator: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.871920 master-0 kubenswrapper[18707]: E0320 08:40:57.871876 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/21bebade-17fa-444e-92a9-eea53d6cd673-auth-proxy-config podName:21bebade-17fa-444e-92a9-eea53d6cd673 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.371863692 +0000 UTC m=+3.528044058 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/21bebade-17fa-444e-92a9-eea53d6cd673-auth-proxy-config") pod "cluster-autoscaler-operator-866dc4744-xwxg7" (UID: "21bebade-17fa-444e-92a9-eea53d6cd673") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872030 master-0 kubenswrapper[18707]: E0320 08:40:57.871984 18707 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872030 master-0 kubenswrapper[18707]: E0320 08:40:57.872000 18707 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872111 master-0 kubenswrapper[18707]: E0320 08:40:57.872045 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-config 
podName:c7f5e6cd-e093-409a-8758-d3db7a7eb32c nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.372029426 +0000 UTC m=+3.528209982 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-config") pod "machine-api-operator-6fbb6cf6f9-n8tnn" (UID: "c7f5e6cd-e093-409a-8758-d3db7a7eb32c") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872111 master-0 kubenswrapper[18707]: E0320 08:40:57.872060 18707 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872111 master-0 kubenswrapper[18707]: E0320 08:40:57.872061 18707 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872111 master-0 kubenswrapper[18707]: E0320 08:40:57.872082 18707 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-40c28rqu4fltf: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.872111 master-0 kubenswrapper[18707]: E0320 08:40:57.872066 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-config podName:ae39c09b-7aef-4615-8ced-0dcad39f23a5 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.372057647 +0000 UTC m=+3.528238233 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-config") pod "machine-approver-5c6485487f-qb94j" (UID: "ae39c09b-7aef-4615-8ced-0dcad39f23a5") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872111 master-0 kubenswrapper[18707]: E0320 08:40:57.872093 18707 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872111 master-0 kubenswrapper[18707]: E0320 08:40:57.872106 18707 configmap.go:193] Couldn't get configMap openshift-cloud-credential-operator/cco-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872389 master-0 kubenswrapper[18707]: E0320 08:40:57.872066 18707 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872389 master-0 kubenswrapper[18707]: E0320 08:40:57.872113 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-auth-proxy-config podName:ae39c09b-7aef-4615-8ced-0dcad39f23a5 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.372102659 +0000 UTC m=+3.528283025 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-auth-proxy-config") pod "machine-approver-5c6485487f-qb94j" (UID: "ae39c09b-7aef-4615-8ced-0dcad39f23a5") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872389 master-0 kubenswrapper[18707]: E0320 08:40:57.872289 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-custom-resource-state-configmap podName:a25248c0-8de7-4624-b785-f053665fcb23 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.372266233 +0000 UTC m=+3.528446629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-7bbc969446-qh6vq" (UID: "a25248c0-8de7-4624-b785-f053665fcb23") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872389 master-0 kubenswrapper[18707]: E0320 08:40:57.872318 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-client-ca-bundle podName:a69e8d3a-a0b1-4688-8631-d9f265aa4c69 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.372304074 +0000 UTC m=+3.528484470 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-client-ca-bundle") pod "metrics-server-64c67d44c4-s7vfs" (UID: "a69e8d3a-a0b1-4688-8631-d9f265aa4c69") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.872389 master-0 kubenswrapper[18707]: E0320 08:40:57.872346 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-configmap-kubelet-serving-ca-bundle podName:a69e8d3a-a0b1-4688-8631-d9f265aa4c69 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.372330855 +0000 UTC m=+3.528511251 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-configmap-kubelet-serving-ca-bundle") pod "metrics-server-64c67d44c4-s7vfs" (UID: "a69e8d3a-a0b1-4688-8631-d9f265aa4c69") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872389 master-0 kubenswrapper[18707]: E0320 08:40:57.872376 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/469183dd-dc54-467d-82a1-611132ae8ec4-cco-trusted-ca podName:469183dd-dc54-467d-82a1-611132ae8ec4 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.372360036 +0000 UTC m=+3.528540642 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cco-trusted-ca" (UniqueName: "kubernetes.io/configmap/469183dd-dc54-467d-82a1-611132ae8ec4-cco-trusted-ca") pod "cloud-credential-operator-744f9dbf77-r4qvh" (UID: "469183dd-dc54-467d-82a1-611132ae8ec4") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872619 master-0 kubenswrapper[18707]: E0320 08:40:57.872407 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-images podName:c7f5e6cd-e093-409a-8758-d3db7a7eb32c nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.372391737 +0000 UTC m=+3.528572323 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-images") pod "machine-api-operator-6fbb6cf6f9-n8tnn" (UID: "c7f5e6cd-e093-409a-8758-d3db7a7eb32c") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872619 master-0 kubenswrapper[18707]: E0320 08:40:57.872116 18707 configmap.go:193] Couldn't get configMap openshift-insights/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872619 master-0 kubenswrapper[18707]: E0320 08:40:57.872483 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-trusted-ca-bundle podName:f2217de0-7805-4f5f-8ea5-93b81b7e0236 nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.372466159 +0000 UTC m=+3.528646685 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-trusted-ca-bundle") pod "insights-operator-68bf6ff9d6-mvfn5" (UID: "f2217de0-7805-4f5f-8ea5-93b81b7e0236") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:40:57.872619 master-0 kubenswrapper[18707]: E0320 08:40:57.872597 18707 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.872765 master-0 kubenswrapper[18707]: E0320 08:40:57.872668 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-tls podName:f91d1788-027d-432b-be33-ca952a95046a nodeName:}" failed. No retries permitted until 2026-03-20 08:40:58.372653925 +0000 UTC m=+3.528834461 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-tls") pod "prometheus-operator-6c8df6d4b-5m857" (UID: "f91d1788-027d-432b-be33-ca952a95046a") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:40:57.874558 master-0 kubenswrapper[18707]: I0320 08:40:57.874525 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 08:40:57.894975 master-0 kubenswrapper[18707]: I0320 08:40:57.894823 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 08:40:57.916086 master-0 kubenswrapper[18707]: I0320 08:40:57.916004 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 08:40:57.936140 master-0 kubenswrapper[18707]: I0320 08:40:57.936069 18707 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 20 08:40:57.953991 master-0 kubenswrapper[18707]: I0320 08:40:57.953932 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 20 08:40:57.976196 master-0 kubenswrapper[18707]: I0320 08:40:57.976125 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 20 08:40:57.994321 master-0 kubenswrapper[18707]: I0320 08:40:57.994260 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 20 08:40:58.015125 master-0 kubenswrapper[18707]: I0320 08:40:58.015066 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 20 08:40:58.034942 master-0 kubenswrapper[18707]: I0320 08:40:58.034881 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 08:40:58.055813 master-0 kubenswrapper[18707]: I0320 08:40:58.055589 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 08:40:58.074637 master-0 kubenswrapper[18707]: I0320 08:40:58.074444 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 08:40:58.095147 master-0 kubenswrapper[18707]: I0320 08:40:58.094957 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 08:40:58.114568 master-0 kubenswrapper[18707]: I0320 08:40:58.114522 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 08:40:58.134522 master-0 kubenswrapper[18707]: I0320 08:40:58.134467 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 08:40:58.158370 master-0 kubenswrapper[18707]: I0320 08:40:58.157013 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 08:40:58.174283 master-0 kubenswrapper[18707]: I0320 08:40:58.174221 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 20 08:40:58.202660 master-0 kubenswrapper[18707]: I0320 08:40:58.202600 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 20 08:40:58.215129 master-0 kubenswrapper[18707]: I0320 08:40:58.215077 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 20 08:40:58.236281 master-0 kubenswrapper[18707]: I0320 08:40:58.236206 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 08:40:58.255313 master-0 kubenswrapper[18707]: I0320 08:40:58.255244 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 20 08:40:58.274869 master-0 kubenswrapper[18707]: I0320 08:40:58.274807 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 08:40:58.294793 master-0 kubenswrapper[18707]: I0320 08:40:58.294731 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 08:40:58.317971 master-0 kubenswrapper[18707]: I0320 08:40:58.317910 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-node-bootstrap-token\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm"
Mar 20 08:40:58.318377 master-0 kubenswrapper[18707]: I0320 08:40:58.318330 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-config\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:40:58.318422 master-0 kubenswrapper[18707]: I0320 08:40:58.318400 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1375da42-ecaf-4d86-b554-25fd1c3d00bd-serving-cert\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h"
Mar 20 08:40:58.318568 master-0 kubenswrapper[18707]: I0320 08:40:58.318545 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-certs\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm"
Mar 20 08:40:58.318840 master-0 kubenswrapper[18707]: I0320 08:40:58.318785 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-config\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:40:58.318898 master-0 kubenswrapper[18707]: I0320 08:40:58.318854 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1375da42-ecaf-4d86-b554-25fd1c3d00bd-serving-cert\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h"
Mar 20 08:40:58.318898 master-0 kubenswrapper[18707]: I0320 08:40:58.318853 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-proxy-ca-bundles\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:40:58.318973 master-0 kubenswrapper[18707]: I0320 08:40:58.318935 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-certs\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm"
Mar 20 08:40:58.318973 master-0 kubenswrapper[18707]: I0320 08:40:58.318957 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-cabundle\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t"
Mar 20 08:40:58.319043 master-0 kubenswrapper[18707]: I0320 08:40:58.318996 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-webhook-cert\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"
Mar 20 08:40:58.319215 master-0 kubenswrapper[18707]: I0320 08:40:58.319171 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-cabundle\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t"
Mar 20 08:40:58.319270 master-0 kubenswrapper[18707]: I0320 08:40:58.319174 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4ddac301-a604-4f07-8849-5928befd336e-node-bootstrap-token\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm"
Mar 20 08:40:58.319308 master-0 kubenswrapper[18707]: I0320 08:40:58.319289 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1375da42-ecaf-4d86-b554-25fd1c3d00bd-service-ca\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h"
Mar 20 08:40:58.319342 master-0 kubenswrapper[18707]: I0320 08:40:58.319319 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-webhook-cert\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"
Mar 20 08:40:58.319467 master-0 kubenswrapper[18707]: I0320 08:40:58.319435 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5782718-9118-4682-a287-7998cd0304b3-proxy-tls\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6"
Mar 20 08:40:58.319512 master-0 kubenswrapper[18707]: I0320 08:40:58.319488 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/47eadda0-35a6-4b5c-a96c-24854be15098-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-tkc2j\" (UID: \"47eadda0-35a6-4b5c-a96c-24854be15098\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j"
Mar 20 08:40:58.319604 master-0 kubenswrapper[18707]: I0320 08:40:58.319582 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1375da42-ecaf-4d86-b554-25fd1c3d00bd-service-ca\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h"
Mar 20 08:40:58.319717 master-0 kubenswrapper[18707]: I0320 08:40:58.319695 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-metrics-tls\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69"
Mar 20 08:40:58.319758 master-0 kubenswrapper[18707]: I0320 08:40:58.319720 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/47eadda0-35a6-4b5c-a96c-24854be15098-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-tkc2j\" (UID: \"47eadda0-35a6-4b5c-a96c-24854be15098\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j"
Mar 20 08:40:58.319832 master-0 kubenswrapper[18707]: I0320 08:40:58.319801 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5782718-9118-4682-a287-7998cd0304b3-proxy-tls\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6"
Mar 20 08:40:58.319879 master-0 kubenswrapper[18707]: I0320 08:40:58.319813 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"
Mar 20 08:40:58.319920 master-0 kubenswrapper[18707]: I0320 08:40:58.319907 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-metrics-tls\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69"
Mar 20 08:40:58.319975 master-0 kubenswrapper[18707]: I0320 08:40:58.319946 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9a9ecf2-c476-4962-8333-21f242dbcb89-serving-cert\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:40:58.320011 master-0 kubenswrapper[18707]: I0320 08:40:58.319985 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj"
Mar 20 08:40:58.320073 master-0 kubenswrapper[18707]: I0320 08:40:58.320050 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 08:40:58.320108 master-0 kubenswrapper[18707]: I0320 08:40:58.320085 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-key\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t"
Mar 20 08:40:58.320295 master-0 kubenswrapper[18707]: I0320 08:40:58.320259 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9a9ecf2-c476-4962-8333-21f242dbcb89-serving-cert\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:40:58.320295 master-0 kubenswrapper[18707]: I0320 08:40:58.320270 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2f844652-225b-4713-a9ad-cf9bcc348f47-signing-key\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t"
Mar 20 08:40:58.320376 master-0 kubenswrapper[18707]: I0320 08:40:58.320331 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-7t5qv\" (UID: \"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv"
Mar 20 08:40:58.320506 master-0 kubenswrapper[18707]: I0320 08:40:58.320482 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-client-ca\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:40:58.320556 master-0 kubenswrapper[18707]: I0320 08:40:58.320526 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-apiservice-cert\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"
Mar 20 08:40:58.320762 master-0 kubenswrapper[18707]: I0320 08:40:58.320736 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-client-ca\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:40:58.320811 master-0 kubenswrapper[18707]: I0320 08:40:58.320789 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b543f82e-683d-47c1-af73-4dcede4cf4df-apiservice-cert\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: \"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"
Mar 20 08:40:58.330377 master-0 kubenswrapper[18707]: I0320 08:40:58.330254 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-proxy-ca-bundles\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:40:58.335252 master-0 kubenswrapper[18707]: I0320 08:40:58.335228 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 20 08:40:58.360855 master-0 kubenswrapper[18707]: I0320 08:40:58.360812 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 20 08:40:58.375248 master-0 kubenswrapper[18707]: I0320 08:40:58.375213 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 20 08:40:58.395411 master-0 kubenswrapper[18707]: I0320 08:40:58.395373 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 20 08:40:58.415799 master-0 kubenswrapper[18707]: I0320 08:40:58.415661 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 20 08:40:58.421376 master-0 kubenswrapper[18707]: I0320 08:40:58.421334 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-7t5qv\" (UID: \"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv"
Mar 20 08:40:58.421681 master-0 kubenswrapper[18707]: I0320 08:40:58.421638 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"
Mar 20 08:40:58.421744 master-0 kubenswrapper[18707]: I0320 08:40:58.421697 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-images\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn"
Mar 20 08:40:58.421805 master-0 kubenswrapper[18707]: I0320 08:40:58.421751 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-client-ca-bundle\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:40:58.421805 master-0 kubenswrapper[18707]: I0320 08:40:58.421779 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5"
Mar 20 08:40:58.421805 master-0 kubenswrapper[18707]: I0320 08:40:58.421797 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-config\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn"
Mar 20 08:40:58.422047 master-0 kubenswrapper[18707]: I0320 08:40:58.422014 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21bebade-17fa-444e-92a9-eea53d6cd673-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7"
Mar 20 08:40:58.422109 master-0 kubenswrapper[18707]: I0320 08:40:58.422077 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-config\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j"
Mar 20 08:40:58.422109 master-0 kubenswrapper[18707]: I0320 08:40:58.422103 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:40:58.422291 master-0 kubenswrapper[18707]: I0320 08:40:58.422260 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-auth-proxy-config\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j"
Mar 20 08:40:58.422405 master-0 kubenswrapper[18707]: I0320 08:40:58.422374 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857"
Mar 20 08:40:58.422475 master-0 kubenswrapper[18707]: I0320 08:40:58.422445 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/d88ba8e1-ee42-423f-9839-e71cb0265c6c-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr"
Mar 20 08:40:58.422521 master-0 kubenswrapper[18707]: I0320 08:40:58.422496 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv"
Mar 20 08:40:58.422567 master-0 kubenswrapper[18707]: I0320 08:40:58.422440 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/21bebade-17fa-444e-92a9-eea53d6cd673-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7"
Mar 20 08:40:58.422567 master-0 kubenswrapper[18707]: I0320 08:40:58.422521 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-server-tls\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:40:58.422652 master-0 kubenswrapper[18707]: I0320 08:40:58.422627 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-tls\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5"
Mar 20 08:40:58.422652 master-0 kubenswrapper[18707]: I0320 08:40:58.422651 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a638c468-010c-4da3-ad62-26f5f2bbdbb9-serving-cert\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:40:58.422749 master-0 kubenswrapper[18707]: I0320 08:40:58.422678 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv"
Mar 20 08:40:58.422852 master-0 kubenswrapper[18707]: I0320 08:40:58.422825 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"
Mar 20 08:40:58.422907 master-0 kubenswrapper[18707]: I0320 08:40:58.422853 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr"
Mar 20 08:40:58.422907 master-0 kubenswrapper[18707]: I0320 08:40:58.422876 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5"
Mar 20 08:40:58.422907 master-0 kubenswrapper[18707]: I0320 08:40:58.422893 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a638c468-010c-4da3-ad62-26f5f2bbdbb9-serving-cert\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:40:58.422907 master-0 kubenswrapper[18707]: I0320 08:40:58.422899 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/469183dd-dc54-467d-82a1-611132ae8ec4-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh"
Mar 20 08:40:58.423066 master-0 kubenswrapper[18707]: I0320 08:40:58.422981 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1db4d695-5a6a-4fbe-b610-3777bfebed79-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv"
Mar 20 08:40:58.423134 master-0 kubenswrapper[18707]: I0320 08:40:58.423106 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5"
Mar 20 08:40:58.423210 master-0 kubenswrapper[18707]: I0320 08:40:58.423146 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-client-ca\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:40:58.423210 master-0 kubenswrapper[18707]: I0320 08:40:58.423175 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-config\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:40:58.423291 master-0 kubenswrapper[18707]: I0320 08:40:58.423218 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/469183dd-dc54-467d-82a1-611132ae8ec4-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh"
Mar 20 08:40:58.423333 master-0 kubenswrapper[18707]: I0320 08:40:58.423282 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ae39c09b-7aef-4615-8ced-0dcad39f23a5-machine-approver-tls\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j"
Mar 20 08:40:58.423416 master-0 kubenswrapper[18707]: I0320 08:40:58.423383 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"
Mar 20 08:40:58.423468 master-0 kubenswrapper[18707]: I0320 08:40:58.423419 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-client-ca\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:40:58.423468 master-0 kubenswrapper[18707]: I0320 08:40:58.423443 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn"
Mar 20 08:40:58.423541 master-0 kubenswrapper[18707]: I0320 08:40:58.423486 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-config\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:40:58.423654 master-0 kubenswrapper[18707]: I0320 08:40:58.423626 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-client-certs\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:40:58.423712 master-0 kubenswrapper[18707]: I0320 08:40:58.423657 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21bebade-17fa-444e-92a9-eea53d6cd673-cert\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7"
Mar 20 08:40:58.423712 master-0 kubenswrapper[18707]: I0320 08:40:58.423701 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2217de0-7805-4f5f-8ea5-93b81b7e0236-serving-cert\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5"
Mar 20 08:40:58.423841 master-0 kubenswrapper[18707]: I0320 08:40:58.423812 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857"
Mar 20 08:40:58.423894 master-0 kubenswrapper[18707]: I0320 08:40:58.423869 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1468ec0-2aa4-461c-a62f-e9f067be490f-metrics-client-ca\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5"
Mar 20 08:40:58.423938 master-0 kubenswrapper[18707]: I0320 08:40:58.423903 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-metrics-server-audit-profiles\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:40:58.423985 master-0 kubenswrapper[18707]: I0320 08:40:58.423959 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21bebade-17fa-444e-92a9-eea53d6cd673-cert\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7"
Mar 20 08:40:58.423985 master-0 kubenswrapper[18707]: I0320 08:40:58.423972 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f91d1788-027d-432b-be33-ca952a95046a-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857"
Mar 20 08:40:58.424067 master-0 kubenswrapper[18707]: I0320 08:40:58.424049 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr"
Mar 20 08:40:58.424106 master-0 kubenswrapper[18707]: I0320 08:40:58.424094 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-mwfgx\" (UID: \"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx"
Mar 20 08:40:58.424153 master-0 kubenswrapper[18707]: I0320 08:40:58.424115 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq"
Mar 20 08:40:58.424214 master-0 kubenswrapper[18707]: I0320 08:40:58.424156 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/469183dd-dc54-467d-82a1-611132ae8ec4-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh"
Mar 20 08:40:58.424399 master-0 kubenswrapper[18707]: I0320 08:40:58.424372 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-mwfgx\" (UID: \"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx"
Mar 20 08:40:58.424533 master-0 kubenswrapper[18707]: I0320 08:40:58.424505 18707
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/469183dd-dc54-467d-82a1-611132ae8ec4-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:40:58.435218 master-0 kubenswrapper[18707]: I0320 08:40:58.435155 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 08:40:58.454154 master-0 kubenswrapper[18707]: I0320 08:40:58.454108 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-vd4cn" Mar 20 08:40:58.476046 master-0 kubenswrapper[18707]: I0320 08:40:58.475993 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-wq6zb" Mar 20 08:40:58.504175 master-0 kubenswrapper[18707]: I0320 08:40:58.504104 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 20 08:40:58.513005 master-0 kubenswrapper[18707]: I0320 08:40:58.512959 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:40:58.515291 master-0 kubenswrapper[18707]: I0320 08:40:58.515172 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 08:40:58.522850 master-0 kubenswrapper[18707]: I0320 08:40:58.522817 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-images\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:40:58.534731 master-0 kubenswrapper[18707]: I0320 08:40:58.534696 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 20 08:40:58.544195 master-0 kubenswrapper[18707]: I0320 08:40:58.544155 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2217de0-7805-4f5f-8ea5-93b81b7e0236-serving-cert\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:40:58.555416 master-0 kubenswrapper[18707]: I0320 08:40:58.555392 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 20 08:40:58.574141 master-0 kubenswrapper[18707]: I0320 08:40:58.574113 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 20 08:40:58.594470 master-0 kubenswrapper[18707]: I0320 08:40:58.594435 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 20 08:40:58.599946 master-0 kubenswrapper[18707]: I0320 08:40:58.599906 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:40:58.603697 master-0 kubenswrapper[18707]: I0320 08:40:58.603649 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2217de0-7805-4f5f-8ea5-93b81b7e0236-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:40:58.615067 master-0 kubenswrapper[18707]: I0320 08:40:58.615028 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 08:40:58.624443 master-0 kubenswrapper[18707]: I0320 08:40:58.624385 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:40:58.635270 master-0 kubenswrapper[18707]: I0320 08:40:58.635217 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 08:40:58.642692 master-0 kubenswrapper[18707]: I0320 08:40:58.642654 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-config\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:40:58.653385 master-0 kubenswrapper[18707]: I0320 08:40:58.653327 18707 request.go:700] Waited for 1.993461825s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/secrets?fieldSelector=metadata.name%3Dprometheus-operator-tls&limit=500&resourceVersion=0 Mar 20 08:40:58.654975 master-0 kubenswrapper[18707]: I0320 08:40:58.654941 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 20 08:40:58.662988 master-0 kubenswrapper[18707]: I0320 08:40:58.662929 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:40:58.674706 master-0 kubenswrapper[18707]: I0320 08:40:58.674619 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-qvkkb" Mar 20 08:40:58.694339 master-0 kubenswrapper[18707]: I0320 08:40:58.694311 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 20 08:40:58.705098 master-0 kubenswrapper[18707]: I0320 08:40:58.705043 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f91d1788-027d-432b-be33-ca952a95046a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:40:58.718444 master-0 kubenswrapper[18707]: I0320 08:40:58.718348 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 20 08:40:58.727019 master-0 kubenswrapper[18707]: I0320 08:40:58.726749 18707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f91d1788-027d-432b-be33-ca952a95046a-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:40:58.727019 master-0 kubenswrapper[18707]: I0320 08:40:58.726818 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1db4d695-5a6a-4fbe-b610-3777bfebed79-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:40:58.727019 master-0 kubenswrapper[18707]: I0320 08:40:58.726868 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:40:58.727019 master-0 kubenswrapper[18707]: I0320 08:40:58.726974 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1468ec0-2aa4-461c-a62f-e9f067be490f-metrics-client-ca\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:58.770269 master-0 kubenswrapper[18707]: I0320 08:40:58.734773 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 08:40:58.773648 master-0 kubenswrapper[18707]: I0320 08:40:58.773587 18707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-rfqnk" Mar 20 08:40:58.795100 master-0 kubenswrapper[18707]: I0320 08:40:58.795048 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 08:40:58.805872 master-0 kubenswrapper[18707]: I0320 08:40:58.805830 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ae39c09b-7aef-4615-8ced-0dcad39f23a5-machine-approver-tls\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:40:58.815658 master-0 kubenswrapper[18707]: I0320 08:40:58.815601 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 08:40:58.824329 master-0 kubenswrapper[18707]: I0320 08:40:58.823307 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-auth-proxy-config\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:40:58.834968 master-0 kubenswrapper[18707]: I0320 08:40:58.834898 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 20 08:40:58.844341 master-0 kubenswrapper[18707]: I0320 08:40:58.844286 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:40:58.855432 master-0 kubenswrapper[18707]: I0320 08:40:58.855379 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-d7bxn" Mar 20 08:40:58.875480 master-0 kubenswrapper[18707]: I0320 08:40:58.875409 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 08:40:58.882720 master-0 kubenswrapper[18707]: I0320 08:40:58.882652 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae39c09b-7aef-4615-8ced-0dcad39f23a5-config\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:40:58.895199 master-0 kubenswrapper[18707]: I0320 08:40:58.895139 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 20 08:40:58.905468 master-0 kubenswrapper[18707]: I0320 08:40:58.905412 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d88ba8e1-ee42-423f-9839-e71cb0265c6c-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:40:58.915795 master-0 kubenswrapper[18707]: I0320 08:40:58.915741 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 20 08:40:58.923631 master-0 kubenswrapper[18707]: I0320 08:40:58.923569 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/d88ba8e1-ee42-423f-9839-e71cb0265c6c-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:40:58.935053 master-0 kubenswrapper[18707]: I0320 08:40:58.934918 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 20 08:40:58.955697 master-0 kubenswrapper[18707]: I0320 08:40:58.955639 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 08:40:58.976154 master-0 kubenswrapper[18707]: I0320 08:40:58.976084 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 08:40:58.995374 master-0 kubenswrapper[18707]: I0320 08:40:58.995295 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-frnfd" Mar 20 08:40:59.014818 master-0 kubenswrapper[18707]: I0320 08:40:59.014729 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 20 08:40:59.023804 master-0 kubenswrapper[18707]: I0320 08:40:59.023715 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" 
Mar 20 08:40:59.034975 master-0 kubenswrapper[18707]: I0320 08:40:59.034900 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 20 08:40:59.043729 master-0 kubenswrapper[18707]: I0320 08:40:59.043671 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1db4d695-5a6a-4fbe-b610-3777bfebed79-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:40:59.057166 master-0 kubenswrapper[18707]: I0320 08:40:59.057053 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-n6dht" Mar 20 08:40:59.075770 master-0 kubenswrapper[18707]: I0320 08:40:59.075702 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-js69c" Mar 20 08:40:59.096672 master-0 kubenswrapper[18707]: I0320 08:40:59.096593 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 20 08:40:59.102418 master-0 kubenswrapper[18707]: I0320 08:40:59.102378 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:40:59.114427 master-0 kubenswrapper[18707]: I0320 08:40:59.114361 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 20 08:40:59.127686 
master-0 kubenswrapper[18707]: I0320 08:40:59.124154 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:40:59.134574 master-0 kubenswrapper[18707]: I0320 08:40:59.134509 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-bksjt" Mar 20 08:40:59.154959 master-0 kubenswrapper[18707]: I0320 08:40:59.154900 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 20 08:40:59.163343 master-0 kubenswrapper[18707]: I0320 08:40:59.163297 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a25248c0-8de7-4624-b785-f053665fcb23-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:40:59.174912 master-0 kubenswrapper[18707]: I0320 08:40:59.174854 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 20 08:40:59.184319 master-0 kubenswrapper[18707]: I0320 08:40:59.184285 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:59.194868 master-0 kubenswrapper[18707]: I0320 
08:40:59.194780 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-62tl6" Mar 20 08:40:59.214951 master-0 kubenswrapper[18707]: I0320 08:40:59.214899 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 20 08:40:59.223798 master-0 kubenswrapper[18707]: I0320 08:40:59.223751 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f1468ec0-2aa4-461c-a62f-e9f067be490f-node-exporter-tls\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:40:59.234695 master-0 kubenswrapper[18707]: I0320 08:40:59.234635 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 20 08:40:59.244115 master-0 kubenswrapper[18707]: I0320 08:40:59.244059 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-server-tls\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:40:59.255171 master-0 kubenswrapper[18707]: I0320 08:40:59.255138 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-2trhv" Mar 20 08:40:59.275880 master-0 kubenswrapper[18707]: I0320 08:40:59.275830 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-40c28rqu4fltf" Mar 20 08:40:59.282611 master-0 kubenswrapper[18707]: I0320 08:40:59.282553 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-client-ca-bundle\") 
pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:40:59.297773 master-0 kubenswrapper[18707]: I0320 08:40:59.297681 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 20 08:40:59.305372 master-0 kubenswrapper[18707]: I0320 08:40:59.305182 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-client-certs\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:40:59.316055 master-0 kubenswrapper[18707]: I0320 08:40:59.315990 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 20 08:40:59.323211 master-0 kubenswrapper[18707]: I0320 08:40:59.323150 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:40:59.336482 master-0 kubenswrapper[18707]: I0320 08:40:59.336417 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 20 08:40:59.345240 master-0 kubenswrapper[18707]: I0320 08:40:59.345172 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-metrics-server-audit-profiles\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: 
\"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:40:59.396792 master-0 kubenswrapper[18707]: I0320 08:40:59.396710 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd6vv\" (UniqueName: \"kubernetes.io/projected/5e3ddf9e-eeb5-4266-b675-092fd4e27623-kube-api-access-xd6vv\") pod \"ovnkube-control-plane-57f769d897-z2zpj\" (UID: \"5e3ddf9e-eeb5-4266-b675-092fd4e27623\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" Mar 20 08:40:59.415946 master-0 kubenswrapper[18707]: I0320 08:40:59.415843 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqdlf\" (UniqueName: \"kubernetes.io/projected/325f0a83-d56d-4b62-977b-088a7d5f0e00-kube-api-access-lqdlf\") pod \"openshift-apiserver-operator-d65958b8-th2vj\" (UID: \"325f0a83-d56d-4b62-977b-088a7d5f0e00\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-th2vj" Mar 20 08:40:59.432730 master-0 kubenswrapper[18707]: I0320 08:40:59.432662 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66kz7\" (UniqueName: \"kubernetes.io/projected/de6078d7-2aad-46fe-b17a-b6b38e4eaa41-kube-api-access-66kz7\") pod \"machine-config-controller-b4f87c5b9-pj7rj\" (UID: \"de6078d7-2aad-46fe-b17a-b6b38e4eaa41\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" Mar 20 08:40:59.448076 master-0 kubenswrapper[18707]: I0320 08:40:59.447940 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr8br\" (UniqueName: \"kubernetes.io/projected/3de37144-a9ab-45fb-a23f-2287a5198edf-kube-api-access-zr8br\") pod \"apiserver-bc9b556d6-vdnq2\" (UID: \"3de37144-a9ab-45fb-a23f-2287a5198edf\") " pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2" Mar 20 08:40:59.467899 master-0 kubenswrapper[18707]: I0320 08:40:59.467843 18707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g2qf7\" (UniqueName: \"kubernetes.io/projected/12e1d9e5-96b5-4367-81a5-d87b3f93d8da-kube-api-access-g2qf7\") pod \"dns-default-v5h69\" (UID: \"12e1d9e5-96b5-4367-81a5-d87b3f93d8da\") " pod="openshift-dns/dns-default-v5h69" Mar 20 08:40:59.491326 master-0 kubenswrapper[18707]: I0320 08:40:59.491249 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8crkc\" (UniqueName: \"kubernetes.io/projected/a18b9230-de78-41b8-a61e-361b8bb1fbb3-kube-api-access-8crkc\") pod \"tuned-hb77b\" (UID: \"a18b9230-de78-41b8-a61e-361b8bb1fbb3\") " pod="openshift-cluster-node-tuning-operator/tuned-hb77b" Mar 20 08:40:59.509402 master-0 kubenswrapper[18707]: I0320 08:40:59.509347 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5v9f\" (UniqueName: \"kubernetes.io/projected/a9a9ecf2-c476-4962-8333-21f242dbcb89-kube-api-access-d5v9f\") pod \"controller-manager-fc56bb77c-qd4sn\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") " pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:40:59.540286 master-0 kubenswrapper[18707]: I0320 08:40:59.540162 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qns9g\" (UniqueName: \"kubernetes.io/projected/2bf90db0-f943-464c-8599-e36b4fc32e1c-kube-api-access-qns9g\") pod \"migrator-8487694857-w5tlr\" (UID: \"2bf90db0-f943-464c-8599-e36b4fc32e1c\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-w5tlr" Mar 20 08:40:59.547703 master-0 kubenswrapper[18707]: I0320 08:40:59.547653 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa759777-de22-4440-a3d3-ad429a3b8e7b-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-qcpb4\" (UID: \"fa759777-de22-4440-a3d3-ad429a3b8e7b\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-qcpb4" Mar 20 08:40:59.571247 master-0 kubenswrapper[18707]: I0320 08:40:59.571160 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvjkr\" (UniqueName: \"kubernetes.io/projected/7c4e7e57-43be-4d31-b523-f7e4d316dce3-kube-api-access-bvjkr\") pod \"csi-snapshot-controller-operator-5f5d689c6b-4dqrh\" (UID: \"7c4e7e57-43be-4d31-b523-f7e4d316dce3\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh" Mar 20 08:40:59.587327 master-0 kubenswrapper[18707]: I0320 08:40:59.587270 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27qvw\" (UniqueName: \"kubernetes.io/projected/bbc0b783-28d5-4554-b49d-c66082546f44-kube-api-access-27qvw\") pod \"package-server-manager-7b95f86987-2pg77\" (UID: \"bbc0b783-28d5-4554-b49d-c66082546f44\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:40:59.606659 master-0 kubenswrapper[18707]: I0320 08:40:59.606599 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqd2v\" (UniqueName: \"kubernetes.io/projected/6a80bd6f-2263-4251-8197-5173193f8afc-kube-api-access-tqd2v\") pod \"multus-admission-controller-5dbbb8b86f-5rrrh\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" Mar 20 08:40:59.628059 master-0 kubenswrapper[18707]: I0320 08:40:59.627997 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtcnq\" (UniqueName: \"kubernetes.io/projected/df428d5a-c722-4536-8e7f-cdd85c560481-kube-api-access-dtcnq\") pod \"catalog-operator-68f85b4d6c-fzm28\" (UID: \"df428d5a-c722-4536-8e7f-cdd85c560481\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28" Mar 20 08:40:59.646840 master-0 kubenswrapper[18707]: I0320 
08:40:59.646776 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qstvb\" (UniqueName: \"kubernetes.io/projected/4fea9b05-222e-4b58-95c8-735fc1cf3a8b-kube-api-access-qstvb\") pod \"catalogd-controller-manager-6864dc98f7-74mgr\" (UID: \"4fea9b05-222e-4b58-95c8-735fc1cf3a8b\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:40:59.655440 master-0 kubenswrapper[18707]: I0320 08:40:59.655386 18707 request.go:700] Waited for 2.931458327s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-config-operator/serviceaccounts/openshift-config-operator/token Mar 20 08:40:59.672903 master-0 kubenswrapper[18707]: I0320 08:40:59.672825 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmrxh\" (UniqueName: \"kubernetes.io/projected/ab175f7e-a5e8-4fda-98c9-6d052a221a83-kube-api-access-zmrxh\") pod \"openshift-config-operator-95bf4f4d-25cml\" (UID: \"ab175f7e-a5e8-4fda-98c9-6d052a221a83\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:40:59.687676 master-0 kubenswrapper[18707]: I0320 08:40:59.687614 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8bxz\" (UniqueName: \"kubernetes.io/projected/96de6024-e20f-4b52-9294-b330d65e4153-kube-api-access-z8bxz\") pod \"csi-snapshot-controller-64854d9cff-f44gr\" (UID: \"96de6024-e20f-4b52-9294-b330d65e4153\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" Mar 20 08:40:59.706642 master-0 kubenswrapper[18707]: I0320 08:40:59.706517 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1854ea4-c8e2-4289-84b6-1f18b2ac684f-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-c6vkz\" (UID: \"c1854ea4-c8e2-4289-84b6-1f18b2ac684f\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" Mar 20 08:40:59.727298 master-0 kubenswrapper[18707]: I0320 08:40:59.727229 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh47j\" (UniqueName: \"kubernetes.io/projected/fb0fc10f-5796-4cd5-b8f5-72d678054c24-kube-api-access-lh47j\") pod \"network-node-identity-6t5vb\" (UID: \"fb0fc10f-5796-4cd5-b8f5-72d678054c24\") " pod="openshift-network-node-identity/network-node-identity-6t5vb" Mar 20 08:40:59.746157 master-0 kubenswrapper[18707]: I0320 08:40:59.746070 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75r7k\" (UniqueName: \"kubernetes.io/projected/68252533-bd64-4fc5-838a-cc350cbe77f0-kube-api-access-75r7k\") pod \"openshift-controller-manager-operator-8c94f4649-p7pt6\" (UID: \"68252533-bd64-4fc5-838a-cc350cbe77f0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" Mar 20 08:40:59.771695 master-0 kubenswrapper[18707]: I0320 08:40:59.771637 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvx6w\" (UniqueName: \"kubernetes.io/projected/acb704a9-6c8d-4378-ae93-e7095b1fce85-kube-api-access-xvx6w\") pod \"marketplace-operator-89ccd998f-mvn4t\" (UID: \"acb704a9-6c8d-4378-ae93-e7095b1fce85\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:40:59.789398 master-0 kubenswrapper[18707]: I0320 08:40:59.789327 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-224dg\" (UniqueName: \"kubernetes.io/projected/3f471ecc-922c-4cb1-9bdd-fdb5da08c592-kube-api-access-224dg\") pod \"dns-operator-9c5679d8f-r6dm8\" (UID: \"3f471ecc-922c-4cb1-9bdd-fdb5da08c592\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-r6dm8" Mar 20 08:40:59.808328 master-0 kubenswrapper[18707]: I0320 08:40:59.808271 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-hqtvp\" (UniqueName: \"kubernetes.io/projected/91b2899e-8d24-41a0-bec8-d11c67b8f955-kube-api-access-hqtvp\") pod \"router-default-7dcf5569b5-xmvwz\" (UID: \"91b2899e-8d24-41a0-bec8-d11c67b8f955\") " pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:40:59.829057 master-0 kubenswrapper[18707]: I0320 08:40:59.828991 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtc2p\" (UniqueName: \"kubernetes.io/projected/248a3d2f-3be4-46bf-959c-79d28736c0d6-kube-api-access-mtc2p\") pod \"ovnkube-node-rxdwp\" (UID: \"248a3d2f-3be4-46bf-959c-79d28736c0d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:40:59.846640 master-0 kubenswrapper[18707]: I0320 08:40:59.846579 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snmpq\" (UniqueName: \"kubernetes.io/projected/7e451189-850e-4d19-a40c-40f642d08511-kube-api-access-snmpq\") pod \"operator-controller-controller-manager-57777556ff-nk2rf\" (UID: \"7e451189-850e-4d19-a40c-40f642d08511\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:40:59.866255 master-0 kubenswrapper[18707]: I0320 08:40:59.866146 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nvl4\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-kube-api-access-9nvl4\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:40:59.887850 master-0 kubenswrapper[18707]: I0320 08:40:59.887776 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2d4r\" (UniqueName: \"kubernetes.io/projected/f53bc282-5937-49ac-ac98-2ee37ccb268d-kube-api-access-b2d4r\") pod \"cluster-baremetal-operator-6f69995874-dv6cd\" (UID: 
\"f53bc282-5937-49ac-ac98-2ee37ccb268d\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" Mar 20 08:40:59.906056 master-0 kubenswrapper[18707]: I0320 08:40:59.906009 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f9vt\" (UniqueName: \"kubernetes.io/projected/46de2acc-9f5d-4ecf-befe-a480f86466f5-kube-api-access-4f9vt\") pod \"apiserver-779f85678d-lrzfz\" (UID: \"46de2acc-9f5d-4ecf-befe-a480f86466f5\") " pod="openshift-apiserver/apiserver-779f85678d-lrzfz" Mar 20 08:40:59.932216 master-0 kubenswrapper[18707]: I0320 08:40:59.929891 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d57k\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-kube-api-access-8d57k\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:40:59.946355 master-0 kubenswrapper[18707]: I0320 08:40:59.946302 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njjkt\" (UniqueName: \"kubernetes.io/projected/813f91c2-2b37-4681-968d-4217e286e22f-kube-api-access-njjkt\") pod \"network-metrics-daemon-srdjm\" (UID: \"813f91c2-2b37-4681-968d-4217e286e22f\") " pod="openshift-multus/network-metrics-daemon-srdjm" Mar 20 08:40:59.968325 master-0 kubenswrapper[18707]: I0320 08:40:59.968228 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clz6w\" (UniqueName: \"kubernetes.io/projected/75e3e2cc-aa56-41f3-8859-1c086f419d05-kube-api-access-clz6w\") pod \"service-ca-operator-b865698dc-qzb2h\" (UID: \"75e3e2cc-aa56-41f3-8859-1c086f419d05\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" Mar 20 08:40:59.988749 master-0 kubenswrapper[18707]: I0320 08:40:59.988694 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhkh7\" 
(UniqueName: \"kubernetes.io/projected/f046860d-2d54-4746-8ba2-f8e90fa55e38-kube-api-access-xhkh7\") pod \"etcd-operator-8544cbcf9c-brhw4\" (UID: \"f046860d-2d54-4746-8ba2-f8e90fa55e38\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" Mar 20 08:41:00.009659 master-0 kubenswrapper[18707]: I0320 08:41:00.009580 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s69rd\" (UniqueName: \"kubernetes.io/projected/29b5b089-fb1d-46a1-bd67-2e0ba03c76a6-kube-api-access-s69rd\") pod \"authentication-operator-5885bfd7f4-62zrx\" (UID: \"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" Mar 20 08:41:00.027346 master-0 kubenswrapper[18707]: I0320 08:41:00.027267 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a57854ac-809a-4745-aaa1-774f0a08a560-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-vx5d7\" (UID: \"a57854ac-809a-4745-aaa1-774f0a08a560\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" Mar 20 08:41:00.049980 master-0 kubenswrapper[18707]: I0320 08:41:00.049902 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdwvw\" (UniqueName: \"kubernetes.io/projected/2f844652-225b-4713-a9ad-cf9bcc348f47-kube-api-access-jdwvw\") pod \"service-ca-79bc6b8d76-72j8t\" (UID: \"2f844652-225b-4713-a9ad-cf9bcc348f47\") " pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" Mar 20 08:41:00.067502 master-0 kubenswrapper[18707]: I0320 08:41:00.067445 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmndg\" (UniqueName: \"kubernetes.io/projected/aa16c3bf-2350-46d1-afa0-9477b3ec8877-kube-api-access-qmndg\") pod \"cluster-storage-operator-7d87854d6-vlq7h\" (UID: \"aa16c3bf-2350-46d1-afa0-9477b3ec8877\") " 
pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" Mar 20 08:41:00.088352 master-0 kubenswrapper[18707]: I0320 08:41:00.088289 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpdk5\" (UniqueName: \"kubernetes.io/projected/ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7-kube-api-access-lpdk5\") pod \"cluster-node-tuning-operator-598fbc5f8f-vxzvg\" (UID: \"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" Mar 20 08:41:00.113920 master-0 kubenswrapper[18707]: I0320 08:41:00.113870 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s2tb\" (UniqueName: \"kubernetes.io/projected/70020125-af49-47d7-8853-fb951c561dc4-kube-api-access-9s2tb\") pod \"node-resolver-qnp9w\" (UID: \"70020125-af49-47d7-8853-fb951c561dc4\") " pod="openshift-dns/node-resolver-qnp9w" Mar 20 08:41:00.127547 master-0 kubenswrapper[18707]: I0320 08:41:00.127506 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btdjt\" (UniqueName: \"kubernetes.io/projected/c2a23d24-9e09-431e-8c3b-8456ff51a8d0-kube-api-access-btdjt\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5\" (UID: \"c2a23d24-9e09-431e-8c3b-8456ff51a8d0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" Mar 20 08:41:00.149332 master-0 kubenswrapper[18707]: I0320 08:41:00.149229 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c5ch\" (UniqueName: \"kubernetes.io/projected/b98b4efc-6117-487f-9cfc-38ce66dd9570-kube-api-access-6c5ch\") pod \"multus-additional-cni-plugins-rpbcn\" (UID: \"b98b4efc-6117-487f-9cfc-38ce66dd9570\") " pod="openshift-multus/multus-additional-cni-plugins-rpbcn" Mar 20 08:41:00.167703 master-0 kubenswrapper[18707]: I0320 08:41:00.167636 18707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b4291bfd-53d9-4c78-b7cb-d7eb46560528-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-fmhbq\" (UID: \"b4291bfd-53d9-4c78-b7cb-d7eb46560528\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" Mar 20 08:41:00.192466 master-0 kubenswrapper[18707]: I0320 08:41:00.192392 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27j9q\" (UniqueName: \"kubernetes.io/projected/4ddac301-a604-4f07-8849-5928befd336e-kube-api-access-27j9q\") pod \"machine-config-server-gj4pm\" (UID: \"4ddac301-a604-4f07-8849-5928befd336e\") " pod="openshift-machine-config-operator/machine-config-server-gj4pm" Mar 20 08:41:00.207144 master-0 kubenswrapper[18707]: I0320 08:41:00.207085 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wl7f\" (UniqueName: \"kubernetes.io/projected/e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97-kube-api-access-6wl7f\") pod \"network-check-source-b4bf74f6-fhvg6\" (UID: \"e98e5c8d-e8a6-46bf-8b86-8ac96b03dc97\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-fhvg6" Mar 20 08:41:00.227302 master-0 kubenswrapper[18707]: I0320 08:41:00.227156 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2zzd\" (UniqueName: \"kubernetes.io/projected/c0142d4e-9fd4-4375-a773-bb89b38af654-kube-api-access-w2zzd\") pod \"network-check-target-xnrw6\" (UID: \"c0142d4e-9fd4-4375-a773-bb89b38af654\") " pod="openshift-network-diagnostics/network-check-target-xnrw6" Mar 20 08:41:00.247854 master-0 kubenswrapper[18707]: I0320 08:41:00.247791 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7k8m\" (UniqueName: \"kubernetes.io/projected/86cb5d23-df7f-4f67-8086-1789d8e68544-kube-api-access-j7k8m\") pod \"cluster-olm-operator-67dcd4998-c5742\" (UID: 
\"86cb5d23-df7f-4f67-8086-1789d8e68544\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" Mar 20 08:41:00.267058 master-0 kubenswrapper[18707]: I0320 08:41:00.267008 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrws5\" (UniqueName: \"kubernetes.io/projected/42df77ec-94aa-48ba-bb35-7b1f1e8b8e97-kube-api-access-wrws5\") pod \"machine-config-operator-84d549f6d5-gm4qr\" (UID: \"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" Mar 20 08:41:00.291145 master-0 kubenswrapper[18707]: I0320 08:41:00.291077 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6mb5\" (UniqueName: \"kubernetes.io/projected/ad692349-5089-4afc-85b2-9b6e7997567c-kube-api-access-h6mb5\") pod \"network-operator-7bd846bfc4-mt454\" (UID: \"ad692349-5089-4afc-85b2-9b6e7997567c\") " pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" Mar 20 08:41:00.308884 master-0 kubenswrapper[18707]: I0320 08:41:00.308784 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbvtp\" (UniqueName: \"kubernetes.io/projected/f5782718-9118-4682-a287-7998cd0304b3-kube-api-access-bbvtp\") pod \"machine-config-daemon-9t8x6\" (UID: \"f5782718-9118-4682-a287-7998cd0304b3\") " pod="openshift-machine-config-operator/machine-config-daemon-9t8x6" Mar 20 08:41:00.326383 master-0 kubenswrapper[18707]: I0320 08:41:00.326279 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zpkn\" (UniqueName: \"kubernetes.io/projected/45e8b72b-564c-4bb1-b911-baff2d6c87ad-kube-api-access-9zpkn\") pod \"cluster-monitoring-operator-58845fbb57-nljsr\" (UID: \"45e8b72b-564c-4bb1-b911-baff2d6c87ad\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-nljsr" Mar 20 08:41:00.346666 master-0 kubenswrapper[18707]: I0320 08:41:00.345887 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02-bound-sa-token\") pod \"ingress-operator-66b84d69b-gzg9m\" (UID: \"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" Mar 20 08:41:00.366855 master-0 kubenswrapper[18707]: I0320 08:41:00.366779 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htv9s\" (UniqueName: \"kubernetes.io/projected/e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e-kube-api-access-htv9s\") pod \"control-plane-machine-set-operator-6f97756bc8-7t5qv\" (UID: \"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" Mar 20 08:41:00.387317 master-0 kubenswrapper[18707]: I0320 08:41:00.387256 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg6th\" (UniqueName: \"kubernetes.io/projected/7b489385-2c96-4a97-8b31-362162de020e-kube-api-access-pg6th\") pod \"olm-operator-5c9796789-tjm9l\" (UID: \"7b489385-2c96-4a97-8b31-362162de020e\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l" Mar 20 08:41:00.407254 master-0 kubenswrapper[18707]: I0320 08:41:00.407169 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l4b7\" (UniqueName: \"kubernetes.io/projected/ee3cc021-67d8-4b7f-b443-16f18228712e-kube-api-access-7l4b7\") pod \"iptables-alerter-dd9wv\" (UID: \"ee3cc021-67d8-4b7f-b443-16f18228712e\") " pod="openshift-network-operator/iptables-alerter-dd9wv" Mar 20 08:41:00.425484 master-0 kubenswrapper[18707]: I0320 08:41:00.425438 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c2rq\" (UniqueName: \"kubernetes.io/projected/b543f82e-683d-47c1-af73-4dcede4cf4df-kube-api-access-4c2rq\") pod \"packageserver-6c85f64bb9-fmpsg\" (UID: 
\"b543f82e-683d-47c1-af73-4dcede4cf4df\") " pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg" Mar 20 08:41:00.445951 master-0 kubenswrapper[18707]: I0320 08:41:00.445893 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1375da42-ecaf-4d86-b554-25fd1c3d00bd-kube-api-access\") pod \"cluster-version-operator-7d58488df-qmm8h\" (UID: \"1375da42-ecaf-4d86-b554-25fd1c3d00bd\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" Mar 20 08:41:00.465824 master-0 kubenswrapper[18707]: I0320 08:41:00.465721 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxg4z\" (UniqueName: \"kubernetes.io/projected/5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7-kube-api-access-fxg4z\") pod \"multus-2fp4b\" (UID: \"5f7e0e63-ea0d-4e48-8ceb-ddfb676f94b7\") " pod="openshift-multus/multus-2fp4b" Mar 20 08:41:00.497508 master-0 kubenswrapper[18707]: I0320 08:41:00.497337 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqmv5\" (UniqueName: \"kubernetes.io/projected/f1468ec0-2aa4-461c-a62f-e9f067be490f-kube-api-access-bqmv5\") pod \"node-exporter-lb4t5\" (UID: \"f1468ec0-2aa4-461c-a62f-e9f067be490f\") " pod="openshift-monitoring/node-exporter-lb4t5" Mar 20 08:41:00.510595 master-0 kubenswrapper[18707]: I0320 08:41:00.510544 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsht7\" (UniqueName: \"kubernetes.io/projected/21bebade-17fa-444e-92a9-eea53d6cd673-kube-api-access-zsht7\") pod \"cluster-autoscaler-operator-866dc4744-xwxg7\" (UID: \"21bebade-17fa-444e-92a9-eea53d6cd673\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" Mar 20 08:41:00.528718 master-0 kubenswrapper[18707]: I0320 08:41:00.528661 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zp8f\" (UniqueName: 
\"kubernetes.io/projected/b639e578-628e-404d-b759-8b6e84e771d9-kube-api-access-9zp8f\") pod \"community-operators-dtqgc\" (UID: \"b639e578-628e-404d-b759-8b6e84e771d9\") " pod="openshift-marketplace/community-operators-dtqgc" Mar 20 08:41:00.548207 master-0 kubenswrapper[18707]: I0320 08:41:00.548127 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnm6c\" (UniqueName: \"kubernetes.io/projected/f91d1788-027d-432b-be33-ca952a95046a-kube-api-access-lnm6c\") pod \"prometheus-operator-6c8df6d4b-5m857\" (UID: \"f91d1788-027d-432b-be33-ca952a95046a\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-5m857" Mar 20 08:41:00.550138 master-0 kubenswrapper[18707]: I0320 08:41:00.550099 18707 scope.go:117] "RemoveContainer" containerID="32acfc021b8f8071fac0cc1a8b0129efcea8236c65c56620ec15567dda3b37db" Mar 20 08:41:00.568121 master-0 kubenswrapper[18707]: I0320 08:41:00.568036 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw4np\" (UniqueName: \"kubernetes.io/projected/d88ba8e1-ee42-423f-9839-e71cb0265c6c-kube-api-access-lw4np\") pod \"cluster-cloud-controller-manager-operator-7dff898856-7vxxr\" (UID: \"d88ba8e1-ee42-423f-9839-e71cb0265c6c\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" Mar 20 08:41:00.586590 master-0 kubenswrapper[18707]: I0320 08:41:00.585978 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whmmk\" (UniqueName: \"kubernetes.io/projected/1db4d695-5a6a-4fbe-b610-3777bfebed79-kube-api-access-whmmk\") pod \"openshift-state-metrics-5dc6c74576-j8tmv\" (UID: \"1db4d695-5a6a-4fbe-b610-3777bfebed79\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-j8tmv" Mar 20 08:41:00.607946 master-0 kubenswrapper[18707]: I0320 08:41:00.607895 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcp8t\" (UniqueName: 
\"kubernetes.io/projected/654b5b1c-2764-415c-bb13-aa06899f4076-kube-api-access-xcp8t\") pod \"redhat-marketplace-hqqrk\" (UID: \"654b5b1c-2764-415c-bb13-aa06899f4076\") " pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:41:00.634881 master-0 kubenswrapper[18707]: I0320 08:41:00.634835 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82x7p\" (UniqueName: \"kubernetes.io/projected/f2217de0-7805-4f5f-8ea5-93b81b7e0236-kube-api-access-82x7p\") pod \"insights-operator-68bf6ff9d6-mvfn5\" (UID: \"f2217de0-7805-4f5f-8ea5-93b81b7e0236\") " pod="openshift-insights/insights-operator-68bf6ff9d6-mvfn5" Mar 20 08:41:00.649686 master-0 kubenswrapper[18707]: I0320 08:41:00.649611 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf485\" (UniqueName: \"kubernetes.io/projected/c0a17669-a122-44aa-bdda-581bf1fc4649-kube-api-access-xf485\") pod \"certified-operators-cc955\" (UID: \"c0a17669-a122-44aa-bdda-581bf1fc4649\") " pod="openshift-marketplace/certified-operators-cc955" Mar 20 08:41:00.668291 master-0 kubenswrapper[18707]: I0320 08:41:00.668233 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxmkh\" (UniqueName: \"kubernetes.io/projected/c593e31d-82b5-4d42-992e-6b050ccf3019-kube-api-access-gxmkh\") pod \"redhat-operators-jstrn\" (UID: \"c593e31d-82b5-4d42-992e-6b050ccf3019\") " pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:41:00.672894 master-0 kubenswrapper[18707]: I0320 08:41:00.672852 18707 request.go:700] Waited for 3.802838467s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/serviceaccounts/metrics-server/token Mar 20 08:41:00.687762 master-0 kubenswrapper[18707]: I0320 08:41:00.687510 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6p8v\" (UniqueName: 
\"kubernetes.io/projected/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-kube-api-access-c6p8v\") pod \"metrics-server-64c67d44c4-s7vfs\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:41:00.706428 master-0 kubenswrapper[18707]: I0320 08:41:00.706349 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxn2f\" (UniqueName: \"kubernetes.io/projected/ae39c09b-7aef-4615-8ced-0dcad39f23a5-kube-api-access-rxn2f\") pod \"machine-approver-5c6485487f-qb94j\" (UID: \"ae39c09b-7aef-4615-8ced-0dcad39f23a5\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" Mar 20 08:41:00.749788 master-0 kubenswrapper[18707]: I0320 08:41:00.749638 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkccn\" (UniqueName: \"kubernetes.io/projected/e3bf8eaf-5f6c-41a6-aaeb-6c921d789466-kube-api-access-gkccn\") pod \"cluster-samples-operator-85f7577d78-mwfgx\" (UID: \"e3bf8eaf-5f6c-41a6-aaeb-6c921d789466\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-mwfgx" Mar 20 08:41:00.750996 master-0 kubenswrapper[18707]: I0320 08:41:00.750929 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dtbl\" (UniqueName: \"kubernetes.io/projected/469183dd-dc54-467d-82a1-611132ae8ec4-kube-api-access-8dtbl\") pod \"cloud-credential-operator-744f9dbf77-r4qvh\" (UID: \"469183dd-dc54-467d-82a1-611132ae8ec4\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-r4qvh" Mar 20 08:41:00.768536 master-0 kubenswrapper[18707]: I0320 08:41:00.768414 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plc2q\" (UniqueName: \"kubernetes.io/projected/c7f5e6cd-e093-409a-8758-d3db7a7eb32c-kube-api-access-plc2q\") pod \"machine-api-operator-6fbb6cf6f9-n8tnn\" (UID: \"c7f5e6cd-e093-409a-8758-d3db7a7eb32c\") " 
pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" Mar 20 08:41:00.788044 master-0 kubenswrapper[18707]: I0320 08:41:00.787958 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdbds\" (UniqueName: \"kubernetes.io/projected/a638c468-010c-4da3-ad62-26f5f2bbdbb9-kube-api-access-sdbds\") pod \"route-controller-manager-56f686584b-fdcx5\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") " pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:41:00.811133 master-0 kubenswrapper[18707]: I0320 08:41:00.811026 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prcgg\" (UniqueName: \"kubernetes.io/projected/a25248c0-8de7-4624-b785-f053665fcb23-kube-api-access-prcgg\") pod \"kube-state-metrics-7bbc969446-qh6vq\" (UID: \"a25248c0-8de7-4624-b785-f053665fcb23\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-qh6vq" Mar 20 08:41:00.828712 master-0 kubenswrapper[18707]: E0320 08:41:00.828265 18707 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:41:00.828712 master-0 kubenswrapper[18707]: E0320 08:41:00.828322 18707 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:41:00.828712 master-0 kubenswrapper[18707]: E0320 08:41:00.828423 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access podName:d245e5b2-a30d-45c8-9b79-6e8096765c14 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:01.328387961 +0000 UTC m=+6.484568317 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access") pod "installer-3-master-0" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:41:00.867891 master-0 kubenswrapper[18707]: E0320 08:41:00.867822 18707 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.774s" Mar 20 08:41:00.867891 master-0 kubenswrapper[18707]: I0320 08:41:00.867896 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:41:00.868259 master-0 kubenswrapper[18707]: I0320 08:41:00.867920 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 20 08:41:00.868259 master-0 kubenswrapper[18707]: I0320 08:41:00.867934 18707 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="c89340c6-97f7-4855-950d-1c17da08b16a" Mar 20 08:41:00.868844 master-0 kubenswrapper[18707]: I0320 08:41:00.868773 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:41:00.885142 master-0 kubenswrapper[18707]: I0320 08:41:00.885034 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 20 08:41:00.896095 master-0 kubenswrapper[18707]: I0320 08:41:00.896052 18707 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Mar 20 08:41:00.896444 master-0 kubenswrapper[18707]: I0320 08:41:00.896432 18707 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936248 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936344 18707 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="c89340c6-97f7-4855-950d-1c17da08b16a" Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936454 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936480 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936497 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936549 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936559 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936571 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936605 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-v5h69" Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936625 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-v5h69" Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936649 
18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936666 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml"
Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936674 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:41:00.936989 master-0 kubenswrapper[18707]: I0320 08:41:00.936698 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:41:00.955281 master-0 kubenswrapper[18707]: I0320 08:41:00.955231 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz"
Mar 20 08:41:01.009605 master-0 kubenswrapper[18707]: I0320 08:41:01.009458 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hqqrk"
Mar 20 08:41:01.140051 master-0 kubenswrapper[18707]: I0320 08:41:01.139950 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2"
Mar 20 08:41:01.145486 master-0 kubenswrapper[18707]: I0320 08:41:01.145433 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2"
Mar 20 08:41:01.290011 master-0 kubenswrapper[18707]: I0320 08:41:01.289801 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:41:01.294942 master-0 kubenswrapper[18707]: I0320 08:41:01.294884 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:41:01.391974 master-0 kubenswrapper[18707]: I0320 08:41:01.391878 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=15.391857084 podStartE2EDuration="15.391857084s" podCreationTimestamp="2026-03-20 08:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:41:01.288664493 +0000 UTC m=+6.444844859" watchObservedRunningTime="2026-03-20 08:41:01.391857084 +0000 UTC m=+6.548037440"
Mar 20 08:41:01.397383 master-0 kubenswrapper[18707]: I0320 08:41:01.397305 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 20 08:41:01.397599 master-0 kubenswrapper[18707]: E0320 08:41:01.397547 18707 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:41:01.397599 master-0 kubenswrapper[18707]: E0320 08:41:01.397598 18707 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:41:01.397695 master-0 kubenswrapper[18707]: E0320 08:41:01.397663 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access podName:d245e5b2-a30d-45c8-9b79-6e8096765c14 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:02.397639113 +0000 UTC m=+7.553819469 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access") pod "installer-3-master-0" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:41:01.451740 master-0 kubenswrapper[18707]: I0320 08:41:01.451645 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=15.451621108 podStartE2EDuration="15.451621108s" podCreationTimestamp="2026-03-20 08:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:41:01.450795514 +0000 UTC m=+6.606975860" watchObservedRunningTime="2026-03-20 08:41:01.451621108 +0000 UTC m=+6.607801464"
Mar 20 08:41:01.579459 master-0 kubenswrapper[18707]: I0320 08:41:01.579402 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:41:01.587473 master-0 kubenswrapper[18707]: I0320 08:41:01.587406 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t"
Mar 20 08:41:01.624531 master-0 kubenswrapper[18707]: I0320 08:41:01.624443 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-gzg9m_ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02/ingress-operator/1.log"
Mar 20 08:41:01.625167 master-0 kubenswrapper[18707]: I0320 08:41:01.625103 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-gzg9m" event={"ID":"ea47ffd1-47ce-4f28-b0a8-3b932a2e3a02","Type":"ContainerStarted","Data":"873f18d468203ee5b027d25221b42ea0f7f3617b36e8d754eb1c01877f38a136"}
Mar 20 08:41:01.625557 master-0 kubenswrapper[18707]: I0320 08:41:01.625523 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:01.630141 master-0 kubenswrapper[18707]: I0320 08:41:01.630087 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.663941 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cst2b"]
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: E0320 08:41:01.664218 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b1c7a56-5d00-468a-bb8d-dbaf8e854951" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664230 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1c7a56-5d00-468a-bb8d-dbaf8e854951" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: E0320 08:41:01.664244 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d245e5b2-a30d-45c8-9b79-6e8096765c14" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664250 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d245e5b2-a30d-45c8-9b79-6e8096765c14" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: E0320 08:41:01.664261 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdfdabb8-83d6-4b38-a709-9e354062ba1a" containerName="assisted-installer-controller"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664269 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfdabb8-83d6-4b38-a709-9e354062ba1a" containerName="assisted-installer-controller"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: E0320 08:41:01.664276 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-recovery-controller"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664282 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-recovery-controller"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: E0320 08:41:01.664294 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-cert-syncer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664300 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-cert-syncer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: E0320 08:41:01.664309 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e219558-98b7-4528-88cf-97b87cd1eb6c" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664314 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e219558-98b7-4528-88cf-97b87cd1eb6c" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: E0320 08:41:01.664329 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac672fa-7660-449e-a0d1-244dc6282d76" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664336 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac672fa-7660-449e-a0d1-244dc6282d76" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: E0320 08:41:01.664348 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664355 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: E0320 08:41:01.664363 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d21f11-7386-4a04-a82e-5a03f3602a3b" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664370 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d21f11-7386-4a04-a82e-5a03f3602a3b" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: E0320 08:41:01.664392 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412becc8-c1a7-422c-94d1-dd1849070ef1" containerName="prober"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664399 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="412becc8-c1a7-422c-94d1-dd1849070ef1" containerName="prober"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: E0320 08:41:01.664411 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664416 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: E0320 08:41:01.664426 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4490a747-da2d-4f1a-8986-bc2c1c58424b" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664432 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4490a747-da2d-4f1a-8986-bc2c1c58424b" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: E0320 08:41:01.664442 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="wait-for-host-port"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664448 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="wait-for-host-port"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664592 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-recovery-controller"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664608 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdfdabb8-83d6-4b38-a709-9e354062ba1a" containerName="assisted-installer-controller"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664633 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e219558-98b7-4528-88cf-97b87cd1eb6c" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664655 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d245e5b2-a30d-45c8-9b79-6e8096765c14" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664668 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664678 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664694 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac672fa-7660-449e-a0d1-244dc6282d76" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664705 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1d21f11-7386-4a04-a82e-5a03f3602a3b" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664720 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4490a747-da2d-4f1a-8986-bc2c1c58424b" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664729 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b1c7a56-5d00-468a-bb8d-dbaf8e854951" containerName="installer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664739 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="412becc8-c1a7-422c-94d1-dd1849070ef1" containerName="prober"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664748 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="kube-scheduler-cert-syncer"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.664758 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c8b9da1cd5cef8ca0690a6bbf5a601" containerName="wait-for-host-port"
Mar 20 08:41:01.666703 master-0 kubenswrapper[18707]: I0320 08:41:01.665248 18707 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cst2b"
Mar 20 08:41:01.691214 master-0 kubenswrapper[18707]: I0320 08:41:01.691013 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cst2b"]
Mar 20 08:41:01.704951 master-0 kubenswrapper[18707]: I0320 08:41:01.704612 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 20 08:41:01.705404 master-0 kubenswrapper[18707]: I0320 08:41:01.703507 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c6gf\" (UniqueName: \"kubernetes.io/projected/05530257-7cf7-49c1-ae64-eb866cca8588-kube-api-access-5c6gf\") pod \"ingress-canary-cst2b\" (UID: \"05530257-7cf7-49c1-ae64-eb866cca8588\") " pod="openshift-ingress-canary/ingress-canary-cst2b"
Mar 20 08:41:01.706314 master-0 kubenswrapper[18707]: I0320 08:41:01.706291 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert\") pod \"ingress-canary-cst2b\" (UID: \"05530257-7cf7-49c1-ae64-eb866cca8588\") " pod="openshift-ingress-canary/ingress-canary-cst2b"
Mar 20 08:41:01.719014 master-0 kubenswrapper[18707]: I0320 08:41:01.718760 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 20 08:41:01.734725 master-0 kubenswrapper[18707]: I0320 08:41:01.734660 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 20 08:41:01.779416 master-0 kubenswrapper[18707]: I0320 08:41:01.779355 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0"
Mar 20 08:41:01.792917 master-0 kubenswrapper[18707]: I0320 08:41:01.792864 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0"
Mar 20 08:41:01.809028 master-0 kubenswrapper[18707]: I0320 08:41:01.808943 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert\") pod \"ingress-canary-cst2b\" (UID: \"05530257-7cf7-49c1-ae64-eb866cca8588\") " pod="openshift-ingress-canary/ingress-canary-cst2b"
Mar 20 08:41:01.809362 master-0 kubenswrapper[18707]: E0320 08:41:01.809116 18707 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 20 08:41:01.809362 master-0 kubenswrapper[18707]: E0320 08:41:01.809271 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert podName:05530257-7cf7-49c1-ae64-eb866cca8588 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:02.309245855 +0000 UTC m=+7.465426231 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert") pod "ingress-canary-cst2b" (UID: "05530257-7cf7-49c1-ae64-eb866cca8588") : secret "canary-serving-cert" not found
Mar 20 08:41:01.809884 master-0 kubenswrapper[18707]: I0320 08:41:01.809845 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c6gf\" (UniqueName: \"kubernetes.io/projected/05530257-7cf7-49c1-ae64-eb866cca8588-kube-api-access-5c6gf\") pod \"ingress-canary-cst2b\" (UID: \"05530257-7cf7-49c1-ae64-eb866cca8588\") " pod="openshift-ingress-canary/ingress-canary-cst2b"
Mar 20 08:41:01.812153 master-0 kubenswrapper[18707]: I0320 08:41:01.811476 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2"
Mar 20 08:41:01.817787 master-0 kubenswrapper[18707]: I0320 08:41:01.817729 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-bc9b556d6-vdnq2"
Mar 20 08:41:01.848544 master-0 kubenswrapper[18707]: I0320 08:41:01.848425 18707 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 20 08:41:01.852120 master-0 kubenswrapper[18707]: I0320 08:41:01.852074 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c6gf\" (UniqueName: \"kubernetes.io/projected/05530257-7cf7-49c1-ae64-eb866cca8588-kube-api-access-5c6gf\") pod \"ingress-canary-cst2b\" (UID: \"05530257-7cf7-49c1-ae64-eb866cca8588\") " pod="openshift-ingress-canary/ingress-canary-cst2b"
Mar 20 08:41:01.979409 master-0 kubenswrapper[18707]: I0320 08:41:01.979287 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77"
Mar 20 08:41:01.980215 master-0 kubenswrapper[18707]: I0320 08:41:01.979788 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:01.986063 master-0 kubenswrapper[18707]: I0320 08:41:01.985209 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77"
Mar 20 08:41:02.195854 master-0 kubenswrapper[18707]: I0320 08:41:02.195494 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:41:02.246007 master-0 kubenswrapper[18707]: I0320 08:41:02.245941 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:41:02.318437 master-0 kubenswrapper[18707]: I0320 08:41:02.318343 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert\") pod \"ingress-canary-cst2b\" (UID: \"05530257-7cf7-49c1-ae64-eb866cca8588\") " pod="openshift-ingress-canary/ingress-canary-cst2b"
Mar 20 08:41:02.318714 master-0 kubenswrapper[18707]: E0320 08:41:02.318550 18707 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 20 08:41:02.318714 master-0 kubenswrapper[18707]: E0320 08:41:02.318649 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert podName:05530257-7cf7-49c1-ae64-eb866cca8588 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:03.31862804 +0000 UTC m=+8.474808396 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert") pod "ingress-canary-cst2b" (UID: "05530257-7cf7-49c1-ae64-eb866cca8588") : secret "canary-serving-cert" not found
Mar 20 08:41:02.328702 master-0 kubenswrapper[18707]: I0320 08:41:02.328645 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jstrn"
Mar 20 08:41:02.420940 master-0 kubenswrapper[18707]: I0320 08:41:02.420787 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 20 08:41:02.421286 master-0 kubenswrapper[18707]: E0320 08:41:02.421052 18707 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:41:02.421286 master-0 kubenswrapper[18707]: E0320 08:41:02.421123 18707 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:41:02.421286 master-0 kubenswrapper[18707]: E0320 08:41:02.421233 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access podName:d245e5b2-a30d-45c8-9b79-6e8096765c14 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:04.421177673 +0000 UTC m=+9.577358209 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access") pod "installer-3-master-0" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:41:02.492111 master-0 kubenswrapper[18707]: I0320 08:41:02.491946 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:41:02.599019 master-0 kubenswrapper[18707]: I0320 08:41:02.598956 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:41:02.599562 master-0 kubenswrapper[18707]: I0320 08:41:02.599544 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:02.599658 master-0 kubenswrapper[18707]: I0320 08:41:02.599646 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:02.599739 master-0 kubenswrapper[18707]: I0320 08:41:02.599729 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:02.631143 master-0 kubenswrapper[18707]: I0320 08:41:02.631083 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:41:02.631460 master-0 kubenswrapper[18707]: I0320 08:41:02.631209 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:02.631460 master-0 kubenswrapper[18707]: I0320 08:41:02.631220 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:02.631460 master-0 kubenswrapper[18707]: I0320 08:41:02.631333 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:02.818011 master-0 kubenswrapper[18707]: I0320 08:41:02.817952 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hqqrk"
Mar 20 08:41:02.909654 master-0 kubenswrapper[18707]: I0320 08:41:02.909516 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:41:02.918859 master-0 kubenswrapper[18707]: I0320 08:41:02.918792 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:41:03.174699 master-0 kubenswrapper[18707]: I0320 08:41:03.173303 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:41:03.174699 master-0 kubenswrapper[18707]: I0320 08:41:03.174504 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:41:03.190444 master-0 kubenswrapper[18707]: I0320 08:41:03.188661 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs"
Mar 20 08:41:03.227900 master-0 kubenswrapper[18707]: I0320 08:41:03.227792 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp"
Mar 20 08:41:03.335757 master-0 kubenswrapper[18707]: I0320 08:41:03.335684 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert\") pod \"ingress-canary-cst2b\" (UID: \"05530257-7cf7-49c1-ae64-eb866cca8588\") " pod="openshift-ingress-canary/ingress-canary-cst2b"
Mar 20 08:41:03.336022 master-0 kubenswrapper[18707]: E0320
08:41:03.335854 18707 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 20 08:41:03.336022 master-0 kubenswrapper[18707]: E0320 08:41:03.335909 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert podName:05530257-7cf7-49c1-ae64-eb866cca8588 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:05.335893126 +0000 UTC m=+10.492073482 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert") pod "ingress-canary-cst2b" (UID: "05530257-7cf7-49c1-ae64-eb866cca8588") : secret "canary-serving-cert" not found
Mar 20 08:41:03.639003 master-0 kubenswrapper[18707]: I0320 08:41:03.638932 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:03.639569 master-0 kubenswrapper[18707]: I0320 08:41:03.639501 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:03.644012 master-0 kubenswrapper[18707]: I0320 08:41:03.643963 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-779f85678d-lrzfz"
Mar 20 08:41:03.960311 master-0 kubenswrapper[18707]: I0320 08:41:03.960130 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:41:03.960311 master-0 kubenswrapper[18707]: I0320 08:41:03.960290 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:03.964904 master-0 kubenswrapper[18707]: I0320 08:41:03.964826 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:41:04.083606 master-0 kubenswrapper[18707]: I0320 08:41:04.083542 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"
Mar 20 08:41:04.084131 master-0 kubenswrapper[18707]: I0320 08:41:04.084112 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:04.089882 master-0 kubenswrapper[18707]: I0320 08:41:04.089766 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dtqgc"
Mar 20 08:41:04.091228 master-0 kubenswrapper[18707]: I0320 08:41:04.091137 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-6c85f64bb9-fmpsg"
Mar 20 08:41:04.459442 master-0 kubenswrapper[18707]: I0320 08:41:04.459358 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 20 08:41:04.460031 master-0 kubenswrapper[18707]: E0320 08:41:04.459687 18707 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:41:04.460031 master-0 kubenswrapper[18707]: E0320 08:41:04.459747 18707 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:41:04.460031 master-0 kubenswrapper[18707]: E0320 08:41:04.459843 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access podName:d245e5b2-a30d-45c8-9b79-6e8096765c14 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:08.459814925 +0000 UTC m=+13.615995281 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access") pod "installer-3-master-0" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:41:04.644897 master-0 kubenswrapper[18707]: I0320 08:41:04.644836 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:04.739839 master-0 kubenswrapper[18707]: I0320 08:41:04.739672 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:41:04.740086 master-0 kubenswrapper[18707]: I0320 08:41:04.739937 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:04.783476 master-0 kubenswrapper[18707]: I0320 08:41:04.783419 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cc955"
Mar 20 08:41:04.830681 master-0 kubenswrapper[18707]: I0320 08:41:04.829860 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:41:04.830681 master-0 kubenswrapper[18707]: I0320 08:41:04.830060 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:04.836305 master-0 kubenswrapper[18707]: I0320 08:41:04.834952 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:41:05.142453 master-0 kubenswrapper[18707]: I0320 08:41:05.142386 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l"
Mar 20 08:41:05.142719 master-0 kubenswrapper[18707]: I0320 08:41:05.142582 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:05.148701 master-0 kubenswrapper[18707]: I0320 08:41:05.148641 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-tjm9l"
Mar 20 08:41:05.257161 master-0 kubenswrapper[18707]: I0320 08:41:05.257086 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:41:05.314366 master-0 kubenswrapper[18707]: I0320 08:41:05.314308 18707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 20 08:41:05.314674 master-0 kubenswrapper[18707]: I0320 08:41:05.314597 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" containerID="cri-o://2f574795ee9d934844b92324a83362cb7abdf8cc28431e8355456d552139443f" gracePeriod=5
Mar 20 08:41:05.377261 master-0 kubenswrapper[18707]: I0320 08:41:05.377168 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert\") pod \"ingress-canary-cst2b\" (UID: \"05530257-7cf7-49c1-ae64-eb866cca8588\") " pod="openshift-ingress-canary/ingress-canary-cst2b"
Mar 20 08:41:05.377538 master-0 kubenswrapper[18707]: E0320 08:41:05.377421 18707 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 20 08:41:05.377538 master-0 kubenswrapper[18707]: E0320 08:41:05.377499 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert podName:05530257-7cf7-49c1-ae64-eb866cca8588 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:09.377476065 +0000 UTC m=+14.533656421 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert") pod "ingress-canary-cst2b" (UID: "05530257-7cf7-49c1-ae64-eb866cca8588") : secret "canary-serving-cert" not found
Mar 20 08:41:05.531921 master-0 kubenswrapper[18707]: I0320 08:41:05.531770 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:41:05.537629 master-0 kubenswrapper[18707]: I0320 08:41:05.537592 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:41:05.622323 master-0 kubenswrapper[18707]: I0320 08:41:05.622264 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xnrw6"
Mar 20 08:41:05.622649 master-0 kubenswrapper[18707]: I0320 08:41:05.622448 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:05.627105 master-0 kubenswrapper[18707]: I0320 08:41:05.627072 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xnrw6"
Mar 20 08:41:05.656265 master-0 kubenswrapper[18707]: I0320 08:41:05.656201 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:41:06.325620 master-0 kubenswrapper[18707]: I0320 08:41:06.325558 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:41:06.325903 master-0 kubenswrapper[18707]: I0320 08:41:06.325712 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:06.332870 master-0 kubenswrapper[18707]: I0320 08:41:06.332835 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:41:06.449740 master-0 kubenswrapper[18707]: I0320 08:41:06.449686 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:41:06.450368 master-0 kubenswrapper[18707]: I0320 08:41:06.450350 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:06.452917 master-0 kubenswrapper[18707]: I0320 08:41:06.452813 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr"
Mar 20 08:41:07.029823 master-0 kubenswrapper[18707]: I0320 08:41:07.029751 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j"
Mar 20 08:41:07.030483 master-0 kubenswrapper[18707]: I0320 08:41:07.029969 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:07.034219 master-0 kubenswrapper[18707]: I0320 08:41:07.034178 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-tkc2j"
Mar 20 08:41:07.725333 master-0 kubenswrapper[18707]: I0320 08:41:07.725269 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28"
Mar 20 08:41:07.725637 master-0 kubenswrapper[18707]: I0320 08:41:07.725472 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:07.732361 master-0 kubenswrapper[18707]: I0320 08:41:07.732284 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-fzm28"
Mar 20 08:41:08.410442 master-0 kubenswrapper[18707]: I0320 08:41:08.410380 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf"
Mar 20 08:41:08.411212 master-0 kubenswrapper[18707]: I0320 08:41:08.411160 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 08:41:08.414291 master-0 kubenswrapper[18707]: I0320 08:41:08.413767 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf"
Mar 20 08:41:08.535582 master-0 kubenswrapper[18707]: I0320 08:41:08.535517 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 20 08:41:08.536932 master-0 kubenswrapper[18707]: E0320 08:41:08.536863 18707 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:41:08.536932 master-0 kubenswrapper[18707]: E0320 08:41:08.536916 18707 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:41:08.537055 master-0 kubenswrapper[18707]: E0320 08:41:08.536982 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access podName:d245e5b2-a30d-45c8-9b79-6e8096765c14 nodeName:}" failed.
No retries permitted until 2026-03-20 08:41:16.536957997 +0000 UTC m=+21.693138393 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access") pod "installer-3-master-0" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:41:08.736622 master-0 kubenswrapper[18707]: I0320 08:41:08.736473 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:41:08.736818 master-0 kubenswrapper[18707]: I0320 08:41:08.736661 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:41:08.741660 master-0 kubenswrapper[18707]: I0320 08:41:08.741615 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7dcf5569b5-xmvwz" Mar 20 08:41:09.450614 master-0 kubenswrapper[18707]: I0320 08:41:09.450537 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert\") pod \"ingress-canary-cst2b\" (UID: \"05530257-7cf7-49c1-ae64-eb866cca8588\") " pod="openshift-ingress-canary/ingress-canary-cst2b" Mar 20 08:41:09.451228 master-0 kubenswrapper[18707]: E0320 08:41:09.450825 18707 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 20 08:41:09.451228 master-0 kubenswrapper[18707]: E0320 08:41:09.450958 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert podName:05530257-7cf7-49c1-ae64-eb866cca8588 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:17.450926939 +0000 UTC m=+22.607107335 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert") pod "ingress-canary-cst2b" (UID: "05530257-7cf7-49c1-ae64-eb866cca8588") : secret "canary-serving-cert" not found Mar 20 08:41:09.540267 master-0 kubenswrapper[18707]: I0320 08:41:09.540172 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dtqgc" Mar 20 08:41:09.592104 master-0 kubenswrapper[18707]: I0320 08:41:09.591929 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dtqgc" Mar 20 08:41:09.718059 master-0 kubenswrapper[18707]: I0320 08:41:09.718003 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dtqgc" Mar 20 08:41:10.384944 master-0 kubenswrapper[18707]: I0320 08:41:10.384876 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:41:10.385224 master-0 kubenswrapper[18707]: I0320 08:41:10.385132 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:41:10.403571 master-0 kubenswrapper[18707]: I0320 08:41:10.403519 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rxdwp" Mar 20 08:41:10.688350 master-0 kubenswrapper[18707]: I0320 08:41:10.688162 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_8e7a82869988463543d3d8dd1f0b5fe3/startup-monitor/0.log" Mar 20 08:41:10.688350 master-0 kubenswrapper[18707]: I0320 08:41:10.688238 18707 generic.go:334] "Generic (PLEG): container finished" podID="8e7a82869988463543d3d8dd1f0b5fe3" containerID="2f574795ee9d934844b92324a83362cb7abdf8cc28431e8355456d552139443f" exitCode=137 Mar 20 08:41:10.741006 master-0 kubenswrapper[18707]: 
I0320 08:41:10.740935 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:41:10.810812 master-0 kubenswrapper[18707]: I0320 08:41:10.810745 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:41:10.887014 master-0 kubenswrapper[18707]: I0320 08:41:10.886959 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_8e7a82869988463543d3d8dd1f0b5fe3/startup-monitor/0.log" Mar 20 08:41:10.887257 master-0 kubenswrapper[18707]: I0320 08:41:10.887065 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:41:10.977870 master-0 kubenswrapper[18707]: I0320 08:41:10.977715 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " Mar 20 08:41:10.977870 master-0 kubenswrapper[18707]: I0320 08:41:10.977813 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " Mar 20 08:41:10.978158 master-0 kubenswrapper[18707]: I0320 08:41:10.977913 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " Mar 20 08:41:10.978158 master-0 kubenswrapper[18707]: I0320 08:41:10.977948 18707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log" (OuterVolumeSpecName: "var-log") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:41:10.978158 master-0 kubenswrapper[18707]: I0320 08:41:10.978032 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " Mar 20 08:41:10.978158 master-0 kubenswrapper[18707]: I0320 08:41:10.978070 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " Mar 20 08:41:10.978158 master-0 kubenswrapper[18707]: I0320 08:41:10.978086 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests" (OuterVolumeSpecName: "manifests") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:41:10.978329 master-0 kubenswrapper[18707]: I0320 08:41:10.978255 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:41:10.978329 master-0 kubenswrapper[18707]: I0320 08:41:10.978259 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock" (OuterVolumeSpecName: "var-lock") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:41:10.978661 master-0 kubenswrapper[18707]: I0320 08:41:10.978628 18707 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:10.978710 master-0 kubenswrapper[18707]: I0320 08:41:10.978666 18707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:10.978710 master-0 kubenswrapper[18707]: I0320 08:41:10.978690 18707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:10.978775 master-0 kubenswrapper[18707]: I0320 08:41:10.978713 18707 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:10.983504 master-0 kubenswrapper[18707]: I0320 08:41:10.983053 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:41:11.060657 master-0 kubenswrapper[18707]: I0320 08:41:11.060569 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:41:11.082260 master-0 kubenswrapper[18707]: I0320 08:41:11.081851 18707 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:11.103379 master-0 kubenswrapper[18707]: I0320 08:41:11.103314 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e7a82869988463543d3d8dd1f0b5fe3" path="/var/lib/kubelet/pods/8e7a82869988463543d3d8dd1f0b5fe3/volumes" Mar 20 08:41:11.103640 master-0 kubenswrapper[18707]: I0320 08:41:11.103626 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Mar 20 08:41:11.120887 master-0 kubenswrapper[18707]: I0320 08:41:11.120828 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hqqrk" Mar 20 08:41:11.120887 master-0 kubenswrapper[18707]: I0320 08:41:11.120879 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 20 08:41:11.121236 master-0 kubenswrapper[18707]: I0320 08:41:11.120938 18707 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="f2c4c4df-76af-433a-a1f6-08301ea231b3" Mar 20 08:41:11.122897 master-0 kubenswrapper[18707]: I0320 08:41:11.122855 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 20 08:41:11.122990 master-0 kubenswrapper[18707]: I0320 08:41:11.122968 18707 kubelet.go:2673] 
"Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="f2c4c4df-76af-433a-a1f6-08301ea231b3" Mar 20 08:41:11.698734 master-0 kubenswrapper[18707]: I0320 08:41:11.698656 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_8e7a82869988463543d3d8dd1f0b5fe3/startup-monitor/0.log" Mar 20 08:41:11.699707 master-0 kubenswrapper[18707]: I0320 08:41:11.698809 18707 scope.go:117] "RemoveContainer" containerID="2f574795ee9d934844b92324a83362cb7abdf8cc28431e8355456d552139443f" Mar 20 08:41:11.699707 master-0 kubenswrapper[18707]: I0320 08:41:11.698972 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:41:11.752726 master-0 kubenswrapper[18707]: I0320 08:41:11.752665 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jstrn" Mar 20 08:41:12.446130 master-0 kubenswrapper[18707]: I0320 08:41:12.442423 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:41:12.446130 master-0 kubenswrapper[18707]: I0320 08:41:12.442675 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:41:12.459224 master-0 kubenswrapper[18707]: I0320 08:41:12.455525 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:41:15.530296 master-0 kubenswrapper[18707]: I0320 08:41:15.530180 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-8b8gv"] Mar 20 08:41:15.531030 master-0 kubenswrapper[18707]: E0320 08:41:15.530530 18707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" Mar 20 08:41:15.531030 master-0 kubenswrapper[18707]: I0320 08:41:15.530544 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" Mar 20 08:41:15.531030 master-0 kubenswrapper[18707]: I0320 08:41:15.530690 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" Mar 20 08:41:15.531223 master-0 kubenswrapper[18707]: I0320 08:41:15.531177 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:15.535005 master-0 kubenswrapper[18707]: I0320 08:41:15.534948 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 08:41:15.535319 master-0 kubenswrapper[18707]: I0320 08:41:15.535289 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 08:41:15.535509 master-0 kubenswrapper[18707]: I0320 08:41:15.535458 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-whhgj" Mar 20 08:41:15.535739 master-0 kubenswrapper[18707]: I0320 08:41:15.535532 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 08:41:15.535810 master-0 kubenswrapper[18707]: I0320 08:41:15.535612 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 08:41:15.535861 master-0 kubenswrapper[18707]: I0320 08:41:15.535631 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 08:41:15.557960 master-0 kubenswrapper[18707]: I0320 08:41:15.557891 18707 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-8b8gv"] Mar 20 08:41:15.652908 master-0 kubenswrapper[18707]: I0320 08:41:15.652837 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-config\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:15.652908 master-0 kubenswrapper[18707]: I0320 08:41:15.652901 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkk5h\" (UniqueName: \"kubernetes.io/projected/348f3880-793f-43e4-9de1-8511626d2552-kube-api-access-gkk5h\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:15.653242 master-0 kubenswrapper[18707]: I0320 08:41:15.652939 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:15.653242 master-0 kubenswrapper[18707]: I0320 08:41:15.653012 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/348f3880-793f-43e4-9de1-8511626d2552-serving-cert\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:15.754970 master-0 kubenswrapper[18707]: I0320 08:41:15.754886 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/348f3880-793f-43e4-9de1-8511626d2552-serving-cert\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:15.755246 master-0 kubenswrapper[18707]: I0320 08:41:15.755027 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-config\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:15.755246 master-0 kubenswrapper[18707]: I0320 08:41:15.755063 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkk5h\" (UniqueName: \"kubernetes.io/projected/348f3880-793f-43e4-9de1-8511626d2552-kube-api-access-gkk5h\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:15.755246 master-0 kubenswrapper[18707]: I0320 08:41:15.755101 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:15.755343 master-0 kubenswrapper[18707]: E0320 08:41:15.755305 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca podName:348f3880-793f-43e4-9de1-8511626d2552 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:16.255279747 +0000 UTC m=+21.411460103 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca") pod "console-operator-76b6568d85-8b8gv" (UID: "348f3880-793f-43e4-9de1-8511626d2552") : configmap references non-existent config key: ca-bundle.crt Mar 20 08:41:15.756686 master-0 kubenswrapper[18707]: I0320 08:41:15.756656 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-config\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:15.759766 master-0 kubenswrapper[18707]: I0320 08:41:15.759725 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/348f3880-793f-43e4-9de1-8511626d2552-serving-cert\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:15.778284 master-0 kubenswrapper[18707]: I0320 08:41:15.778230 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkk5h\" (UniqueName: \"kubernetes.io/projected/348f3880-793f-43e4-9de1-8511626d2552-kube-api-access-gkk5h\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:16.265092 master-0 kubenswrapper[18707]: I0320 08:41:16.264994 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:16.265462 
master-0 kubenswrapper[18707]: E0320 08:41:16.265330 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca podName:348f3880-793f-43e4-9de1-8511626d2552 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:17.265294021 +0000 UTC m=+22.421474367 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca") pod "console-operator-76b6568d85-8b8gv" (UID: "348f3880-793f-43e4-9de1-8511626d2552") : configmap references non-existent config key: ca-bundle.crt Mar 20 08:41:16.570226 master-0 kubenswrapper[18707]: I0320 08:41:16.570141 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:41:16.570896 master-0 kubenswrapper[18707]: E0320 08:41:16.570411 18707 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:41:16.570896 master-0 kubenswrapper[18707]: E0320 08:41:16.570436 18707 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:41:16.570896 master-0 kubenswrapper[18707]: E0320 08:41:16.570497 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access podName:d245e5b2-a30d-45c8-9b79-6e8096765c14 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:32.570479967 +0000 UTC m=+37.726660323 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access") pod "installer-3-master-0" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:41:17.284072 master-0 kubenswrapper[18707]: I0320 08:41:17.283974 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:17.284427 master-0 kubenswrapper[18707]: E0320 08:41:17.284342 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca podName:348f3880-793f-43e4-9de1-8511626d2552 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:19.284292007 +0000 UTC m=+24.440472403 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca") pod "console-operator-76b6568d85-8b8gv" (UID: "348f3880-793f-43e4-9de1-8511626d2552") : configmap references non-existent config key: ca-bundle.crt Mar 20 08:41:17.486906 master-0 kubenswrapper[18707]: I0320 08:41:17.486816 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert\") pod \"ingress-canary-cst2b\" (UID: \"05530257-7cf7-49c1-ae64-eb866cca8588\") " pod="openshift-ingress-canary/ingress-canary-cst2b" Mar 20 08:41:17.487235 master-0 kubenswrapper[18707]: E0320 08:41:17.487027 18707 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 20 08:41:17.487235 master-0 kubenswrapper[18707]: E0320 08:41:17.487106 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert podName:05530257-7cf7-49c1-ae64-eb866cca8588 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:33.487083995 +0000 UTC m=+38.643264351 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert") pod "ingress-canary-cst2b" (UID: "05530257-7cf7-49c1-ae64-eb866cca8588") : secret "canary-serving-cert" not found Mar 20 08:41:19.314818 master-0 kubenswrapper[18707]: I0320 08:41:19.314740 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:19.315836 master-0 kubenswrapper[18707]: E0320 08:41:19.314998 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca podName:348f3880-793f-43e4-9de1-8511626d2552 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:23.314958018 +0000 UTC m=+28.471138374 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca") pod "console-operator-76b6568d85-8b8gv" (UID: "348f3880-793f-43e4-9de1-8511626d2552") : configmap references non-existent config key: ca-bundle.crt Mar 20 08:41:23.383976 master-0 kubenswrapper[18707]: I0320 08:41:23.383870 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:23.384771 master-0 kubenswrapper[18707]: E0320 08:41:23.384109 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca podName:348f3880-793f-43e4-9de1-8511626d2552 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:31.384081966 +0000 UTC m=+36.540262322 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca") pod "console-operator-76b6568d85-8b8gv" (UID: "348f3880-793f-43e4-9de1-8511626d2552") : configmap references non-existent config key: ca-bundle.crt Mar 20 08:41:28.119680 master-0 kubenswrapper[18707]: I0320 08:41:28.119596 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-b7d455597-jgpnw"] Mar 20 08:41:28.120764 master-0 kubenswrapper[18707]: I0320 08:41:28.120725 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-b7d455597-jgpnw" Mar 20 08:41:28.125522 master-0 kubenswrapper[18707]: I0320 08:41:28.125469 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-wftxn" Mar 20 08:41:28.125941 master-0 kubenswrapper[18707]: I0320 08:41:28.125595 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 20 08:41:28.135933 master-0 kubenswrapper[18707]: I0320 08:41:28.135888 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-b7d455597-jgpnw"] Mar 20 08:41:28.258885 master-0 kubenswrapper[18707]: I0320 08:41:28.258818 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/18ed2e60-f0ca-4b4e-86a0-48eba3a7cb8b-monitoring-plugin-cert\") pod \"monitoring-plugin-b7d455597-jgpnw\" (UID: \"18ed2e60-f0ca-4b4e-86a0-48eba3a7cb8b\") " pod="openshift-monitoring/monitoring-plugin-b7d455597-jgpnw" Mar 20 08:41:28.360396 master-0 kubenswrapper[18707]: I0320 08:41:28.360326 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/18ed2e60-f0ca-4b4e-86a0-48eba3a7cb8b-monitoring-plugin-cert\") pod \"monitoring-plugin-b7d455597-jgpnw\" (UID: \"18ed2e60-f0ca-4b4e-86a0-48eba3a7cb8b\") " pod="openshift-monitoring/monitoring-plugin-b7d455597-jgpnw" Mar 20 08:41:28.364793 master-0 kubenswrapper[18707]: I0320 08:41:28.364746 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/18ed2e60-f0ca-4b4e-86a0-48eba3a7cb8b-monitoring-plugin-cert\") pod \"monitoring-plugin-b7d455597-jgpnw\" (UID: \"18ed2e60-f0ca-4b4e-86a0-48eba3a7cb8b\") " pod="openshift-monitoring/monitoring-plugin-b7d455597-jgpnw" Mar 20 
08:41:28.447285 master-0 kubenswrapper[18707]: I0320 08:41:28.447123 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-b7d455597-jgpnw" Mar 20 08:41:28.706176 master-0 kubenswrapper[18707]: I0320 08:41:28.706126 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-b7d455597-jgpnw"] Mar 20 08:41:28.713753 master-0 kubenswrapper[18707]: I0320 08:41:28.713611 18707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:41:28.826436 master-0 kubenswrapper[18707]: I0320 08:41:28.826367 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-b7d455597-jgpnw" event={"ID":"18ed2e60-f0ca-4b4e-86a0-48eba3a7cb8b","Type":"ContainerStarted","Data":"33360edfec526353b38b29c7845d1f01587aadc4ccb27dd1222766a380fb2bda"} Mar 20 08:41:28.855542 master-0 kubenswrapper[18707]: I0320 08:41:28.855483 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:41:31.420745 master-0 kubenswrapper[18707]: I0320 08:41:31.420687 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:31.421433 master-0 kubenswrapper[18707]: E0320 08:41:31.421013 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca podName:348f3880-793f-43e4-9de1-8511626d2552 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:47.420977593 +0000 UTC m=+52.577157989 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca") pod "console-operator-76b6568d85-8b8gv" (UID: "348f3880-793f-43e4-9de1-8511626d2552") : configmap references non-existent config key: ca-bundle.crt Mar 20 08:41:31.851019 master-0 kubenswrapper[18707]: I0320 08:41:31.850950 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-b7d455597-jgpnw" event={"ID":"18ed2e60-f0ca-4b4e-86a0-48eba3a7cb8b","Type":"ContainerStarted","Data":"ddf4d2a45c1cec167bb6c54e3cbb1b1eedec5765ad3a363272adf643fcd11210"} Mar 20 08:41:31.851415 master-0 kubenswrapper[18707]: I0320 08:41:31.851395 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-b7d455597-jgpnw" Mar 20 08:41:31.858486 master-0 kubenswrapper[18707]: I0320 08:41:31.858464 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-b7d455597-jgpnw" Mar 20 08:41:31.872956 master-0 kubenswrapper[18707]: I0320 08:41:31.871105 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-b7d455597-jgpnw" podStartSLOduration=1.9072063670000001 podStartE2EDuration="3.871066278s" podCreationTimestamp="2026-03-20 08:41:28 +0000 UTC" firstStartedPulling="2026-03-20 08:41:28.713537663 +0000 UTC m=+33.869718019" lastFinishedPulling="2026-03-20 08:41:30.677397574 +0000 UTC m=+35.833577930" observedRunningTime="2026-03-20 08:41:31.868370439 +0000 UTC m=+37.024550825" watchObservedRunningTime="2026-03-20 08:41:31.871066278 +0000 UTC m=+37.027246634" Mar 20 08:41:32.642268 master-0 kubenswrapper[18707]: I0320 08:41:32.642119 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod 
\"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:41:32.642983 master-0 kubenswrapper[18707]: E0320 08:41:32.642506 18707 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:41:32.642983 master-0 kubenswrapper[18707]: E0320 08:41:32.642550 18707 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:41:32.642983 master-0 kubenswrapper[18707]: E0320 08:41:32.642639 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access podName:d245e5b2-a30d-45c8-9b79-6e8096765c14 nodeName:}" failed. No retries permitted until 2026-03-20 08:42:04.642610194 +0000 UTC m=+69.798790590 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access") pod "installer-3-master-0" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:41:33.556294 master-0 kubenswrapper[18707]: I0320 08:41:33.556203 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert\") pod \"ingress-canary-cst2b\" (UID: \"05530257-7cf7-49c1-ae64-eb866cca8588\") " pod="openshift-ingress-canary/ingress-canary-cst2b" Mar 20 08:41:33.560711 master-0 kubenswrapper[18707]: I0320 08:41:33.560451 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05530257-7cf7-49c1-ae64-eb866cca8588-cert\") pod \"ingress-canary-cst2b\" (UID: \"05530257-7cf7-49c1-ae64-eb866cca8588\") " pod="openshift-ingress-canary/ingress-canary-cst2b" Mar 20 08:41:33.817995 master-0 kubenswrapper[18707]: I0320 08:41:33.817759 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cst2b" Mar 20 08:41:34.328620 master-0 kubenswrapper[18707]: I0320 08:41:34.328541 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cst2b"] Mar 20 08:41:34.338939 master-0 kubenswrapper[18707]: W0320 08:41:34.338838 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05530257_7cf7_49c1_ae64_eb866cca8588.slice/crio-cfff3d36ea1a6646da8977bdf95127382bf0ac707970f2dc981a847feb145c19 WatchSource:0}: Error finding container cfff3d36ea1a6646da8977bdf95127382bf0ac707970f2dc981a847feb145c19: Status 404 returned error can't find the container with id cfff3d36ea1a6646da8977bdf95127382bf0ac707970f2dc981a847feb145c19 Mar 20 08:41:34.876211 master-0 kubenswrapper[18707]: I0320 08:41:34.876098 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cst2b" event={"ID":"05530257-7cf7-49c1-ae64-eb866cca8588","Type":"ContainerStarted","Data":"1e441c475a8a3606db1282665f8579e216fc561a02bd57b3801861caf0f11d91"} Mar 20 08:41:34.876211 master-0 kubenswrapper[18707]: I0320 08:41:34.876212 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cst2b" event={"ID":"05530257-7cf7-49c1-ae64-eb866cca8588","Type":"ContainerStarted","Data":"cfff3d36ea1a6646da8977bdf95127382bf0ac707970f2dc981a847feb145c19"} Mar 20 08:41:34.903837 master-0 kubenswrapper[18707]: I0320 08:41:34.903716 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cst2b" podStartSLOduration=33.903678768 podStartE2EDuration="33.903678768s" podCreationTimestamp="2026-03-20 08:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:41:34.89962529 +0000 UTC m=+40.055805656" 
watchObservedRunningTime="2026-03-20 08:41:34.903678768 +0000 UTC m=+40.059859154" Mar 20 08:41:43.216389 master-0 kubenswrapper[18707]: I0320 08:41:43.216330 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-86f9447d7-znl8c"] Mar 20 08:41:43.217311 master-0 kubenswrapper[18707]: I0320 08:41:43.217290 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.220721 master-0 kubenswrapper[18707]: W0320 08:41:43.220673 18707 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-serving-cert": failed to list *v1.Secret: secrets "v4-0-config-system-serving-cert" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'master-0' and this object Mar 20 08:41:43.220721 master-0 kubenswrapper[18707]: W0320 08:41:43.220713 18707 reflector.go:561] object-"openshift-authentication"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'master-0' and this object Mar 20 08:41:43.220976 master-0 kubenswrapper[18707]: E0320 08:41:43.220744 18707 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 20 08:41:43.220976 master-0 kubenswrapper[18707]: E0320 08:41:43.220743 18707 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-serving-cert\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 20 08:41:43.221073 master-0 kubenswrapper[18707]: W0320 08:41:43.220983 18707 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-router-certs": failed to list *v1.Secret: secrets "v4-0-config-system-router-certs" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'master-0' and this object Mar 20 08:41:43.221073 master-0 kubenswrapper[18707]: E0320 08:41:43.221006 18707 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-router-certs\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 20 08:41:43.221205 master-0 kubenswrapper[18707]: W0320 08:41:43.221169 18707 reflector.go:561] object-"openshift-authentication"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'master-0' and this object Mar 20 08:41:43.221274 master-0 kubenswrapper[18707]: W0320 08:41:43.221200 18707 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-session": failed to list *v1.Secret: secrets 
"v4-0-config-system-session" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'master-0' and this object Mar 20 08:41:43.221274 master-0 kubenswrapper[18707]: E0320 08:41:43.221228 18707 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-session\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-session\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 20 08:41:43.221274 master-0 kubenswrapper[18707]: W0320 08:41:43.221200 18707 reflector.go:561] object-"openshift-authentication"/"oauth-openshift-dockercfg-scgh2": failed to list *v1.Secret: secrets "oauth-openshift-dockercfg-scgh2" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'master-0' and this object Mar 20 08:41:43.221274 master-0 kubenswrapper[18707]: E0320 08:41:43.221256 18707 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-scgh2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-openshift-dockercfg-scgh2\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 20 08:41:43.221274 master-0 kubenswrapper[18707]: E0320 08:41:43.221204 18707 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User 
\"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 20 08:41:43.222800 master-0 kubenswrapper[18707]: W0320 08:41:43.222771 18707 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-service-ca": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-service-ca" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'master-0' and this object Mar 20 08:41:43.222800 master-0 kubenswrapper[18707]: E0320 08:41:43.222797 18707 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-service-ca\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 20 08:41:43.224337 master-0 kubenswrapper[18707]: I0320 08:41:43.224290 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 08:41:43.226756 master-0 kubenswrapper[18707]: I0320 08:41:43.226722 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 08:41:43.226927 master-0 kubenswrapper[18707]: I0320 08:41:43.226726 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 08:41:43.227211 master-0 kubenswrapper[18707]: I0320 08:41:43.227177 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 
08:41:43.228238 master-0 kubenswrapper[18707]: I0320 08:41:43.228210 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 08:41:43.243409 master-0 kubenswrapper[18707]: I0320 08:41:43.243318 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86f9447d7-znl8c"] Mar 20 08:41:43.243688 master-0 kubenswrapper[18707]: I0320 08:41:43.243561 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 08:41:43.301228 master-0 kubenswrapper[18707]: I0320 08:41:43.296439 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 08:41:43.323214 master-0 kubenswrapper[18707]: I0320 08:41:43.318266 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-session\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.323214 master-0 kubenswrapper[18707]: I0320 08:41:43.318443 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-login\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.323214 master-0 kubenswrapper[18707]: I0320 08:41:43.318479 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-error\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.323214 master-0 kubenswrapper[18707]: I0320 08:41:43.318512 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-audit-policies\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.323214 master-0 kubenswrapper[18707]: I0320 08:41:43.318583 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x22rd\" (UniqueName: \"kubernetes.io/projected/bdabb7fd-c047-419a-88e4-5e11db2445e7-kube-api-access-x22rd\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.323214 master-0 kubenswrapper[18707]: I0320 08:41:43.318744 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.323214 master-0 kubenswrapper[18707]: I0320 08:41:43.318789 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.323214 master-0 kubenswrapper[18707]: I0320 08:41:43.318940 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.323214 master-0 kubenswrapper[18707]: I0320 08:41:43.318966 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.323214 master-0 kubenswrapper[18707]: I0320 08:41:43.318998 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.323214 master-0 kubenswrapper[18707]: I0320 08:41:43.319051 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: 
\"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.323214 master-0 kubenswrapper[18707]: I0320 08:41:43.319083 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bdabb7fd-c047-419a-88e4-5e11db2445e7-audit-dir\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.323214 master-0 kubenswrapper[18707]: I0320 08:41:43.319145 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.420807 master-0 kubenswrapper[18707]: I0320 08:41:43.420719 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.420807 master-0 kubenswrapper[18707]: I0320 08:41:43.420802 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.421115 master-0 
kubenswrapper[18707]: I0320 08:41:43.420842 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.421115 master-0 kubenswrapper[18707]: I0320 08:41:43.420876 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.421115 master-0 kubenswrapper[18707]: I0320 08:41:43.420901 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bdabb7fd-c047-419a-88e4-5e11db2445e7-audit-dir\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.421115 master-0 kubenswrapper[18707]: I0320 08:41:43.420933 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.421115 master-0 kubenswrapper[18707]: I0320 08:41:43.420971 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-session\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.421115 master-0 kubenswrapper[18707]: I0320 08:41:43.420999 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-login\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.421115 master-0 kubenswrapper[18707]: I0320 08:41:43.421018 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-error\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.421115 master-0 kubenswrapper[18707]: I0320 08:41:43.421034 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-audit-policies\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.421115 master-0 kubenswrapper[18707]: I0320 08:41:43.421052 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x22rd\" (UniqueName: \"kubernetes.io/projected/bdabb7fd-c047-419a-88e4-5e11db2445e7-kube-api-access-x22rd\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " 
pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.421115 master-0 kubenswrapper[18707]: I0320 08:41:43.421102 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.421115 master-0 kubenswrapper[18707]: I0320 08:41:43.421120 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.422630 master-0 kubenswrapper[18707]: E0320 08:41:43.422585 18707 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 20 08:41:43.422862 master-0 kubenswrapper[18707]: I0320 08:41:43.422613 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bdabb7fd-c047-419a-88e4-5e11db2445e7-audit-dir\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.422935 master-0 kubenswrapper[18707]: E0320 08:41:43.422922 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig podName:bdabb7fd-c047-419a-88e4-5e11db2445e7 nodeName:}" failed. 
No retries permitted until 2026-03-20 08:41:43.922825589 +0000 UTC m=+49.079005945 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig") pod "oauth-openshift-86f9447d7-znl8c" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7") : configmap "v4-0-config-system-cliconfig" not found Mar 20 08:41:43.423037 master-0 kubenswrapper[18707]: I0320 08:41:43.422961 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.423417 master-0 kubenswrapper[18707]: I0320 08:41:43.423378 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-audit-policies\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.424801 master-0 kubenswrapper[18707]: I0320 08:41:43.424770 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.425085 master-0 kubenswrapper[18707]: I0320 08:41:43.425033 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-error\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.426725 master-0 kubenswrapper[18707]: I0320 08:41:43.426658 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-login\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.427314 master-0 kubenswrapper[18707]: I0320 08:41:43.427261 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.930903 master-0 kubenswrapper[18707]: I0320 08:41:43.930789 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:43.931302 master-0 kubenswrapper[18707]: E0320 08:41:43.931029 18707 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 20 08:41:43.931302 master-0 kubenswrapper[18707]: E0320 08:41:43.931217 18707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig podName:bdabb7fd-c047-419a-88e4-5e11db2445e7 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:44.931154823 +0000 UTC m=+50.087335349 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig") pod "oauth-openshift-86f9447d7-znl8c" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7") : configmap "v4-0-config-system-cliconfig" not found Mar 20 08:41:44.109478 master-0 kubenswrapper[18707]: I0320 08:41:44.109396 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 08:41:44.240149 master-0 kubenswrapper[18707]: I0320 08:41:44.239996 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 08:41:44.252131 master-0 kubenswrapper[18707]: I0320 08:41:44.252077 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:44.324724 master-0 kubenswrapper[18707]: I0320 08:41:44.324667 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 08:41:44.334534 master-0 kubenswrapper[18707]: I0320 08:41:44.334463 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-service-ca\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: 
\"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:44.423568 master-0 kubenswrapper[18707]: E0320 08:41:44.423513 18707 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-session: failed to sync secret cache: timed out waiting for the condition Mar 20 08:41:44.423844 master-0 kubenswrapper[18707]: E0320 08:41:44.423646 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-session podName:bdabb7fd-c047-419a-88e4-5e11db2445e7 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:44.923615825 +0000 UTC m=+50.079796181 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-session") pod "oauth-openshift-86f9447d7-znl8c" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:41:44.423844 master-0 kubenswrapper[18707]: E0320 08:41:44.423513 18707 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-router-certs: failed to sync secret cache: timed out waiting for the condition Mar 20 08:41:44.423844 master-0 kubenswrapper[18707]: E0320 08:41:44.423719 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-router-certs podName:bdabb7fd-c047-419a-88e4-5e11db2445e7 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:44.923704667 +0000 UTC m=+50.079885013 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-router-certs" (UniqueName: "kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-router-certs") pod "oauth-openshift-86f9447d7-znl8c" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:41:44.446564 master-0 kubenswrapper[18707]: E0320 08:41:44.446519 18707 projected.go:288] Couldn't get configMap openshift-authentication/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:41:44.446809 master-0 kubenswrapper[18707]: E0320 08:41:44.446587 18707 projected.go:194] Error preparing data for projected volume kube-api-access-x22rd for pod openshift-authentication/oauth-openshift-86f9447d7-znl8c: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:41:44.446809 master-0 kubenswrapper[18707]: E0320 08:41:44.446657 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdabb7fd-c047-419a-88e4-5e11db2445e7-kube-api-access-x22rd podName:bdabb7fd-c047-419a-88e4-5e11db2445e7 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:44.946630866 +0000 UTC m=+50.102811222 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x22rd" (UniqueName: "kubernetes.io/projected/bdabb7fd-c047-419a-88e4-5e11db2445e7-kube-api-access-x22rd") pod "oauth-openshift-86f9447d7-znl8c" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:41:44.606764 master-0 kubenswrapper[18707]: I0320 08:41:44.606675 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-scgh2" Mar 20 08:41:44.628677 master-0 kubenswrapper[18707]: I0320 08:41:44.628607 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 08:41:44.734706 master-0 kubenswrapper[18707]: I0320 08:41:44.734621 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 08:41:44.787427 master-0 kubenswrapper[18707]: I0320 08:41:44.787351 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 08:41:44.951254 master-0 kubenswrapper[18707]: I0320 08:41:44.951069 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:44.951254 master-0 kubenswrapper[18707]: I0320 08:41:44.951168 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-session\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " 
pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:44.951254 master-0 kubenswrapper[18707]: I0320 08:41:44.951225 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x22rd\" (UniqueName: \"kubernetes.io/projected/bdabb7fd-c047-419a-88e4-5e11db2445e7-kube-api-access-x22rd\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:44.951586 master-0 kubenswrapper[18707]: I0320 08:41:44.951276 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:44.951586 master-0 kubenswrapper[18707]: E0320 08:41:44.951400 18707 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 20 08:41:44.951586 master-0 kubenswrapper[18707]: E0320 08:41:44.951459 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig podName:bdabb7fd-c047-419a-88e4-5e11db2445e7 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:46.951442128 +0000 UTC m=+52.107622484 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig") pod "oauth-openshift-86f9447d7-znl8c" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7") : configmap "v4-0-config-system-cliconfig" not found Mar 20 08:41:44.954845 master-0 kubenswrapper[18707]: I0320 08:41:44.954810 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-router-certs\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:44.955757 master-0 kubenswrapper[18707]: I0320 08:41:44.955729 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-session\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:44.960101 master-0 kubenswrapper[18707]: I0320 08:41:44.960069 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x22rd\" (UniqueName: \"kubernetes.io/projected/bdabb7fd-c047-419a-88e4-5e11db2445e7-kube-api-access-x22rd\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:46.986274 master-0 kubenswrapper[18707]: I0320 08:41:46.986158 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86f9447d7-znl8c\" (UID: 
\"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:46.986862 master-0 kubenswrapper[18707]: E0320 08:41:46.986303 18707 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 20 08:41:46.986862 master-0 kubenswrapper[18707]: E0320 08:41:46.986398 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig podName:bdabb7fd-c047-419a-88e4-5e11db2445e7 nodeName:}" failed. No retries permitted until 2026-03-20 08:41:50.986372073 +0000 UTC m=+56.142552429 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig") pod "oauth-openshift-86f9447d7-znl8c" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7") : configmap "v4-0-config-system-cliconfig" not found Mar 20 08:41:47.493994 master-0 kubenswrapper[18707]: I0320 08:41:47.493912 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:41:47.494296 master-0 kubenswrapper[18707]: E0320 08:41:47.494228 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca podName:348f3880-793f-43e4-9de1-8511626d2552 nodeName:}" failed. No retries permitted until 2026-03-20 08:42:19.494176602 +0000 UTC m=+84.650356958 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca") pod "console-operator-76b6568d85-8b8gv" (UID: "348f3880-793f-43e4-9de1-8511626d2552") : configmap references non-existent config key: ca-bundle.crt Mar 20 08:41:48.239738 master-0 kubenswrapper[18707]: I0320 08:41:48.238819 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-86f9447d7-znl8c"] Mar 20 08:41:48.239738 master-0 kubenswrapper[18707]: E0320 08:41:48.239347 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[v4-0-config-system-cliconfig], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" podUID="bdabb7fd-c047-419a-88e4-5e11db2445e7" Mar 20 08:41:48.358343 master-0 kubenswrapper[18707]: I0320 08:41:48.358270 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:48.369273 master-0 kubenswrapper[18707]: I0320 08:41:48.369212 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:48.510530 master-0 kubenswrapper[18707]: I0320 08:41:48.510391 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-trusted-ca-bundle\") pod \"bdabb7fd-c047-419a-88e4-5e11db2445e7\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " Mar 20 08:41:48.510530 master-0 kubenswrapper[18707]: I0320 08:41:48.510499 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-ocp-branding-template\") pod \"bdabb7fd-c047-419a-88e4-5e11db2445e7\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " Mar 20 08:41:48.510854 master-0 kubenswrapper[18707]: I0320 08:41:48.510592 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x22rd\" (UniqueName: \"kubernetes.io/projected/bdabb7fd-c047-419a-88e4-5e11db2445e7-kube-api-access-x22rd\") pod \"bdabb7fd-c047-419a-88e4-5e11db2445e7\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " Mar 20 08:41:48.510854 master-0 kubenswrapper[18707]: I0320 08:41:48.510623 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bdabb7fd-c047-419a-88e4-5e11db2445e7-audit-dir\") pod \"bdabb7fd-c047-419a-88e4-5e11db2445e7\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " Mar 20 08:41:48.510854 master-0 kubenswrapper[18707]: I0320 08:41:48.510673 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-provider-selection\") pod 
\"bdabb7fd-c047-419a-88e4-5e11db2445e7\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " Mar 20 08:41:48.510854 master-0 kubenswrapper[18707]: I0320 08:41:48.510782 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bdabb7fd-c047-419a-88e4-5e11db2445e7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "bdabb7fd-c047-419a-88e4-5e11db2445e7" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:41:48.510985 master-0 kubenswrapper[18707]: I0320 08:41:48.510896 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-login\") pod \"bdabb7fd-c047-419a-88e4-5e11db2445e7\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " Mar 20 08:41:48.510985 master-0 kubenswrapper[18707]: I0320 08:41:48.510967 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-session\") pod \"bdabb7fd-c047-419a-88e4-5e11db2445e7\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " Mar 20 08:41:48.511048 master-0 kubenswrapper[18707]: I0320 08:41:48.511002 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-error\") pod \"bdabb7fd-c047-419a-88e4-5e11db2445e7\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " Mar 20 08:41:48.511048 master-0 kubenswrapper[18707]: I0320 08:41:48.511038 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-router-certs\") pod \"bdabb7fd-c047-419a-88e4-5e11db2445e7\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " Mar 20 08:41:48.513257 master-0 kubenswrapper[18707]: I0320 08:41:48.511540 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-service-ca\") pod \"bdabb7fd-c047-419a-88e4-5e11db2445e7\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " Mar 20 08:41:48.513257 master-0 kubenswrapper[18707]: I0320 08:41:48.511624 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-serving-cert\") pod \"bdabb7fd-c047-419a-88e4-5e11db2445e7\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " Mar 20 08:41:48.513257 master-0 kubenswrapper[18707]: I0320 08:41:48.511655 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-audit-policies\") pod \"bdabb7fd-c047-419a-88e4-5e11db2445e7\" (UID: \"bdabb7fd-c047-419a-88e4-5e11db2445e7\") " Mar 20 08:41:48.513257 master-0 kubenswrapper[18707]: I0320 08:41:48.511765 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "bdabb7fd-c047-419a-88e4-5e11db2445e7" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:41:48.513257 master-0 kubenswrapper[18707]: I0320 08:41:48.511990 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:48.513257 master-0 kubenswrapper[18707]: I0320 08:41:48.512006 18707 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bdabb7fd-c047-419a-88e4-5e11db2445e7-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:48.513257 master-0 kubenswrapper[18707]: I0320 08:41:48.511997 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "bdabb7fd-c047-419a-88e4-5e11db2445e7" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:41:48.513257 master-0 kubenswrapper[18707]: I0320 08:41:48.512356 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "bdabb7fd-c047-419a-88e4-5e11db2445e7" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:41:48.515917 master-0 kubenswrapper[18707]: I0320 08:41:48.515602 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "bdabb7fd-c047-419a-88e4-5e11db2445e7" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:41:48.515917 master-0 kubenswrapper[18707]: I0320 08:41:48.515659 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "bdabb7fd-c047-419a-88e4-5e11db2445e7" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:41:48.515917 master-0 kubenswrapper[18707]: I0320 08:41:48.515842 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "bdabb7fd-c047-419a-88e4-5e11db2445e7" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:41:48.516104 master-0 kubenswrapper[18707]: I0320 08:41:48.516051 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdabb7fd-c047-419a-88e4-5e11db2445e7-kube-api-access-x22rd" (OuterVolumeSpecName: "kube-api-access-x22rd") pod "bdabb7fd-c047-419a-88e4-5e11db2445e7" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7"). 
InnerVolumeSpecName "kube-api-access-x22rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:41:48.516613 master-0 kubenswrapper[18707]: I0320 08:41:48.516541 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "bdabb7fd-c047-419a-88e4-5e11db2445e7" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:41:48.517092 master-0 kubenswrapper[18707]: I0320 08:41:48.516749 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "bdabb7fd-c047-419a-88e4-5e11db2445e7" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:41:48.517433 master-0 kubenswrapper[18707]: I0320 08:41:48.517384 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "bdabb7fd-c047-419a-88e4-5e11db2445e7" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:41:48.526327 master-0 kubenswrapper[18707]: I0320 08:41:48.526157 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "bdabb7fd-c047-419a-88e4-5e11db2445e7" (UID: "bdabb7fd-c047-419a-88e4-5e11db2445e7"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:41:48.614022 master-0 kubenswrapper[18707]: I0320 08:41:48.613854 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:48.614022 master-0 kubenswrapper[18707]: I0320 08:41:48.613900 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:48.614022 master-0 kubenswrapper[18707]: I0320 08:41:48.613910 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:48.614022 master-0 kubenswrapper[18707]: I0320 08:41:48.613921 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:48.614022 master-0 kubenswrapper[18707]: I0320 08:41:48.613933 18707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:48.614022 master-0 kubenswrapper[18707]: I0320 08:41:48.613945 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:48.614022 master-0 kubenswrapper[18707]: I0320 08:41:48.613956 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x22rd\" (UniqueName: \"kubernetes.io/projected/bdabb7fd-c047-419a-88e4-5e11db2445e7-kube-api-access-x22rd\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:48.614022 master-0 kubenswrapper[18707]: I0320 08:41:48.613967 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:48.614022 master-0 kubenswrapper[18707]: I0320 08:41:48.613976 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:48.614022 master-0 kubenswrapper[18707]: I0320 08:41:48.613987 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:49.366393 master-0 kubenswrapper[18707]: I0320 08:41:49.366179 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-86f9447d7-znl8c" Mar 20 08:41:49.426221 master-0 kubenswrapper[18707]: I0320 08:41:49.425717 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-86f9447d7-znl8c"] Mar 20 08:41:49.430842 master-0 kubenswrapper[18707]: I0320 08:41:49.430786 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5bc4f8b544-czdmk"] Mar 20 08:41:49.432304 master-0 kubenswrapper[18707]: I0320 08:41:49.432274 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.445073 master-0 kubenswrapper[18707]: I0320 08:41:49.441179 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 08:41:49.445073 master-0 kubenswrapper[18707]: I0320 08:41:49.441669 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 08:41:49.445073 master-0 kubenswrapper[18707]: I0320 08:41:49.441713 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 08:41:49.445073 master-0 kubenswrapper[18707]: I0320 08:41:49.441892 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 08:41:49.445073 master-0 kubenswrapper[18707]: I0320 08:41:49.441955 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 08:41:49.445073 master-0 kubenswrapper[18707]: I0320 08:41:49.442069 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-scgh2" Mar 20 08:41:49.445073 master-0 kubenswrapper[18707]: I0320 
08:41:49.443325 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 08:41:49.445073 master-0 kubenswrapper[18707]: I0320 08:41:49.443500 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 08:41:49.445073 master-0 kubenswrapper[18707]: I0320 08:41:49.443648 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 08:41:49.445073 master-0 kubenswrapper[18707]: I0320 08:41:49.443855 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 08:41:49.445073 master-0 kubenswrapper[18707]: I0320 08:41:49.444908 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 08:41:49.482706 master-0 kubenswrapper[18707]: I0320 08:41:49.473207 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-86f9447d7-znl8c"] Mar 20 08:41:49.482706 master-0 kubenswrapper[18707]: I0320 08:41:49.478006 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 08:41:49.482706 master-0 kubenswrapper[18707]: I0320 08:41:49.482303 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5bc4f8b544-czdmk"] Mar 20 08:41:49.488843 master-0 kubenswrapper[18707]: I0320 08:41:49.488771 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 08:41:49.503342 master-0 kubenswrapper[18707]: I0320 08:41:49.502890 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 08:41:49.529630 master-0 
kubenswrapper[18707]: I0320 08:41:49.529580 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.529871 master-0 kubenswrapper[18707]: I0320 08:41:49.529855 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.530269 master-0 kubenswrapper[18707]: I0320 08:41:49.530176 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-session\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.530394 master-0 kubenswrapper[18707]: I0320 08:41:49.530342 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-error\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.530445 master-0 kubenswrapper[18707]: I0320 08:41:49.530425 18707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-login\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.530893 master-0 kubenswrapper[18707]: I0320 08:41:49.530837 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/42f9a484-af8a-4d23-84c4-d4717c8877c5-audit-dir\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.530981 master-0 kubenswrapper[18707]: I0320 08:41:49.530927 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-847cl\" (UniqueName: \"kubernetes.io/projected/42f9a484-af8a-4d23-84c4-d4717c8877c5-kube-api-access-847cl\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.531024 master-0 kubenswrapper[18707]: I0320 08:41:49.530980 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-audit-policies\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.531024 master-0 kubenswrapper[18707]: I0320 08:41:49.531009 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.531112 master-0 kubenswrapper[18707]: I0320 08:41:49.531087 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.531165 master-0 kubenswrapper[18707]: I0320 08:41:49.531123 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.531342 master-0 kubenswrapper[18707]: I0320 08:41:49.531318 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.531409 master-0 kubenswrapper[18707]: I0320 08:41:49.531392 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.634830 master-0 kubenswrapper[18707]: I0320 08:41:49.634320 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.635098 master-0 kubenswrapper[18707]: I0320 08:41:49.635080 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.635283 master-0 kubenswrapper[18707]: I0320 08:41:49.635265 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.635389 master-0 kubenswrapper[18707]: I0320 08:41:49.635376 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: 
\"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.635521 master-0 kubenswrapper[18707]: I0320 08:41:49.635505 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.635600 master-0 kubenswrapper[18707]: I0320 08:41:49.635587 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.636512 master-0 kubenswrapper[18707]: I0320 08:41:49.636459 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-session\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.636578 master-0 kubenswrapper[18707]: I0320 08:41:49.636515 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-error\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.636578 master-0 
kubenswrapper[18707]: I0320 08:41:49.636560 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-login\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.636717 master-0 kubenswrapper[18707]: I0320 08:41:49.636675 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/42f9a484-af8a-4d23-84c4-d4717c8877c5-audit-dir\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.636763 master-0 kubenswrapper[18707]: I0320 08:41:49.636718 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-847cl\" (UniqueName: \"kubernetes.io/projected/42f9a484-af8a-4d23-84c4-d4717c8877c5-kube-api-access-847cl\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.636763 master-0 kubenswrapper[18707]: I0320 08:41:49.636750 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-audit-policies\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.636823 master-0 kubenswrapper[18707]: I0320 08:41:49.636774 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.636932 master-0 kubenswrapper[18707]: I0320 08:41:49.636911 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bdabb7fd-c047-419a-88e4-5e11db2445e7-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 20 08:41:49.637000 master-0 kubenswrapper[18707]: I0320 08:41:49.636945 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.637958 master-0 kubenswrapper[18707]: I0320 08:41:49.637751 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-audit-policies\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.637958 master-0 kubenswrapper[18707]: I0320 08:41:49.636962 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.638379 master-0 kubenswrapper[18707]: I0320 08:41:49.638358 18707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.638601 master-0 kubenswrapper[18707]: I0320 08:41:49.638585 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/42f9a484-af8a-4d23-84c4-d4717c8877c5-audit-dir\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.639767 master-0 kubenswrapper[18707]: I0320 08:41:49.639718 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.640894 master-0 kubenswrapper[18707]: I0320 08:41:49.640877 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-login\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.641754 master-0 kubenswrapper[18707]: I0320 08:41:49.641682 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: 
\"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.641891 master-0 kubenswrapper[18707]: I0320 08:41:49.641849 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-session\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.643162 master-0 kubenswrapper[18707]: I0320 08:41:49.643112 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-error\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.644667 master-0 kubenswrapper[18707]: I0320 08:41:49.644576 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.648151 master-0 kubenswrapper[18707]: I0320 08:41:49.648113 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.665061 master-0 kubenswrapper[18707]: I0320 
08:41:49.665037 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-847cl\" (UniqueName: \"kubernetes.io/projected/42f9a484-af8a-4d23-84c4-d4717c8877c5-kube-api-access-847cl\") pod \"oauth-openshift-5bc4f8b544-czdmk\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:49.790682 master-0 kubenswrapper[18707]: I0320 08:41:49.790633 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:50.234525 master-0 kubenswrapper[18707]: I0320 08:41:50.234444 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5bc4f8b544-czdmk"] Mar 20 08:41:50.236073 master-0 kubenswrapper[18707]: W0320 08:41:50.236030 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42f9a484_af8a_4d23_84c4_d4717c8877c5.slice/crio-3cb2d6761e890f4b4629664520fd8698e069c73b59cf66c997258b8d96a32a6d WatchSource:0}: Error finding container 3cb2d6761e890f4b4629664520fd8698e069c73b59cf66c997258b8d96a32a6d: Status 404 returned error can't find the container with id 3cb2d6761e890f4b4629664520fd8698e069c73b59cf66c997258b8d96a32a6d Mar 20 08:41:50.375887 master-0 kubenswrapper[18707]: I0320 08:41:50.375793 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" event={"ID":"42f9a484-af8a-4d23-84c4-d4717c8877c5","Type":"ContainerStarted","Data":"3cb2d6761e890f4b4629664520fd8698e069c73b59cf66c997258b8d96a32a6d"} Mar 20 08:41:51.105847 master-0 kubenswrapper[18707]: I0320 08:41:51.105777 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdabb7fd-c047-419a-88e4-5e11db2445e7" path="/var/lib/kubelet/pods/bdabb7fd-c047-419a-88e4-5e11db2445e7/volumes" Mar 20 08:41:53.399780 master-0 
kubenswrapper[18707]: I0320 08:41:53.399721 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" event={"ID":"42f9a484-af8a-4d23-84c4-d4717c8877c5","Type":"ContainerStarted","Data":"fc8697bfa06a4bdb420ae52edc94b7e3ade80fca4018202e822c978a1207f4cc"} Mar 20 08:41:53.400823 master-0 kubenswrapper[18707]: I0320 08:41:53.400789 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:53.409097 master-0 kubenswrapper[18707]: I0320 08:41:53.409029 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:41:53.429608 master-0 kubenswrapper[18707]: I0320 08:41:53.429487 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" podStartSLOduration=3.4339342840000002 podStartE2EDuration="5.429460149s" podCreationTimestamp="2026-03-20 08:41:48 +0000 UTC" firstStartedPulling="2026-03-20 08:41:50.238089197 +0000 UTC m=+55.394269553" lastFinishedPulling="2026-03-20 08:41:52.233615062 +0000 UTC m=+57.389795418" observedRunningTime="2026-03-20 08:41:53.4246834 +0000 UTC m=+58.580863746" watchObservedRunningTime="2026-03-20 08:41:53.429460149 +0000 UTC m=+58.585640505" Mar 20 08:41:55.054774 master-0 kubenswrapper[18707]: I0320 08:41:55.054274 18707 scope.go:117] "RemoveContainer" containerID="21b9803fda84668208544ea6b68c3d3a859b684d4b97f36df7e3a02f81f34399" Mar 20 08:42:03.601724 master-0 kubenswrapper[18707]: I0320 08:42:03.601614 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5bc4f8b544-czdmk"] Mar 20 08:42:04.737040 master-0 kubenswrapper[18707]: I0320 08:42:04.736959 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:42:04.737690 master-0 kubenswrapper[18707]: E0320 08:42:04.737267 18707 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:42:04.737690 master-0 kubenswrapper[18707]: E0320 08:42:04.737327 18707 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:42:04.737690 master-0 kubenswrapper[18707]: E0320 08:42:04.737418 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access podName:d245e5b2-a30d-45c8-9b79-6e8096765c14 nodeName:}" failed. No retries permitted until 2026-03-20 08:43:08.737388716 +0000 UTC m=+133.893569082 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access") pod "installer-3-master-0" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:42:09.018396 master-0 kubenswrapper[18707]: I0320 08:42:09.018317 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 20 08:42:09.019623 master-0 kubenswrapper[18707]: I0320 08:42:09.019587 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 20 08:42:09.023793 master-0 kubenswrapper[18707]: I0320 08:42:09.023753 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 08:42:09.023972 master-0 kubenswrapper[18707]: I0320 08:42:09.023945 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-k4ktd" Mar 20 08:42:09.034695 master-0 kubenswrapper[18707]: I0320 08:42:09.034642 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 20 08:42:09.117506 master-0 kubenswrapper[18707]: I0320 08:42:09.117424 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a5248d42-4743-4a8f-a554-9ae427b73597-var-lock\") pod \"installer-4-master-0\" (UID: \"a5248d42-4743-4a8f-a554-9ae427b73597\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 20 08:42:09.117826 master-0 kubenswrapper[18707]: I0320 08:42:09.117539 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5248d42-4743-4a8f-a554-9ae427b73597-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"a5248d42-4743-4a8f-a554-9ae427b73597\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 20 08:42:09.117826 master-0 kubenswrapper[18707]: I0320 08:42:09.117582 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5248d42-4743-4a8f-a554-9ae427b73597-kube-api-access\") pod \"installer-4-master-0\" (UID: \"a5248d42-4743-4a8f-a554-9ae427b73597\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 20 08:42:09.220169 master-0 kubenswrapper[18707]: I0320 08:42:09.220007 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a5248d42-4743-4a8f-a554-9ae427b73597-var-lock\") pod \"installer-4-master-0\" (UID: \"a5248d42-4743-4a8f-a554-9ae427b73597\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 20 08:42:09.220566 master-0 kubenswrapper[18707]: I0320 08:42:09.220180 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a5248d42-4743-4a8f-a554-9ae427b73597-var-lock\") pod \"installer-4-master-0\" (UID: \"a5248d42-4743-4a8f-a554-9ae427b73597\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 20 08:42:09.220858 master-0 kubenswrapper[18707]: I0320 08:42:09.220808 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5248d42-4743-4a8f-a554-9ae427b73597-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"a5248d42-4743-4a8f-a554-9ae427b73597\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 20 08:42:09.220948 master-0 kubenswrapper[18707]: I0320 08:42:09.220916 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5248d42-4743-4a8f-a554-9ae427b73597-kube-api-access\") pod \"installer-4-master-0\" (UID: \"a5248d42-4743-4a8f-a554-9ae427b73597\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 20 08:42:09.222295 master-0 kubenswrapper[18707]: I0320 08:42:09.221978 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5248d42-4743-4a8f-a554-9ae427b73597-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"a5248d42-4743-4a8f-a554-9ae427b73597\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 20 08:42:09.250170 master-0 kubenswrapper[18707]: I0320 08:42:09.250109 18707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5248d42-4743-4a8f-a554-9ae427b73597-kube-api-access\") pod \"installer-4-master-0\" (UID: \"a5248d42-4743-4a8f-a554-9ae427b73597\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 20 08:42:09.353469 master-0 kubenswrapper[18707]: I0320 08:42:09.353368 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 20 08:42:09.686139 master-0 kubenswrapper[18707]: I0320 08:42:09.685967 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-8ffsg"] Mar 20 08:42:09.688007 master-0 kubenswrapper[18707]: I0320 08:42:09.687884 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8ffsg" Mar 20 08:42:09.690589 master-0 kubenswrapper[18707]: I0320 08:42:09.690319 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-h4mf9" Mar 20 08:42:09.692249 master-0 kubenswrapper[18707]: I0320 08:42:09.692221 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 08:42:09.730855 master-0 kubenswrapper[18707]: I0320 08:42:09.730334 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b61d2fd4-13ca-4aab-b5cd-f4c10883335f-host\") pod \"node-ca-8ffsg\" (UID: \"b61d2fd4-13ca-4aab-b5cd-f4c10883335f\") " pod="openshift-image-registry/node-ca-8ffsg" Mar 20 08:42:09.731677 master-0 kubenswrapper[18707]: I0320 08:42:09.731633 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b61d2fd4-13ca-4aab-b5cd-f4c10883335f-serviceca\") pod \"node-ca-8ffsg\" (UID: 
\"b61d2fd4-13ca-4aab-b5cd-f4c10883335f\") " pod="openshift-image-registry/node-ca-8ffsg" Mar 20 08:42:09.732123 master-0 kubenswrapper[18707]: I0320 08:42:09.732088 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-548zq\" (UniqueName: \"kubernetes.io/projected/b61d2fd4-13ca-4aab-b5cd-f4c10883335f-kube-api-access-548zq\") pod \"node-ca-8ffsg\" (UID: \"b61d2fd4-13ca-4aab-b5cd-f4c10883335f\") " pod="openshift-image-registry/node-ca-8ffsg" Mar 20 08:42:09.767954 master-0 kubenswrapper[18707]: I0320 08:42:09.767907 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 20 08:42:09.770572 master-0 kubenswrapper[18707]: W0320 08:42:09.770513 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda5248d42_4743_4a8f_a554_9ae427b73597.slice/crio-97aae63a0ca450b0c154fc3a206bc1acd6d3c6998bdc43118cbbff5debe969ac WatchSource:0}: Error finding container 97aae63a0ca450b0c154fc3a206bc1acd6d3c6998bdc43118cbbff5debe969ac: Status 404 returned error can't find the container with id 97aae63a0ca450b0c154fc3a206bc1acd6d3c6998bdc43118cbbff5debe969ac Mar 20 08:42:09.834293 master-0 kubenswrapper[18707]: I0320 08:42:09.834176 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-548zq\" (UniqueName: \"kubernetes.io/projected/b61d2fd4-13ca-4aab-b5cd-f4c10883335f-kube-api-access-548zq\") pod \"node-ca-8ffsg\" (UID: \"b61d2fd4-13ca-4aab-b5cd-f4c10883335f\") " pod="openshift-image-registry/node-ca-8ffsg" Mar 20 08:42:09.834772 master-0 kubenswrapper[18707]: I0320 08:42:09.834723 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b61d2fd4-13ca-4aab-b5cd-f4c10883335f-host\") pod \"node-ca-8ffsg\" (UID: \"b61d2fd4-13ca-4aab-b5cd-f4c10883335f\") " pod="openshift-image-registry/node-ca-8ffsg" Mar 20 
08:42:09.835273 master-0 kubenswrapper[18707]: I0320 08:42:09.834800 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b61d2fd4-13ca-4aab-b5cd-f4c10883335f-host\") pod \"node-ca-8ffsg\" (UID: \"b61d2fd4-13ca-4aab-b5cd-f4c10883335f\") " pod="openshift-image-registry/node-ca-8ffsg" Mar 20 08:42:09.835463 master-0 kubenswrapper[18707]: I0320 08:42:09.835434 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b61d2fd4-13ca-4aab-b5cd-f4c10883335f-serviceca\") pod \"node-ca-8ffsg\" (UID: \"b61d2fd4-13ca-4aab-b5cd-f4c10883335f\") " pod="openshift-image-registry/node-ca-8ffsg" Mar 20 08:42:09.836478 master-0 kubenswrapper[18707]: I0320 08:42:09.836424 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b61d2fd4-13ca-4aab-b5cd-f4c10883335f-serviceca\") pod \"node-ca-8ffsg\" (UID: \"b61d2fd4-13ca-4aab-b5cd-f4c10883335f\") " pod="openshift-image-registry/node-ca-8ffsg" Mar 20 08:42:09.851208 master-0 kubenswrapper[18707]: I0320 08:42:09.851148 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-548zq\" (UniqueName: \"kubernetes.io/projected/b61d2fd4-13ca-4aab-b5cd-f4c10883335f-kube-api-access-548zq\") pod \"node-ca-8ffsg\" (UID: \"b61d2fd4-13ca-4aab-b5cd-f4c10883335f\") " pod="openshift-image-registry/node-ca-8ffsg" Mar 20 08:42:10.022374 master-0 kubenswrapper[18707]: I0320 08:42:10.022305 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8ffsg" Mar 20 08:42:10.051997 master-0 kubenswrapper[18707]: W0320 08:42:10.051954 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb61d2fd4_13ca_4aab_b5cd_f4c10883335f.slice/crio-b0610caf4a55ef5f9e6699ee5aa897ed4d8de6678c68585b82f22ccf61897725 WatchSource:0}: Error finding container b0610caf4a55ef5f9e6699ee5aa897ed4d8de6678c68585b82f22ccf61897725: Status 404 returned error can't find the container with id b0610caf4a55ef5f9e6699ee5aa897ed4d8de6678c68585b82f22ccf61897725 Mar 20 08:42:10.545279 master-0 kubenswrapper[18707]: I0320 08:42:10.545202 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8ffsg" event={"ID":"b61d2fd4-13ca-4aab-b5cd-f4c10883335f","Type":"ContainerStarted","Data":"b0610caf4a55ef5f9e6699ee5aa897ed4d8de6678c68585b82f22ccf61897725"} Mar 20 08:42:10.547218 master-0 kubenswrapper[18707]: I0320 08:42:10.547134 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"a5248d42-4743-4a8f-a554-9ae427b73597","Type":"ContainerStarted","Data":"3f28c35eacfe1483495094b16c05076a9090b9a8e8e65b4c9615680fc2eea162"} Mar 20 08:42:10.547306 master-0 kubenswrapper[18707]: I0320 08:42:10.547233 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"a5248d42-4743-4a8f-a554-9ae427b73597","Type":"ContainerStarted","Data":"97aae63a0ca450b0c154fc3a206bc1acd6d3c6998bdc43118cbbff5debe969ac"} Mar 20 08:42:13.572026 master-0 kubenswrapper[18707]: I0320 08:42:13.571904 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8ffsg" event={"ID":"b61d2fd4-13ca-4aab-b5cd-f4c10883335f","Type":"ContainerStarted","Data":"98f29483e5403aa93a06dc410c946c1b98e8cb9ef71cc2581f299054942f3833"} Mar 20 08:42:13.600299 master-0 
kubenswrapper[18707]: I0320 08:42:13.600125 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8ffsg" podStartSLOduration=2.340268946 podStartE2EDuration="4.600089562s" podCreationTimestamp="2026-03-20 08:42:09 +0000 UTC" firstStartedPulling="2026-03-20 08:42:10.054024349 +0000 UTC m=+75.210204705" lastFinishedPulling="2026-03-20 08:42:12.313844965 +0000 UTC m=+77.470025321" observedRunningTime="2026-03-20 08:42:13.597070373 +0000 UTC m=+78.753250789" watchObservedRunningTime="2026-03-20 08:42:13.600089562 +0000 UTC m=+78.756269948" Mar 20 08:42:13.601639 master-0 kubenswrapper[18707]: I0320 08:42:13.601541 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=4.601522863 podStartE2EDuration="4.601522863s" podCreationTimestamp="2026-03-20 08:42:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:42:10.570553841 +0000 UTC m=+75.726734197" watchObservedRunningTime="2026-03-20 08:42:13.601522863 +0000 UTC m=+78.757703259" Mar 20 08:42:19.503008 master-0 kubenswrapper[18707]: I0320 08:42:19.502913 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:42:19.503843 master-0 kubenswrapper[18707]: E0320 08:42:19.503271 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca podName:348f3880-793f-43e4-9de1-8511626d2552 nodeName:}" failed. No retries permitted until 2026-03-20 08:43:23.503226279 +0000 UTC m=+148.659406675 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca") pod "console-operator-76b6568d85-8b8gv" (UID: "348f3880-793f-43e4-9de1-8511626d2552") : configmap references non-existent config key: ca-bundle.crt Mar 20 08:42:28.637644 master-0 kubenswrapper[18707]: I0320 08:42:28.637532 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" podUID="42f9a484-af8a-4d23-84c4-d4717c8877c5" containerName="oauth-openshift" containerID="cri-o://fc8697bfa06a4bdb420ae52edc94b7e3ade80fca4018202e822c978a1207f4cc" gracePeriod=15 Mar 20 08:42:29.181532 master-0 kubenswrapper[18707]: I0320 08:42:29.181360 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:42:29.238261 master-0 kubenswrapper[18707]: I0320 08:42:29.238144 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-dcb9594d9-wlht7"] Mar 20 08:42:29.238574 master-0 kubenswrapper[18707]: E0320 08:42:29.238514 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42f9a484-af8a-4d23-84c4-d4717c8877c5" containerName="oauth-openshift" Mar 20 08:42:29.238574 master-0 kubenswrapper[18707]: I0320 08:42:29.238533 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="42f9a484-af8a-4d23-84c4-d4717c8877c5" containerName="oauth-openshift" Mar 20 08:42:29.238798 master-0 kubenswrapper[18707]: I0320 08:42:29.238749 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="42f9a484-af8a-4d23-84c4-d4717c8877c5" containerName="oauth-openshift" Mar 20 08:42:29.239368 master-0 kubenswrapper[18707]: I0320 08:42:29.239269 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.245773 master-0 kubenswrapper[18707]: I0320 08:42:29.245696 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-dcb9594d9-wlht7"] Mar 20 08:42:29.320579 master-0 kubenswrapper[18707]: I0320 08:42:29.320455 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-847cl\" (UniqueName: \"kubernetes.io/projected/42f9a484-af8a-4d23-84c4-d4717c8877c5-kube-api-access-847cl\") pod \"42f9a484-af8a-4d23-84c4-d4717c8877c5\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " Mar 20 08:42:29.320872 master-0 kubenswrapper[18707]: I0320 08:42:29.320635 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/42f9a484-af8a-4d23-84c4-d4717c8877c5-audit-dir\") pod \"42f9a484-af8a-4d23-84c4-d4717c8877c5\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " Mar 20 08:42:29.320872 master-0 kubenswrapper[18707]: I0320 08:42:29.320798 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-cliconfig\") pod \"42f9a484-af8a-4d23-84c4-d4717c8877c5\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " Mar 20 08:42:29.320980 master-0 kubenswrapper[18707]: I0320 08:42:29.320949 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-login\") pod \"42f9a484-af8a-4d23-84c4-d4717c8877c5\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " Mar 20 08:42:29.321816 master-0 kubenswrapper[18707]: I0320 08:42:29.321773 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "42f9a484-af8a-4d23-84c4-d4717c8877c5" (UID: "42f9a484-af8a-4d23-84c4-d4717c8877c5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:42:29.322135 master-0 kubenswrapper[18707]: I0320 08:42:29.322027 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42f9a484-af8a-4d23-84c4-d4717c8877c5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "42f9a484-af8a-4d23-84c4-d4717c8877c5" (UID: "42f9a484-af8a-4d23-84c4-d4717c8877c5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:42:29.322874 master-0 kubenswrapper[18707]: I0320 08:42:29.322838 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "42f9a484-af8a-4d23-84c4-d4717c8877c5" (UID: "42f9a484-af8a-4d23-84c4-d4717c8877c5"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:42:29.322925 master-0 kubenswrapper[18707]: I0320 08:42:29.322879 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-audit-policies\") pod \"42f9a484-af8a-4d23-84c4-d4717c8877c5\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " Mar 20 08:42:29.322963 master-0 kubenswrapper[18707]: I0320 08:42:29.322937 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-error\") pod \"42f9a484-af8a-4d23-84c4-d4717c8877c5\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " Mar 20 08:42:29.323497 master-0 kubenswrapper[18707]: I0320 08:42:29.323467 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-router-certs\") pod \"42f9a484-af8a-4d23-84c4-d4717c8877c5\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " Mar 20 08:42:29.323589 master-0 kubenswrapper[18707]: I0320 08:42:29.323515 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-session\") pod \"42f9a484-af8a-4d23-84c4-d4717c8877c5\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " Mar 20 08:42:29.323589 master-0 kubenswrapper[18707]: I0320 08:42:29.323562 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-service-ca\") pod \"42f9a484-af8a-4d23-84c4-d4717c8877c5\" (UID: 
\"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " Mar 20 08:42:29.323678 master-0 kubenswrapper[18707]: I0320 08:42:29.323594 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-serving-cert\") pod \"42f9a484-af8a-4d23-84c4-d4717c8877c5\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " Mar 20 08:42:29.323678 master-0 kubenswrapper[18707]: I0320 08:42:29.323622 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-trusted-ca-bundle\") pod \"42f9a484-af8a-4d23-84c4-d4717c8877c5\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " Mar 20 08:42:29.323678 master-0 kubenswrapper[18707]: I0320 08:42:29.323664 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-provider-selection\") pod \"42f9a484-af8a-4d23-84c4-d4717c8877c5\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " Mar 20 08:42:29.323825 master-0 kubenswrapper[18707]: I0320 08:42:29.323710 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-ocp-branding-template\") pod \"42f9a484-af8a-4d23-84c4-d4717c8877c5\" (UID: \"42f9a484-af8a-4d23-84c4-d4717c8877c5\") " Mar 20 08:42:29.324222 master-0 kubenswrapper[18707]: I0320 08:42:29.323944 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-service-ca\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.324222 master-0 kubenswrapper[18707]: I0320 08:42:29.323990 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-login\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.324222 master-0 kubenswrapper[18707]: I0320 08:42:29.324040 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-error\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.324385 master-0 kubenswrapper[18707]: I0320 08:42:29.324075 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.326476 master-0 kubenswrapper[18707]: I0320 08:42:29.326427 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") 
pod "42f9a484-af8a-4d23-84c4-d4717c8877c5" (UID: "42f9a484-af8a-4d23-84c4-d4717c8877c5"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:42:29.326476 master-0 kubenswrapper[18707]: I0320 08:42:29.326448 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "42f9a484-af8a-4d23-84c4-d4717c8877c5" (UID: "42f9a484-af8a-4d23-84c4-d4717c8877c5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:42:29.326602 master-0 kubenswrapper[18707]: I0320 08:42:29.326580 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42f9a484-af8a-4d23-84c4-d4717c8877c5-kube-api-access-847cl" (OuterVolumeSpecName: "kube-api-access-847cl") pod "42f9a484-af8a-4d23-84c4-d4717c8877c5" (UID: "42f9a484-af8a-4d23-84c4-d4717c8877c5"). InnerVolumeSpecName "kube-api-access-847cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:42:29.326704 master-0 kubenswrapper[18707]: I0320 08:42:29.326669 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "42f9a484-af8a-4d23-84c4-d4717c8877c5" (UID: "42f9a484-af8a-4d23-84c4-d4717c8877c5"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:42:29.328542 master-0 kubenswrapper[18707]: I0320 08:42:29.324071 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "42f9a484-af8a-4d23-84c4-d4717c8877c5" (UID: "42f9a484-af8a-4d23-84c4-d4717c8877c5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:42:29.328542 master-0 kubenswrapper[18707]: I0320 08:42:29.328465 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.328666 master-0 kubenswrapper[18707]: I0320 08:42:29.328578 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-audit-policies\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.328666 master-0 kubenswrapper[18707]: I0320 08:42:29.328652 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4sz4\" (UniqueName: \"kubernetes.io/projected/ecd2e0e2-a8c1-42bf-8637-8999030075f1-kube-api-access-c4sz4\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: 
I0320 08:42:29.328772 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-router-certs\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.328817 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecd2e0e2-a8c1-42bf-8637-8999030075f1-audit-dir\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.328897 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "42f9a484-af8a-4d23-84c4-d4717c8877c5" (UID: "42f9a484-af8a-4d23-84c4-d4717c8877c5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.328950 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.329051 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.329096 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-session\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.329168 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.330975 master-0 
kubenswrapper[18707]: I0320 08:42:29.329313 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.329338 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.329351 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.329365 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-847cl\" (UniqueName: \"kubernetes.io/projected/42f9a484-af8a-4d23-84c4-d4717c8877c5-kube-api-access-847cl\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.329383 18707 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/42f9a484-af8a-4d23-84c4-d4717c8877c5-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.329395 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.329405 18707 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.329416 18707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/42f9a484-af8a-4d23-84c4-d4717c8877c5-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:29.330975 master-0 kubenswrapper[18707]: I0320 08:42:29.329430 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:29.338795 master-0 kubenswrapper[18707]: I0320 08:42:29.338733 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "42f9a484-af8a-4d23-84c4-d4717c8877c5" (UID: "42f9a484-af8a-4d23-84c4-d4717c8877c5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:42:29.338795 master-0 kubenswrapper[18707]: I0320 08:42:29.338696 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "42f9a484-af8a-4d23-84c4-d4717c8877c5" (UID: "42f9a484-af8a-4d23-84c4-d4717c8877c5"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:42:29.338959 master-0 kubenswrapper[18707]: I0320 08:42:29.338794 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "42f9a484-af8a-4d23-84c4-d4717c8877c5" (UID: "42f9a484-af8a-4d23-84c4-d4717c8877c5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:42:29.338959 master-0 kubenswrapper[18707]: I0320 08:42:29.338911 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "42f9a484-af8a-4d23-84c4-d4717c8877c5" (UID: "42f9a484-af8a-4d23-84c4-d4717c8877c5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:42:29.431080 master-0 kubenswrapper[18707]: I0320 08:42:29.430991 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4sz4\" (UniqueName: \"kubernetes.io/projected/ecd2e0e2-a8c1-42bf-8637-8999030075f1-kube-api-access-c4sz4\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.431417 master-0 kubenswrapper[18707]: I0320 08:42:29.431118 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-router-certs\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.431417 master-0 kubenswrapper[18707]: I0320 08:42:29.431351 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecd2e0e2-a8c1-42bf-8637-8999030075f1-audit-dir\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.431506 master-0 kubenswrapper[18707]: I0320 08:42:29.431466 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.431762 master-0 kubenswrapper[18707]: I0320 08:42:29.431619 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/ecd2e0e2-a8c1-42bf-8637-8999030075f1-audit-dir\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.431809 master-0 kubenswrapper[18707]: I0320 08:42:29.431765 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.431846 master-0 kubenswrapper[18707]: I0320 08:42:29.431828 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-session\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.431897 master-0 kubenswrapper[18707]: I0320 08:42:29.431873 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.432000 master-0 kubenswrapper[18707]: I0320 08:42:29.431965 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-service-ca\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " 
pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.432042 master-0 kubenswrapper[18707]: I0320 08:42:29.432026 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-login\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.432086 master-0 kubenswrapper[18707]: I0320 08:42:29.432064 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-error\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.432157 master-0 kubenswrapper[18707]: I0320 08:42:29.432131 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.432247 master-0 kubenswrapper[18707]: I0320 08:42:29.432224 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.432310 master-0 kubenswrapper[18707]: I0320 
08:42:29.432287 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-audit-policies\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.432439 master-0 kubenswrapper[18707]: I0320 08:42:29.432404 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:29.432489 master-0 kubenswrapper[18707]: I0320 08:42:29.432436 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:29.432489 master-0 kubenswrapper[18707]: I0320 08:42:29.432460 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:29.432489 master-0 kubenswrapper[18707]: I0320 08:42:29.432477 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/42f9a484-af8a-4d23-84c4-d4717c8877c5-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:29.433211 master-0 kubenswrapper[18707]: I0320 08:42:29.433152 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: 
\"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.433368 master-0 kubenswrapper[18707]: I0320 08:42:29.433338 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-audit-policies\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.433831 master-0 kubenswrapper[18707]: I0320 08:42:29.433592 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-service-ca\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.434011 master-0 kubenswrapper[18707]: I0320 08:42:29.433975 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.435775 master-0 kubenswrapper[18707]: I0320 08:42:29.435732 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.436511 master-0 kubenswrapper[18707]: I0320 08:42:29.436475 18707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.436866 master-0 kubenswrapper[18707]: I0320 08:42:29.436811 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-session\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.437312 master-0 kubenswrapper[18707]: I0320 08:42:29.437271 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-error\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.438734 master-0 kubenswrapper[18707]: I0320 08:42:29.438671 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-login\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.439091 master-0 kubenswrapper[18707]: I0320 08:42:29.439058 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.442123 master-0 kubenswrapper[18707]: I0320 08:42:29.442081 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.459370 master-0 kubenswrapper[18707]: I0320 08:42:29.459306 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4sz4\" (UniqueName: \"kubernetes.io/projected/ecd2e0e2-a8c1-42bf-8637-8999030075f1-kube-api-access-c4sz4\") pod \"oauth-openshift-dcb9594d9-wlht7\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.571825 master-0 kubenswrapper[18707]: I0320 08:42:29.571721 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:29.707983 master-0 kubenswrapper[18707]: I0320 08:42:29.707905 18707 generic.go:334] "Generic (PLEG): container finished" podID="42f9a484-af8a-4d23-84c4-d4717c8877c5" containerID="fc8697bfa06a4bdb420ae52edc94b7e3ade80fca4018202e822c978a1207f4cc" exitCode=0 Mar 20 08:42:29.707983 master-0 kubenswrapper[18707]: I0320 08:42:29.707988 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" event={"ID":"42f9a484-af8a-4d23-84c4-d4717c8877c5","Type":"ContainerDied","Data":"fc8697bfa06a4bdb420ae52edc94b7e3ade80fca4018202e822c978a1207f4cc"} Mar 20 08:42:29.708711 master-0 kubenswrapper[18707]: I0320 08:42:29.708024 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" event={"ID":"42f9a484-af8a-4d23-84c4-d4717c8877c5","Type":"ContainerDied","Data":"3cb2d6761e890f4b4629664520fd8698e069c73b59cf66c997258b8d96a32a6d"} Mar 20 08:42:29.708711 master-0 kubenswrapper[18707]: I0320 08:42:29.708066 18707 scope.go:117] "RemoveContainer" containerID="fc8697bfa06a4bdb420ae52edc94b7e3ade80fca4018202e822c978a1207f4cc" Mar 20 08:42:29.708711 master-0 kubenswrapper[18707]: I0320 08:42:29.708248 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5bc4f8b544-czdmk" Mar 20 08:42:29.728999 master-0 kubenswrapper[18707]: I0320 08:42:29.728948 18707 scope.go:117] "RemoveContainer" containerID="fc8697bfa06a4bdb420ae52edc94b7e3ade80fca4018202e822c978a1207f4cc" Mar 20 08:42:29.730534 master-0 kubenswrapper[18707]: E0320 08:42:29.729520 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc8697bfa06a4bdb420ae52edc94b7e3ade80fca4018202e822c978a1207f4cc\": container with ID starting with fc8697bfa06a4bdb420ae52edc94b7e3ade80fca4018202e822c978a1207f4cc not found: ID does not exist" containerID="fc8697bfa06a4bdb420ae52edc94b7e3ade80fca4018202e822c978a1207f4cc" Mar 20 08:42:29.730534 master-0 kubenswrapper[18707]: I0320 08:42:29.729561 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8697bfa06a4bdb420ae52edc94b7e3ade80fca4018202e822c978a1207f4cc"} err="failed to get container status \"fc8697bfa06a4bdb420ae52edc94b7e3ade80fca4018202e822c978a1207f4cc\": rpc error: code = NotFound desc = could not find container \"fc8697bfa06a4bdb420ae52edc94b7e3ade80fca4018202e822c978a1207f4cc\": container with ID starting with fc8697bfa06a4bdb420ae52edc94b7e3ade80fca4018202e822c978a1207f4cc not found: ID does not exist" Mar 20 08:42:29.763109 master-0 kubenswrapper[18707]: I0320 08:42:29.763022 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5bc4f8b544-czdmk"] Mar 20 08:42:29.763109 master-0 kubenswrapper[18707]: I0320 08:42:29.763114 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-5bc4f8b544-czdmk"] Mar 20 08:42:30.015752 master-0 kubenswrapper[18707]: I0320 08:42:30.015693 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-dcb9594d9-wlht7"] Mar 20 08:42:30.019877 master-0 
kubenswrapper[18707]: W0320 08:42:30.019793 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd2e0e2_a8c1_42bf_8637_8999030075f1.slice/crio-ceb3ffd79d8b395b6e52a5b41ec3cd1c27194d6f64a2f6a384db2267eb73bcfa WatchSource:0}: Error finding container ceb3ffd79d8b395b6e52a5b41ec3cd1c27194d6f64a2f6a384db2267eb73bcfa: Status 404 returned error can't find the container with id ceb3ffd79d8b395b6e52a5b41ec3cd1c27194d6f64a2f6a384db2267eb73bcfa Mar 20 08:42:30.717256 master-0 kubenswrapper[18707]: I0320 08:42:30.717166 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" event={"ID":"ecd2e0e2-a8c1-42bf-8637-8999030075f1","Type":"ContainerStarted","Data":"1c142f6df3e02bb51343a9d48043cfbb00fa968fa9a477f02c864cdd8a5a9600"} Mar 20 08:42:30.717256 master-0 kubenswrapper[18707]: I0320 08:42:30.717251 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" event={"ID":"ecd2e0e2-a8c1-42bf-8637-8999030075f1","Type":"ContainerStarted","Data":"ceb3ffd79d8b395b6e52a5b41ec3cd1c27194d6f64a2f6a384db2267eb73bcfa"} Mar 20 08:42:30.718571 master-0 kubenswrapper[18707]: I0320 08:42:30.718503 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:30.723894 master-0 kubenswrapper[18707]: I0320 08:42:30.723841 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:42:30.752468 master-0 kubenswrapper[18707]: I0320 08:42:30.752287 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" podStartSLOduration=27.752258086 podStartE2EDuration="27.752258086s" podCreationTimestamp="2026-03-20 08:42:03 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:42:30.748805865 +0000 UTC m=+95.904986231" watchObservedRunningTime="2026-03-20 08:42:30.752258086 +0000 UTC m=+95.908438462" Mar 20 08:42:31.104480 master-0 kubenswrapper[18707]: I0320 08:42:31.104409 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42f9a484-af8a-4d23-84c4-d4717c8877c5" path="/var/lib/kubelet/pods/42f9a484-af8a-4d23-84c4-d4717c8877c5/volumes" Mar 20 08:42:47.892172 master-0 kubenswrapper[18707]: I0320 08:42:47.892077 18707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 20 08:42:47.894294 master-0 kubenswrapper[18707]: I0320 08:42:47.894236 18707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 20 08:42:47.894587 master-0 kubenswrapper[18707]: I0320 08:42:47.894528 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:47.894732 master-0 kubenswrapper[18707]: I0320 08:42:47.894662 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" containerID="cri-o://d0111435fee5f861beed3e54093288b80fe796da20879b74d333fc90fd130f88" gracePeriod=15 Mar 20 08:42:47.894815 master-0 kubenswrapper[18707]: I0320 08:42:47.894711 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" containerID="cri-o://6ab22b3f2d37bf5775a5ff7b8bee43ed84a6b6e971e88c5aa5c7bea50f737d96" gracePeriod=15 Mar 20 08:42:47.894858 master-0 kubenswrapper[18707]: I0320 08:42:47.894746 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://3030fbaae53d50269bd5e2ef3f7474bc1b880c795eb0e6c078b59b5741beb732" gracePeriod=15 Mar 20 08:42:47.894892 master-0 kubenswrapper[18707]: I0320 08:42:47.894855 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f3b6bb748ad127ad783c317a9ec2edeaa6139af683fa7cda66b138823bc535c8" gracePeriod=15 Mar 20 08:42:47.895460 master-0 kubenswrapper[18707]: I0320 08:42:47.895412 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz" 
containerID="cri-o://0f63c13521e5ca3a0dcdf96b62fa9e56532f32f43af31de53a30911e593c568a" gracePeriod=15 Mar 20 08:42:47.896084 master-0 kubenswrapper[18707]: I0320 08:42:47.895965 18707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 20 08:42:47.896542 master-0 kubenswrapper[18707]: E0320 08:42:47.896495 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz" Mar 20 08:42:47.896542 master-0 kubenswrapper[18707]: I0320 08:42:47.896530 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz" Mar 20 08:42:47.896655 master-0 kubenswrapper[18707]: E0320 08:42:47.896562 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer" Mar 20 08:42:47.896655 master-0 kubenswrapper[18707]: I0320 08:42:47.896576 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer" Mar 20 08:42:47.896655 master-0 kubenswrapper[18707]: E0320 08:42:47.896595 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="setup" Mar 20 08:42:47.896655 master-0 kubenswrapper[18707]: I0320 08:42:47.896608 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="setup" Mar 20 08:42:47.896655 master-0 kubenswrapper[18707]: E0320 08:42:47.896634 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" Mar 20 08:42:47.896655 master-0 kubenswrapper[18707]: I0320 08:42:47.896645 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" 
containerName="kube-apiserver-check-endpoints" Mar 20 08:42:47.896931 master-0 kubenswrapper[18707]: E0320 08:42:47.896731 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" Mar 20 08:42:47.896931 master-0 kubenswrapper[18707]: I0320 08:42:47.896748 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" Mar 20 08:42:47.896931 master-0 kubenswrapper[18707]: E0320 08:42:47.896768 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 08:42:47.896931 master-0 kubenswrapper[18707]: I0320 08:42:47.896779 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 08:42:47.897074 master-0 kubenswrapper[18707]: I0320 08:42:47.897011 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 08:42:47.897074 master-0 kubenswrapper[18707]: I0320 08:42:47.897031 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" Mar 20 08:42:47.897217 master-0 kubenswrapper[18707]: I0320 08:42:47.897082 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="setup" Mar 20 08:42:47.897217 master-0 kubenswrapper[18707]: I0320 08:42:47.897097 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer" Mar 20 08:42:47.897217 master-0 kubenswrapper[18707]: I0320 08:42:47.897113 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" 
containerName="kube-apiserver-check-endpoints" Mar 20 08:42:47.897217 master-0 kubenswrapper[18707]: I0320 08:42:47.897130 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz" Mar 20 08:42:47.964797 master-0 kubenswrapper[18707]: I0320 08:42:47.964682 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:42:47.964960 master-0 kubenswrapper[18707]: I0320 08:42:47.964842 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:42:47.964960 master-0 kubenswrapper[18707]: I0320 08:42:47.964887 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:47.964960 master-0 kubenswrapper[18707]: I0320 08:42:47.964930 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" 
Mar 20 08:42:47.965117 master-0 kubenswrapper[18707]: I0320 08:42:47.965026 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:47.965117 master-0 kubenswrapper[18707]: I0320 08:42:47.965094 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:47.965263 master-0 kubenswrapper[18707]: I0320 08:42:47.965126 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:47.965263 master-0 kubenswrapper[18707]: I0320 08:42:47.965203 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:42:48.004798 master-0 kubenswrapper[18707]: E0320 08:42:48.004709 18707 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: 
connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:48.066242 master-0 kubenswrapper[18707]: I0320 08:42:48.066006 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:48.066242 master-0 kubenswrapper[18707]: I0320 08:42:48.066094 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:48.066242 master-0 kubenswrapper[18707]: I0320 08:42:48.066118 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:48.066242 master-0 kubenswrapper[18707]: I0320 08:42:48.066153 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:42:48.066242 master-0 kubenswrapper[18707]: I0320 08:42:48.066202 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:42:48.066659 master-0 kubenswrapper[18707]: I0320 08:42:48.066302 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:48.066659 master-0 kubenswrapper[18707]: I0320 08:42:48.066484 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:42:48.066659 master-0 kubenswrapper[18707]: I0320 08:42:48.066495 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:42:48.066659 master-0 kubenswrapper[18707]: I0320 08:42:48.066542 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:42:48.066659 master-0 kubenswrapper[18707]: I0320 08:42:48.066515 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:48.066659 master-0 kubenswrapper[18707]: I0320 08:42:48.066629 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:48.066659 master-0 kubenswrapper[18707]: I0320 08:42:48.066632 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:48.066659 master-0 kubenswrapper[18707]: I0320 08:42:48.066632 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:42:48.066945 master-0 kubenswrapper[18707]: I0320 08:42:48.066697 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:48.066945 master-0 kubenswrapper[18707]: I0320 08:42:48.066730 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:48.066945 master-0 kubenswrapper[18707]: I0320 08:42:48.066695 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:48.305635 master-0 kubenswrapper[18707]: I0320 08:42:48.305565 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:48.328947 master-0 kubenswrapper[18707]: W0320 08:42:48.328861 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16fb4ea7f83036d9c6adf3454fc7e9db.slice/crio-39f5b66c04f7ac414db8d175708bf7815dc8a848d8dc3f164f0f425054c2ef17 WatchSource:0}: Error finding container 39f5b66c04f7ac414db8d175708bf7815dc8a848d8dc3f164f0f425054c2ef17: Status 404 returned error can't find the container with id 39f5b66c04f7ac414db8d175708bf7815dc8a848d8dc3f164f0f425054c2ef17 Mar 20 08:42:48.335685 master-0 kubenswrapper[18707]: E0320 08:42:48.335448 18707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e801db67425a2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:16fb4ea7f83036d9c6adf3454fc7e9db,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:42:48.334026146 +0000 UTC m=+113.490206492,LastTimestamp:2026-03-20 08:42:48.334026146 +0000 UTC m=+113.490206492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:42:48.882358 master-0 kubenswrapper[18707]: I0320 08:42:48.882204 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log" Mar 20 08:42:48.883145 master-0 kubenswrapper[18707]: I0320 08:42:48.883096 18707 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="6ab22b3f2d37bf5775a5ff7b8bee43ed84a6b6e971e88c5aa5c7bea50f737d96" exitCode=0 Mar 20 08:42:48.883145 master-0 kubenswrapper[18707]: I0320 08:42:48.883136 18707 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="0f63c13521e5ca3a0dcdf96b62fa9e56532f32f43af31de53a30911e593c568a" exitCode=0 Mar 20 08:42:48.883145 master-0 kubenswrapper[18707]: I0320 08:42:48.883148 18707 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="3030fbaae53d50269bd5e2ef3f7474bc1b880c795eb0e6c078b59b5741beb732" exitCode=0 Mar 20 08:42:48.883350 master-0 kubenswrapper[18707]: I0320 08:42:48.883160 18707 generic.go:334] "Generic (PLEG): container finished" 
podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="f3b6bb748ad127ad783c317a9ec2edeaa6139af683fa7cda66b138823bc535c8" exitCode=2 Mar 20 08:42:48.884990 master-0 kubenswrapper[18707]: I0320 08:42:48.884962 18707 generic.go:334] "Generic (PLEG): container finished" podID="a5248d42-4743-4a8f-a554-9ae427b73597" containerID="3f28c35eacfe1483495094b16c05076a9090b9a8e8e65b4c9615680fc2eea162" exitCode=0 Mar 20 08:42:48.885076 master-0 kubenswrapper[18707]: I0320 08:42:48.884999 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"a5248d42-4743-4a8f-a554-9ae427b73597","Type":"ContainerDied","Data":"3f28c35eacfe1483495094b16c05076a9090b9a8e8e65b4c9615680fc2eea162"} Mar 20 08:42:48.886649 master-0 kubenswrapper[18707]: I0320 08:42:48.886594 18707 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:48.887174 master-0 kubenswrapper[18707]: I0320 08:42:48.887138 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"16fb4ea7f83036d9c6adf3454fc7e9db","Type":"ContainerStarted","Data":"7becf1bd30be15317505d9b734fa3236c48dd5943a2c295e96cc2c154bf77f5e"} Mar 20 08:42:48.887279 master-0 kubenswrapper[18707]: I0320 08:42:48.887179 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"16fb4ea7f83036d9c6adf3454fc7e9db","Type":"ContainerStarted","Data":"39f5b66c04f7ac414db8d175708bf7815dc8a848d8dc3f164f0f425054c2ef17"} Mar 20 08:42:48.887822 master-0 kubenswrapper[18707]: I0320 08:42:48.887758 18707 status_manager.go:851] "Failed to get status for pod" 
podUID="a5248d42-4743-4a8f-a554-9ae427b73597" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:48.888452 master-0 kubenswrapper[18707]: E0320 08:42:48.888385 18707 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:42:48.888791 master-0 kubenswrapper[18707]: I0320 08:42:48.888762 18707 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:48.889734 master-0 kubenswrapper[18707]: I0320 08:42:48.889653 18707 status_manager.go:851] "Failed to get status for pod" podUID="a5248d42-4743-4a8f-a554-9ae427b73597" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:49.730211 master-0 kubenswrapper[18707]: E0320 08:42:49.730096 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:49.731615 master-0 kubenswrapper[18707]: E0320 08:42:49.731534 18707 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:49.732620 master-0 kubenswrapper[18707]: E0320 08:42:49.732573 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:49.733437 master-0 kubenswrapper[18707]: E0320 08:42:49.733396 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:49.734006 master-0 kubenswrapper[18707]: E0320 08:42:49.733970 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:49.734088 master-0 kubenswrapper[18707]: I0320 08:42:49.734011 18707 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 08:42:49.734727 master-0 kubenswrapper[18707]: E0320 08:42:49.734663 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 20 08:42:49.769736 master-0 kubenswrapper[18707]: E0320 08:42:49.769494 18707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: 
connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e801db67425a2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:16fb4ea7f83036d9c6adf3454fc7e9db,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:42:48.334026146 +0000 UTC m=+113.490206492,LastTimestamp:2026-03-20 08:42:48.334026146 +0000 UTC m=+113.490206492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:42:49.944791 master-0 kubenswrapper[18707]: E0320 08:42:49.944658 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 20 08:42:50.349382 master-0 kubenswrapper[18707]: E0320 08:42:50.346092 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 20 08:42:50.379280 master-0 kubenswrapper[18707]: I0320 08:42:50.379198 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 20 08:42:50.381103 master-0 kubenswrapper[18707]: I0320 08:42:50.381018 18707 status_manager.go:851] "Failed to get status for pod" podUID="a5248d42-4743-4a8f-a554-9ae427b73597" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:50.381224 master-0 kubenswrapper[18707]: I0320 08:42:50.381169 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log" Mar 20 08:42:50.382134 master-0 kubenswrapper[18707]: I0320 08:42:50.382115 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:42:50.385657 master-0 kubenswrapper[18707]: I0320 08:42:50.383393 18707 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:50.385657 master-0 kubenswrapper[18707]: I0320 08:42:50.384100 18707 status_manager.go:851] "Failed to get status for pod" podUID="a5248d42-4743-4a8f-a554-9ae427b73597" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:50.508442 master-0 kubenswrapper[18707]: I0320 08:42:50.508326 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/a5248d42-4743-4a8f-a554-9ae427b73597-var-lock\") pod \"a5248d42-4743-4a8f-a554-9ae427b73597\" (UID: \"a5248d42-4743-4a8f-a554-9ae427b73597\") " Mar 20 08:42:50.508809 master-0 kubenswrapper[18707]: I0320 08:42:50.508572 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"b45ea2ef1cf2bc9d1d994d6538ae0a64\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " Mar 20 08:42:50.508809 master-0 kubenswrapper[18707]: I0320 08:42:50.508575 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5248d42-4743-4a8f-a554-9ae427b73597-var-lock" (OuterVolumeSpecName: "var-lock") pod "a5248d42-4743-4a8f-a554-9ae427b73597" (UID: "a5248d42-4743-4a8f-a554-9ae427b73597"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:42:50.508809 master-0 kubenswrapper[18707]: I0320 08:42:50.508635 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"b45ea2ef1cf2bc9d1d994d6538ae0a64\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " Mar 20 08:42:50.508809 master-0 kubenswrapper[18707]: I0320 08:42:50.508715 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b45ea2ef1cf2bc9d1d994d6538ae0a64" (UID: "b45ea2ef1cf2bc9d1d994d6538ae0a64"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:42:50.508943 master-0 kubenswrapper[18707]: I0320 08:42:50.508805 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "b45ea2ef1cf2bc9d1d994d6538ae0a64" (UID: "b45ea2ef1cf2bc9d1d994d6538ae0a64"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:42:50.509016 master-0 kubenswrapper[18707]: I0320 08:42:50.508960 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5248d42-4743-4a8f-a554-9ae427b73597-kube-api-access\") pod \"a5248d42-4743-4a8f-a554-9ae427b73597\" (UID: \"a5248d42-4743-4a8f-a554-9ae427b73597\") " Mar 20 08:42:50.509073 master-0 kubenswrapper[18707]: I0320 08:42:50.509042 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"b45ea2ef1cf2bc9d1d994d6538ae0a64\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " Mar 20 08:42:50.509135 master-0 kubenswrapper[18707]: I0320 08:42:50.509103 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "b45ea2ef1cf2bc9d1d994d6538ae0a64" (UID: "b45ea2ef1cf2bc9d1d994d6538ae0a64"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:42:50.509229 master-0 kubenswrapper[18707]: I0320 08:42:50.509154 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5248d42-4743-4a8f-a554-9ae427b73597-kubelet-dir\") pod \"a5248d42-4743-4a8f-a554-9ae427b73597\" (UID: \"a5248d42-4743-4a8f-a554-9ae427b73597\") " Mar 20 08:42:50.509272 master-0 kubenswrapper[18707]: I0320 08:42:50.509226 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a5248d42-4743-4a8f-a554-9ae427b73597-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a5248d42-4743-4a8f-a554-9ae427b73597" (UID: "a5248d42-4743-4a8f-a554-9ae427b73597"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:42:50.510067 master-0 kubenswrapper[18707]: I0320 08:42:50.510021 18707 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:50.510120 master-0 kubenswrapper[18707]: I0320 08:42:50.510064 18707 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:50.510120 master-0 kubenswrapper[18707]: I0320 08:42:50.510086 18707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:50.510120 master-0 kubenswrapper[18707]: I0320 08:42:50.510110 18707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5248d42-4743-4a8f-a554-9ae427b73597-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:50.510244 
master-0 kubenswrapper[18707]: I0320 08:42:50.510129 18707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a5248d42-4743-4a8f-a554-9ae427b73597-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:50.514908 master-0 kubenswrapper[18707]: I0320 08:42:50.514791 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5248d42-4743-4a8f-a554-9ae427b73597-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a5248d42-4743-4a8f-a554-9ae427b73597" (UID: "a5248d42-4743-4a8f-a554-9ae427b73597"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:42:50.611606 master-0 kubenswrapper[18707]: I0320 08:42:50.611414 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a5248d42-4743-4a8f-a554-9ae427b73597-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 20 08:42:50.910956 master-0 kubenswrapper[18707]: I0320 08:42:50.910779 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log" Mar 20 08:42:50.911985 master-0 kubenswrapper[18707]: I0320 08:42:50.911927 18707 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="d0111435fee5f861beed3e54093288b80fe796da20879b74d333fc90fd130f88" exitCode=0 Mar 20 08:42:50.912127 master-0 kubenswrapper[18707]: I0320 08:42:50.912093 18707 scope.go:117] "RemoveContainer" containerID="6ab22b3f2d37bf5775a5ff7b8bee43ed84a6b6e971e88c5aa5c7bea50f737d96" Mar 20 08:42:50.912182 master-0 kubenswrapper[18707]: I0320 08:42:50.912131 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:42:50.914861 master-0 kubenswrapper[18707]: I0320 08:42:50.914455 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"a5248d42-4743-4a8f-a554-9ae427b73597","Type":"ContainerDied","Data":"97aae63a0ca450b0c154fc3a206bc1acd6d3c6998bdc43118cbbff5debe969ac"} Mar 20 08:42:50.914861 master-0 kubenswrapper[18707]: I0320 08:42:50.914508 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97aae63a0ca450b0c154fc3a206bc1acd6d3c6998bdc43118cbbff5debe969ac" Mar 20 08:42:50.914861 master-0 kubenswrapper[18707]: I0320 08:42:50.914590 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 20 08:42:50.936825 master-0 kubenswrapper[18707]: I0320 08:42:50.936724 18707 scope.go:117] "RemoveContainer" containerID="0f63c13521e5ca3a0dcdf96b62fa9e56532f32f43af31de53a30911e593c568a" Mar 20 08:42:50.960694 master-0 kubenswrapper[18707]: I0320 08:42:50.960591 18707 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:50.961612 master-0 kubenswrapper[18707]: I0320 08:42:50.961546 18707 status_manager.go:851] "Failed to get status for pod" podUID="a5248d42-4743-4a8f-a554-9ae427b73597" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:50.962436 master-0 kubenswrapper[18707]: I0320 08:42:50.962363 18707 status_manager.go:851] "Failed to get 
status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:50.963259 master-0 kubenswrapper[18707]: I0320 08:42:50.963164 18707 status_manager.go:851] "Failed to get status for pod" podUID="a5248d42-4743-4a8f-a554-9ae427b73597" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:42:50.975891 master-0 kubenswrapper[18707]: I0320 08:42:50.975835 18707 scope.go:117] "RemoveContainer" containerID="3030fbaae53d50269bd5e2ef3f7474bc1b880c795eb0e6c078b59b5741beb732" Mar 20 08:42:50.994772 master-0 kubenswrapper[18707]: I0320 08:42:50.994707 18707 scope.go:117] "RemoveContainer" containerID="f3b6bb748ad127ad783c317a9ec2edeaa6139af683fa7cda66b138823bc535c8" Mar 20 08:42:51.009700 master-0 kubenswrapper[18707]: I0320 08:42:51.009641 18707 scope.go:117] "RemoveContainer" containerID="d0111435fee5f861beed3e54093288b80fe796da20879b74d333fc90fd130f88" Mar 20 08:42:51.034645 master-0 kubenswrapper[18707]: I0320 08:42:51.034585 18707 scope.go:117] "RemoveContainer" containerID="d11c59462bae131a903a2325334d63a1ab5222611dbbb14b008e128c7a2a34a9" Mar 20 08:42:51.058771 master-0 kubenswrapper[18707]: I0320 08:42:51.058711 18707 scope.go:117] "RemoveContainer" containerID="6ab22b3f2d37bf5775a5ff7b8bee43ed84a6b6e971e88c5aa5c7bea50f737d96" Mar 20 08:42:51.059940 master-0 kubenswrapper[18707]: E0320 08:42:51.059543 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab22b3f2d37bf5775a5ff7b8bee43ed84a6b6e971e88c5aa5c7bea50f737d96\": container with ID starting with 
6ab22b3f2d37bf5775a5ff7b8bee43ed84a6b6e971e88c5aa5c7bea50f737d96 not found: ID does not exist" containerID="6ab22b3f2d37bf5775a5ff7b8bee43ed84a6b6e971e88c5aa5c7bea50f737d96" Mar 20 08:42:51.059940 master-0 kubenswrapper[18707]: I0320 08:42:51.059688 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab22b3f2d37bf5775a5ff7b8bee43ed84a6b6e971e88c5aa5c7bea50f737d96"} err="failed to get container status \"6ab22b3f2d37bf5775a5ff7b8bee43ed84a6b6e971e88c5aa5c7bea50f737d96\": rpc error: code = NotFound desc = could not find container \"6ab22b3f2d37bf5775a5ff7b8bee43ed84a6b6e971e88c5aa5c7bea50f737d96\": container with ID starting with 6ab22b3f2d37bf5775a5ff7b8bee43ed84a6b6e971e88c5aa5c7bea50f737d96 not found: ID does not exist" Mar 20 08:42:51.059940 master-0 kubenswrapper[18707]: I0320 08:42:51.059760 18707 scope.go:117] "RemoveContainer" containerID="0f63c13521e5ca3a0dcdf96b62fa9e56532f32f43af31de53a30911e593c568a" Mar 20 08:42:51.060998 master-0 kubenswrapper[18707]: E0320 08:42:51.060907 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f63c13521e5ca3a0dcdf96b62fa9e56532f32f43af31de53a30911e593c568a\": container with ID starting with 0f63c13521e5ca3a0dcdf96b62fa9e56532f32f43af31de53a30911e593c568a not found: ID does not exist" containerID="0f63c13521e5ca3a0dcdf96b62fa9e56532f32f43af31de53a30911e593c568a" Mar 20 08:42:51.061162 master-0 kubenswrapper[18707]: I0320 08:42:51.061008 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f63c13521e5ca3a0dcdf96b62fa9e56532f32f43af31de53a30911e593c568a"} err="failed to get container status \"0f63c13521e5ca3a0dcdf96b62fa9e56532f32f43af31de53a30911e593c568a\": rpc error: code = NotFound desc = could not find container \"0f63c13521e5ca3a0dcdf96b62fa9e56532f32f43af31de53a30911e593c568a\": container with ID starting with 
0f63c13521e5ca3a0dcdf96b62fa9e56532f32f43af31de53a30911e593c568a not found: ID does not exist" Mar 20 08:42:51.061162 master-0 kubenswrapper[18707]: I0320 08:42:51.061076 18707 scope.go:117] "RemoveContainer" containerID="3030fbaae53d50269bd5e2ef3f7474bc1b880c795eb0e6c078b59b5741beb732" Mar 20 08:42:51.061783 master-0 kubenswrapper[18707]: E0320 08:42:51.061719 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3030fbaae53d50269bd5e2ef3f7474bc1b880c795eb0e6c078b59b5741beb732\": container with ID starting with 3030fbaae53d50269bd5e2ef3f7474bc1b880c795eb0e6c078b59b5741beb732 not found: ID does not exist" containerID="3030fbaae53d50269bd5e2ef3f7474bc1b880c795eb0e6c078b59b5741beb732" Mar 20 08:42:51.061871 master-0 kubenswrapper[18707]: I0320 08:42:51.061777 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3030fbaae53d50269bd5e2ef3f7474bc1b880c795eb0e6c078b59b5741beb732"} err="failed to get container status \"3030fbaae53d50269bd5e2ef3f7474bc1b880c795eb0e6c078b59b5741beb732\": rpc error: code = NotFound desc = could not find container \"3030fbaae53d50269bd5e2ef3f7474bc1b880c795eb0e6c078b59b5741beb732\": container with ID starting with 3030fbaae53d50269bd5e2ef3f7474bc1b880c795eb0e6c078b59b5741beb732 not found: ID does not exist" Mar 20 08:42:51.061871 master-0 kubenswrapper[18707]: I0320 08:42:51.061816 18707 scope.go:117] "RemoveContainer" containerID="f3b6bb748ad127ad783c317a9ec2edeaa6139af683fa7cda66b138823bc535c8" Mar 20 08:42:51.062460 master-0 kubenswrapper[18707]: E0320 08:42:51.062388 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b6bb748ad127ad783c317a9ec2edeaa6139af683fa7cda66b138823bc535c8\": container with ID starting with f3b6bb748ad127ad783c317a9ec2edeaa6139af683fa7cda66b138823bc535c8 not found: ID does not exist" 
containerID="f3b6bb748ad127ad783c317a9ec2edeaa6139af683fa7cda66b138823bc535c8" Mar 20 08:42:51.062587 master-0 kubenswrapper[18707]: I0320 08:42:51.062465 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b6bb748ad127ad783c317a9ec2edeaa6139af683fa7cda66b138823bc535c8"} err="failed to get container status \"f3b6bb748ad127ad783c317a9ec2edeaa6139af683fa7cda66b138823bc535c8\": rpc error: code = NotFound desc = could not find container \"f3b6bb748ad127ad783c317a9ec2edeaa6139af683fa7cda66b138823bc535c8\": container with ID starting with f3b6bb748ad127ad783c317a9ec2edeaa6139af683fa7cda66b138823bc535c8 not found: ID does not exist" Mar 20 08:42:51.062587 master-0 kubenswrapper[18707]: I0320 08:42:51.062526 18707 scope.go:117] "RemoveContainer" containerID="d0111435fee5f861beed3e54093288b80fe796da20879b74d333fc90fd130f88" Mar 20 08:42:51.063467 master-0 kubenswrapper[18707]: E0320 08:42:51.063133 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0111435fee5f861beed3e54093288b80fe796da20879b74d333fc90fd130f88\": container with ID starting with d0111435fee5f861beed3e54093288b80fe796da20879b74d333fc90fd130f88 not found: ID does not exist" containerID="d0111435fee5f861beed3e54093288b80fe796da20879b74d333fc90fd130f88" Mar 20 08:42:51.063467 master-0 kubenswrapper[18707]: I0320 08:42:51.063232 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0111435fee5f861beed3e54093288b80fe796da20879b74d333fc90fd130f88"} err="failed to get container status \"d0111435fee5f861beed3e54093288b80fe796da20879b74d333fc90fd130f88\": rpc error: code = NotFound desc = could not find container \"d0111435fee5f861beed3e54093288b80fe796da20879b74d333fc90fd130f88\": container with ID starting with d0111435fee5f861beed3e54093288b80fe796da20879b74d333fc90fd130f88 not found: ID does not exist" Mar 20 08:42:51.063467 master-0 
kubenswrapper[18707]: I0320 08:42:51.063281 18707 scope.go:117] "RemoveContainer" containerID="d11c59462bae131a903a2325334d63a1ab5222611dbbb14b008e128c7a2a34a9"
Mar 20 08:42:51.063915 master-0 kubenswrapper[18707]: E0320 08:42:51.063820 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d11c59462bae131a903a2325334d63a1ab5222611dbbb14b008e128c7a2a34a9\": container with ID starting with d11c59462bae131a903a2325334d63a1ab5222611dbbb14b008e128c7a2a34a9 not found: ID does not exist" containerID="d11c59462bae131a903a2325334d63a1ab5222611dbbb14b008e128c7a2a34a9"
Mar 20 08:42:51.063915 master-0 kubenswrapper[18707]: I0320 08:42:51.063874 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d11c59462bae131a903a2325334d63a1ab5222611dbbb14b008e128c7a2a34a9"} err="failed to get container status \"d11c59462bae131a903a2325334d63a1ab5222611dbbb14b008e128c7a2a34a9\": rpc error: code = NotFound desc = could not find container \"d11c59462bae131a903a2325334d63a1ab5222611dbbb14b008e128c7a2a34a9\": container with ID starting with d11c59462bae131a903a2325334d63a1ab5222611dbbb14b008e128c7a2a34a9 not found: ID does not exist"
Mar 20 08:42:51.114095 master-0 kubenswrapper[18707]: I0320 08:42:51.114029 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" path="/var/lib/kubelet/pods/b45ea2ef1cf2bc9d1d994d6538ae0a64/volumes"
Mar 20 08:42:51.147114 master-0 kubenswrapper[18707]: E0320 08:42:51.147016 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 20 08:42:52.748358 master-0 kubenswrapper[18707]: E0320 08:42:52.748242 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s"
Mar 20 08:42:55.102661 master-0 kubenswrapper[18707]: I0320 08:42:55.102518 18707 status_manager.go:851] "Failed to get status for pod" podUID="a5248d42-4743-4a8f-a554-9ae427b73597" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:42:55.950379 master-0 kubenswrapper[18707]: E0320 08:42:55.950269 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s"
Mar 20 08:42:59.771774 master-0 kubenswrapper[18707]: E0320 08:42:59.771513 18707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e801db67425a2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:16fb4ea7f83036d9c6adf3454fc7e9db,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:42:48.334026146 +0000 UTC m=+113.490206492,LastTimestamp:2026-03-20 08:42:48.334026146 +0000 UTC m=+113.490206492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 20 08:43:02.031734 master-0 kubenswrapper[18707]: I0320 08:43:02.031641 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/0.log"
Mar 20 08:43:02.032772 master-0 kubenswrapper[18707]: I0320 08:43:02.031740 18707 generic.go:334] "Generic (PLEG): container finished" podID="2028761b8522f874dcebf13c4683d033" containerID="fe1c54a0e0873f0780c7faf59fe96cab185ee928520426321b3ddc92f25a0204" exitCode=1
Mar 20 08:43:02.032772 master-0 kubenswrapper[18707]: I0320 08:43:02.031837 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerDied","Data":"fe1c54a0e0873f0780c7faf59fe96cab185ee928520426321b3ddc92f25a0204"}
Mar 20 08:43:02.035610 master-0 kubenswrapper[18707]: I0320 08:43:02.035558 18707 scope.go:117] "RemoveContainer" containerID="fe1c54a0e0873f0780c7faf59fe96cab185ee928520426321b3ddc92f25a0204"
Mar 20 08:43:02.035966 master-0 kubenswrapper[18707]: I0320 08:43:02.035866 18707 status_manager.go:851] "Failed to get status for pod" podUID="2028761b8522f874dcebf13c4683d033" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:43:02.037416 master-0 kubenswrapper[18707]: I0320 08:43:02.037341 18707 status_manager.go:851] "Failed to get status for pod" podUID="a5248d42-4743-4a8f-a554-9ae427b73597" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:43:02.093244 master-0 kubenswrapper[18707]: I0320 08:43:02.093112 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:43:02.094928 master-0 kubenswrapper[18707]: I0320 08:43:02.094853 18707 status_manager.go:851] "Failed to get status for pod" podUID="2028761b8522f874dcebf13c4683d033" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:43:02.095812 master-0 kubenswrapper[18707]: I0320 08:43:02.095727 18707 status_manager.go:851] "Failed to get status for pod" podUID="a5248d42-4743-4a8f-a554-9ae427b73597" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:43:02.121543 master-0 kubenswrapper[18707]: I0320 08:43:02.121456 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="235dd7a4-ca4f-4bc0-91c5-789f03c4d8ce"
Mar 20 08:43:02.121543 master-0 kubenswrapper[18707]: I0320 08:43:02.121530 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="235dd7a4-ca4f-4bc0-91c5-789f03c4d8ce"
Mar 20 08:43:02.122649 master-0 kubenswrapper[18707]: E0320 08:43:02.122600 18707 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:43:02.123447 master-0 kubenswrapper[18707]: I0320 08:43:02.123378 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:43:02.356447 master-0 kubenswrapper[18707]: E0320 08:43:02.356317 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s"
Mar 20 08:43:02.760430 master-0 kubenswrapper[18707]: I0320 08:43:02.760300 18707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:43:03.047552 master-0 kubenswrapper[18707]: I0320 08:43:03.047460 18707 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="1409bbcde5e79bac3a67951727093fc0f8d50edb82f42aa3f8fe494469470b11" exitCode=0
Mar 20 08:43:03.048470 master-0 kubenswrapper[18707]: I0320 08:43:03.047600 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerDied","Data":"1409bbcde5e79bac3a67951727093fc0f8d50edb82f42aa3f8fe494469470b11"}
Mar 20 08:43:03.048470 master-0 kubenswrapper[18707]: I0320 08:43:03.047651 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"4cbef0ec6b5acffd4e6db99b59725bf4bfe8c7c734d2b1c8458674788bf10117"}
Mar 20 08:43:03.048470 master-0 kubenswrapper[18707]: I0320 08:43:03.048074 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="235dd7a4-ca4f-4bc0-91c5-789f03c4d8ce"
Mar 20 08:43:03.048470 master-0 kubenswrapper[18707]: I0320 08:43:03.048100 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="235dd7a4-ca4f-4bc0-91c5-789f03c4d8ce"
Mar 20 08:43:03.049802 master-0 kubenswrapper[18707]: E0320 08:43:03.049716 18707 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:43:03.049919 master-0 kubenswrapper[18707]: I0320 08:43:03.049798 18707 status_manager.go:851] "Failed to get status for pod" podUID="2028761b8522f874dcebf13c4683d033" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:43:03.051607 master-0 kubenswrapper[18707]: I0320 08:43:03.051428 18707 status_manager.go:851] "Failed to get status for pod" podUID="a5248d42-4743-4a8f-a554-9ae427b73597" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:43:03.056360 master-0 kubenswrapper[18707]: I0320 08:43:03.056309 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/0.log"
Mar 20 08:43:03.056456 master-0 kubenswrapper[18707]: I0320 08:43:03.056402 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"a800488fb62a072a6848e70ff9d43658046c14f0dd3e6ca8078acf4e79046779"}
Mar 20 08:43:03.058262 master-0 kubenswrapper[18707]: I0320 08:43:03.058208 18707 status_manager.go:851] "Failed to get status for pod" podUID="2028761b8522f874dcebf13c4683d033" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:43:03.058641 master-0 kubenswrapper[18707]: I0320 08:43:03.058597 18707 status_manager.go:851] "Failed to get status for pod" podUID="a5248d42-4743-4a8f-a554-9ae427b73597" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:43:03.103943 master-0 kubenswrapper[18707]: E0320 08:43:03.103830 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:43:03Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:43:03Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:43:03Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:43:03Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:43:03.104789 master-0 kubenswrapper[18707]: E0320 08:43:03.104715 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:43:03.105755 master-0 kubenswrapper[18707]: E0320 08:43:03.105699 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:43:03.106677 master-0 kubenswrapper[18707]: E0320 08:43:03.106623 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:43:03.107416 master-0 kubenswrapper[18707]: E0320 08:43:03.107333 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 20 08:43:03.107416 master-0 kubenswrapper[18707]: E0320 08:43:03.107370 18707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 08:43:04.067825 master-0 kubenswrapper[18707]: I0320 08:43:04.067744 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"980a74af7dbc007260e7377ea4cb1edcafe4c9568ad57a168d88500b7bd91f2e"}
Mar 20 08:43:04.068159 master-0 kubenswrapper[18707]: I0320 08:43:04.067841 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"133aaf528ab1e0887a626b89e3622758b8188d7b50da183f7f2755440645c3ac"}
Mar 20 08:43:04.068159 master-0 kubenswrapper[18707]: I0320 08:43:04.067865 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"16441156b7e8bbe4b02a4b00165bba66f63737e9ccdc2639e50919dc2f550bf4"}
Mar 20 08:43:05.079371 master-0 kubenswrapper[18707]: I0320 08:43:05.079289 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"c2a566d1a6044b46ff0beec324066fa5e7cc44891867becc4439f58df2d6446e"}
Mar 20 08:43:05.079371 master-0 kubenswrapper[18707]: I0320 08:43:05.079365 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"b0f5e22bf5db5b532cc2a9b7e1994228d722c79a9f2d60d4ddc02358bee2cb82"}
Mar 20 08:43:05.080174 master-0 kubenswrapper[18707]: I0320 08:43:05.080130 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:43:05.080492 master-0 kubenswrapper[18707]: I0320 08:43:05.080438 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="235dd7a4-ca4f-4bc0-91c5-789f03c4d8ce"
Mar 20 08:43:05.080577 master-0 kubenswrapper[18707]: I0320 08:43:05.080537 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="235dd7a4-ca4f-4bc0-91c5-789f03c4d8ce"
Mar 20 08:43:07.123695 master-0 kubenswrapper[18707]: I0320 08:43:07.123571 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:43:07.124744 master-0 kubenswrapper[18707]: I0320 08:43:07.123688 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:43:07.134116 master-0 kubenswrapper[18707]: I0320 08:43:07.134060 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:43:08.754627 master-0 kubenswrapper[18707]: I0320 08:43:08.754567 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 20 08:43:08.755433 master-0 kubenswrapper[18707]: E0320 08:43:08.754936 18707 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:43:08.755530 master-0 kubenswrapper[18707]: E0320 08:43:08.755517 18707 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:43:08.755665 master-0 kubenswrapper[18707]: E0320 08:43:08.755653 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access podName:d245e5b2-a30d-45c8-9b79-6e8096765c14 nodeName:}" failed. No retries permitted until 2026-03-20 08:45:10.755629832 +0000 UTC m=+255.911810188 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access") pod "installer-3-master-0" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:43:09.511405 master-0 kubenswrapper[18707]: I0320 08:43:09.511331 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:43:10.196302 master-0 kubenswrapper[18707]: I0320 08:43:10.196243 18707 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:43:10.286682 master-0 kubenswrapper[18707]: I0320 08:43:10.286594 18707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="7d5ce05b3d592e63f1f92202d52b9635" podUID="10f6b2bf-da60-45b1-b95a-76acdc7f1e38"
Mar 20 08:43:11.127485 master-0 kubenswrapper[18707]: I0320 08:43:11.127171 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="235dd7a4-ca4f-4bc0-91c5-789f03c4d8ce"
Mar 20 08:43:11.127485 master-0 kubenswrapper[18707]: I0320 08:43:11.127231 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="235dd7a4-ca4f-4bc0-91c5-789f03c4d8ce"
Mar 20 08:43:11.132358 master-0 kubenswrapper[18707]: I0320 08:43:11.132304 18707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="7d5ce05b3d592e63f1f92202d52b9635" podUID="10f6b2bf-da60-45b1-b95a-76acdc7f1e38"
Mar 20 08:43:11.133779 master-0 kubenswrapper[18707]: I0320 08:43:11.133728 18707 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-master-0" containerID="cri-o://16441156b7e8bbe4b02a4b00165bba66f63737e9ccdc2639e50919dc2f550bf4"
Mar 20 08:43:11.133928 master-0 kubenswrapper[18707]: I0320 08:43:11.133779 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:43:11.290842 master-0 kubenswrapper[18707]: I0320 08:43:11.290801 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:43:11.291549 master-0 kubenswrapper[18707]: I0320 08:43:11.291481 18707 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 20 08:43:11.291635 master-0 kubenswrapper[18707]: I0320 08:43:11.291593 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 20 08:43:12.134011 master-0 kubenswrapper[18707]: I0320 08:43:12.133942 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="235dd7a4-ca4f-4bc0-91c5-789f03c4d8ce"
Mar 20 08:43:12.134011 master-0 kubenswrapper[18707]: I0320 08:43:12.133993 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="235dd7a4-ca4f-4bc0-91c5-789f03c4d8ce"
Mar 20 08:43:12.139906 master-0 kubenswrapper[18707]: I0320 08:43:12.139790 18707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="7d5ce05b3d592e63f1f92202d52b9635" podUID="10f6b2bf-da60-45b1-b95a-76acdc7f1e38"
Mar 20 08:43:18.553488 master-0 kubenswrapper[18707]: E0320 08:43:18.553360 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[trusted-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" podUID="348f3880-793f-43e4-9de1-8511626d2552"
Mar 20 08:43:19.197991 master-0 kubenswrapper[18707]: I0320 08:43:19.197902 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv"
Mar 20 08:43:20.413119 master-0 kubenswrapper[18707]: I0320 08:43:20.413060 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 20 08:43:20.423311 master-0 kubenswrapper[18707]: I0320 08:43:20.423250 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 20 08:43:20.599246 master-0 kubenswrapper[18707]: I0320 08:43:20.599129 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 20 08:43:20.948832 master-0 kubenswrapper[18707]: I0320 08:43:20.948712 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-frnfd"
Mar 20 08:43:21.049481 master-0 kubenswrapper[18707]: I0320 08:43:21.049404 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 20 08:43:21.163383 master-0 kubenswrapper[18707]: I0320 08:43:21.163310 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 08:43:21.277030 master-0 kubenswrapper[18707]: I0320 08:43:21.276831 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 20 08:43:21.296328 master-0 kubenswrapper[18707]: I0320 08:43:21.296284 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:43:21.301096 master-0 kubenswrapper[18707]: I0320 08:43:21.301076 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:43:21.316412 master-0 kubenswrapper[18707]: I0320 08:43:21.316352 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 08:43:21.392647 master-0 kubenswrapper[18707]: I0320 08:43:21.392351 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 20 08:43:21.460144 master-0 kubenswrapper[18707]: I0320 08:43:21.460075 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 08:43:21.731157 master-0 kubenswrapper[18707]: I0320 08:43:21.731075 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 20 08:43:21.915468 master-0 kubenswrapper[18707]: I0320 08:43:21.915356 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 08:43:22.297879 master-0 kubenswrapper[18707]: I0320 08:43:22.297839 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 20 08:43:22.311519 master-0 kubenswrapper[18707]: I0320 08:43:22.311446 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 20 08:43:22.339954 master-0 kubenswrapper[18707]: I0320 08:43:22.339864 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 20 08:43:22.473885 master-0 kubenswrapper[18707]: I0320 08:43:22.473833 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 20 08:43:22.549226 master-0 kubenswrapper[18707]: I0320 08:43:22.549009 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 20 08:43:22.615024 master-0 kubenswrapper[18707]: I0320 08:43:22.614909 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 20 08:43:22.641235 master-0 kubenswrapper[18707]: I0320 08:43:22.638967 18707 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 20 08:43:22.753068 master-0 kubenswrapper[18707]: I0320 08:43:22.752977 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 20 08:43:22.914168 master-0 kubenswrapper[18707]: I0320 08:43:22.913990 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 20 08:43:22.960921 master-0 kubenswrapper[18707]: I0320 08:43:22.960841 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 20 08:43:22.991693 master-0 kubenswrapper[18707]: I0320 08:43:22.991622 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 20 08:43:23.053140 master-0 kubenswrapper[18707]: I0320 08:43:23.053058 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 20 08:43:23.210161 master-0 kubenswrapper[18707]: I0320 08:43:23.209995 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 08:43:23.215588 master-0 kubenswrapper[18707]: I0320 08:43:23.215529 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 20 08:43:23.340704 master-0 kubenswrapper[18707]: I0320 08:43:23.340623 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 20 08:43:23.526033 master-0 kubenswrapper[18707]: I0320 08:43:23.525841 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv"
Mar 20 08:43:23.526723 master-0 kubenswrapper[18707]: E0320 08:43:23.526167 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca podName:348f3880-793f-43e4-9de1-8511626d2552 nodeName:}" failed. No retries permitted until 2026-03-20 08:45:25.526121482 +0000 UTC m=+270.682301848 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca") pod "console-operator-76b6568d85-8b8gv" (UID: "348f3880-793f-43e4-9de1-8511626d2552") : configmap references non-existent config key: ca-bundle.crt
Mar 20 08:43:23.583894 master-0 kubenswrapper[18707]: I0320 08:43:23.583744 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 08:43:23.588852 master-0 kubenswrapper[18707]: I0320 08:43:23.588811 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 08:43:23.600359 master-0 kubenswrapper[18707]: I0320 08:43:23.600308 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 08:43:23.601486 master-0 kubenswrapper[18707]: I0320 08:43:23.601269 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 20 08:43:23.624678 master-0 kubenswrapper[18707]: I0320 08:43:23.624596 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 20 08:43:23.637242 master-0 kubenswrapper[18707]: I0320 08:43:23.637199 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 20 08:43:23.686171 master-0 kubenswrapper[18707]: I0320 08:43:23.686111 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 20 08:43:23.720760 master-0 kubenswrapper[18707]: I0320 08:43:23.720690 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 20 08:43:23.742796 master-0 kubenswrapper[18707]: I0320 08:43:23.742729 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 08:43:23.829901 master-0 kubenswrapper[18707]: I0320 08:43:23.829843 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 20 08:43:23.882929 master-0 kubenswrapper[18707]: I0320 08:43:23.882850 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 20 08:43:23.906761 master-0 kubenswrapper[18707]: I0320 08:43:23.906269 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 08:43:23.916233 master-0 kubenswrapper[18707]: I0320 08:43:23.916149 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-2trhv"
Mar 20 08:43:23.982104 master-0 kubenswrapper[18707]: I0320 08:43:23.982010 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 08:43:24.113041 master-0 kubenswrapper[18707]: I0320 08:43:24.112860 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 08:43:24.144768 master-0 kubenswrapper[18707]: I0320 08:43:24.144723 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-h4mf9"
Mar 20 08:43:24.261629 master-0 kubenswrapper[18707]: I0320 08:43:24.261543 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-bksjt"
Mar 20 08:43:24.496992 master-0 kubenswrapper[18707]: I0320 08:43:24.496330 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 08:43:24.556057 master-0 kubenswrapper[18707]: I0320 08:43:24.555976 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 08:43:24.611346 master-0 kubenswrapper[18707]: I0320 08:43:24.608307 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 20 08:43:24.649277 master-0 kubenswrapper[18707]: I0320 08:43:24.648749 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 20 08:43:24.700778 master-0 kubenswrapper[18707]: I0320 08:43:24.700682 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 20 08:43:24.757334 master-0 kubenswrapper[18707]: I0320 08:43:24.757104 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 20 08:43:24.767338 master-0 kubenswrapper[18707]: I0320 08:43:24.765101 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 08:43:24.901095 master-0 kubenswrapper[18707]: I0320 08:43:24.900981 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 20 08:43:24.920475 master-0 kubenswrapper[18707]: I0320 08:43:24.920331 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 08:43:24.926122 master-0 kubenswrapper[18707]: I0320 08:43:24.926064 18707 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 20 08:43:24.938621 master-0 kubenswrapper[18707]: I0320 08:43:24.938534 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 08:43:24.940305 master-0 kubenswrapper[18707]: I0320 08:43:24.940247 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 20 08:43:24.988721 master-0 kubenswrapper[18707]: I0320 08:43:24.988636 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 20 08:43:25.036753 master-0 kubenswrapper[18707]: I0320 08:43:25.036555 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 20 08:43:25.041311 master-0 kubenswrapper[18707]: I0320 08:43:25.041240 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 20 08:43:25.085031 master-0 kubenswrapper[18707]: I0320 08:43:25.084950 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 08:43:25.201325 master-0 kubenswrapper[18707]: I0320 08:43:25.201237 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 20 08:43:25.223006 master-0 kubenswrapper[18707]: I0320 08:43:25.222926 18707 reflector.go:368]
Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 08:43:25.314036 master-0 kubenswrapper[18707]: I0320 08:43:25.313758 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 08:43:25.342834 master-0 kubenswrapper[18707]: I0320 08:43:25.342711 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 20 08:43:25.397836 master-0 kubenswrapper[18707]: I0320 08:43:25.397758 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 20 08:43:25.440375 master-0 kubenswrapper[18707]: I0320 08:43:25.440283 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 08:43:25.457215 master-0 kubenswrapper[18707]: I0320 08:43:25.457125 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 08:43:25.469937 master-0 kubenswrapper[18707]: I0320 08:43:25.469899 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 08:43:25.504429 master-0 kubenswrapper[18707]: I0320 08:43:25.504350 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 08:43:25.623125 master-0 kubenswrapper[18707]: I0320 08:43:25.622960 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 20 08:43:25.637703 master-0 kubenswrapper[18707]: I0320 08:43:25.637633 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 20 08:43:25.950293 
master-0 kubenswrapper[18707]: I0320 08:43:25.950115 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 08:43:25.953708 master-0 kubenswrapper[18707]: I0320 08:43:25.953677 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 08:43:26.050661 master-0 kubenswrapper[18707]: I0320 08:43:26.050605 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 08:43:26.153637 master-0 kubenswrapper[18707]: I0320 08:43:26.153575 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 08:43:26.188983 master-0 kubenswrapper[18707]: I0320 08:43:26.188935 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 08:43:26.193296 master-0 kubenswrapper[18707]: I0320 08:43:26.193265 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 08:43:26.202785 master-0 kubenswrapper[18707]: I0320 08:43:26.202698 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 08:43:26.236714 master-0 kubenswrapper[18707]: I0320 08:43:26.236654 18707 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 08:43:26.326355 master-0 kubenswrapper[18707]: I0320 08:43:26.326138 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 20 08:43:26.330010 master-0 kubenswrapper[18707]: I0320 08:43:26.329967 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 20 08:43:26.350478 master-0 kubenswrapper[18707]: 
I0320 08:43:26.350388 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 20 08:43:26.405724 master-0 kubenswrapper[18707]: I0320 08:43:26.405507 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 20 08:43:26.458887 master-0 kubenswrapper[18707]: I0320 08:43:26.458692 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 08:43:26.461777 master-0 kubenswrapper[18707]: I0320 08:43:26.461704 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 08:43:26.464401 master-0 kubenswrapper[18707]: I0320 08:43:26.464333 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 08:43:26.515782 master-0 kubenswrapper[18707]: I0320 08:43:26.515680 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-qvkkb" Mar 20 08:43:26.575005 master-0 kubenswrapper[18707]: I0320 08:43:26.574893 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 08:43:26.703028 master-0 kubenswrapper[18707]: I0320 08:43:26.702938 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 20 08:43:26.839455 master-0 kubenswrapper[18707]: I0320 08:43:26.839375 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 08:43:26.983258 master-0 kubenswrapper[18707]: I0320 08:43:26.983121 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 08:43:27.035216 master-0 
kubenswrapper[18707]: I0320 08:43:27.035116 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 08:43:27.140935 master-0 kubenswrapper[18707]: I0320 08:43:27.140765 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 20 08:43:27.223520 master-0 kubenswrapper[18707]: I0320 08:43:27.223410 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 20 08:43:27.341204 master-0 kubenswrapper[18707]: I0320 08:43:27.341133 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 20 08:43:27.343402 master-0 kubenswrapper[18707]: I0320 08:43:27.343365 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 08:43:27.425462 master-0 kubenswrapper[18707]: I0320 08:43:27.425294 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 08:43:27.508881 master-0 kubenswrapper[18707]: I0320 08:43:27.508734 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 08:43:27.516888 master-0 kubenswrapper[18707]: I0320 08:43:27.516820 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-wq6zb" Mar 20 08:43:27.612321 master-0 kubenswrapper[18707]: I0320 08:43:27.612230 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 08:43:27.625211 master-0 kubenswrapper[18707]: I0320 08:43:27.625127 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 20 08:43:27.652691 master-0 kubenswrapper[18707]: I0320 
08:43:27.652607 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 20 08:43:27.685552 master-0 kubenswrapper[18707]: I0320 08:43:27.685410 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 08:43:27.711355 master-0 kubenswrapper[18707]: I0320 08:43:27.711244 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 08:43:27.727901 master-0 kubenswrapper[18707]: I0320 08:43:27.727793 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 20 08:43:27.740079 master-0 kubenswrapper[18707]: I0320 08:43:27.740000 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 08:43:27.773071 master-0 kubenswrapper[18707]: I0320 08:43:27.772994 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 08:43:27.798032 master-0 kubenswrapper[18707]: I0320 08:43:27.797927 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 08:43:27.814592 master-0 kubenswrapper[18707]: I0320 08:43:27.814425 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 08:43:27.906991 master-0 kubenswrapper[18707]: I0320 08:43:27.906869 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 08:43:27.927914 master-0 kubenswrapper[18707]: I0320 08:43:27.927828 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 08:43:27.930589 master-0 
kubenswrapper[18707]: I0320 08:43:27.930530 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 08:43:28.044130 master-0 kubenswrapper[18707]: I0320 08:43:28.044003 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 08:43:28.137179 master-0 kubenswrapper[18707]: I0320 08:43:28.137117 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 20 08:43:28.167000 master-0 kubenswrapper[18707]: I0320 08:43:28.166927 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 08:43:28.167583 master-0 kubenswrapper[18707]: I0320 08:43:28.167520 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 08:43:28.186328 master-0 kubenswrapper[18707]: I0320 08:43:28.186251 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 20 08:43:28.191786 master-0 kubenswrapper[18707]: I0320 08:43:28.191744 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 08:43:28.230496 master-0 kubenswrapper[18707]: I0320 08:43:28.230441 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 20 08:43:28.342492 master-0 kubenswrapper[18707]: I0320 08:43:28.342254 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 08:43:28.402796 master-0 kubenswrapper[18707]: I0320 08:43:28.402692 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 08:43:28.427331 master-0 
kubenswrapper[18707]: I0320 08:43:28.427253 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 08:43:28.551147 master-0 kubenswrapper[18707]: I0320 08:43:28.551086 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 08:43:28.566299 master-0 kubenswrapper[18707]: I0320 08:43:28.566179 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 08:43:28.573154 master-0 kubenswrapper[18707]: I0320 08:43:28.573100 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-vd4cn" Mar 20 08:43:28.676023 master-0 kubenswrapper[18707]: I0320 08:43:28.675886 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 08:43:28.684298 master-0 kubenswrapper[18707]: I0320 08:43:28.684275 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 08:43:28.748979 master-0 kubenswrapper[18707]: I0320 08:43:28.748786 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-js69c" Mar 20 08:43:28.814708 master-0 kubenswrapper[18707]: I0320 08:43:28.814639 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 08:43:28.877315 master-0 kubenswrapper[18707]: I0320 08:43:28.876490 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 08:43:28.978603 master-0 kubenswrapper[18707]: I0320 08:43:28.978437 18707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 08:43:29.053568 master-0 kubenswrapper[18707]: I0320 08:43:29.053015 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 08:43:29.082731 master-0 kubenswrapper[18707]: I0320 08:43:29.082659 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 08:43:29.215350 master-0 kubenswrapper[18707]: I0320 08:43:29.214734 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 08:43:29.239065 master-0 kubenswrapper[18707]: I0320 08:43:29.238926 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 08:43:29.288858 master-0 kubenswrapper[18707]: I0320 08:43:29.288797 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 08:43:29.328923 master-0 kubenswrapper[18707]: I0320 08:43:29.328848 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 08:43:29.336571 master-0 kubenswrapper[18707]: I0320 08:43:29.336505 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 08:43:29.341648 master-0 kubenswrapper[18707]: I0320 08:43:29.341624 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-scgh2" Mar 20 08:43:29.357725 master-0 kubenswrapper[18707]: I0320 08:43:29.357665 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 20 08:43:29.409041 master-0 kubenswrapper[18707]: I0320 08:43:29.408973 18707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"metrics-server-40c28rqu4fltf" Mar 20 08:43:29.416314 master-0 kubenswrapper[18707]: I0320 08:43:29.416281 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 08:43:29.485830 master-0 kubenswrapper[18707]: I0320 08:43:29.485761 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 08:43:29.503326 master-0 kubenswrapper[18707]: I0320 08:43:29.503233 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 08:43:29.507398 master-0 kubenswrapper[18707]: I0320 08:43:29.507360 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 08:43:29.533354 master-0 kubenswrapper[18707]: I0320 08:43:29.533273 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 08:43:29.560468 master-0 kubenswrapper[18707]: I0320 08:43:29.560376 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 08:43:29.588711 master-0 kubenswrapper[18707]: I0320 08:43:29.588647 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 08:43:29.614311 master-0 kubenswrapper[18707]: I0320 08:43:29.613972 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 08:43:29.626633 master-0 kubenswrapper[18707]: I0320 08:43:29.626555 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 20 08:43:29.737419 master-0 kubenswrapper[18707]: I0320 08:43:29.737317 18707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 08:43:29.839666 master-0 kubenswrapper[18707]: I0320 08:43:29.839585 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 08:43:30.029706 master-0 kubenswrapper[18707]: I0320 08:43:30.029612 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 08:43:30.084828 master-0 kubenswrapper[18707]: I0320 08:43:30.084779 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 08:43:30.127946 master-0 kubenswrapper[18707]: I0320 08:43:30.127523 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 08:43:30.202509 master-0 kubenswrapper[18707]: I0320 08:43:30.202443 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 08:43:30.213229 master-0 kubenswrapper[18707]: I0320 08:43:30.213172 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 20 08:43:30.253113 master-0 kubenswrapper[18707]: I0320 08:43:30.253024 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 08:43:30.270486 master-0 kubenswrapper[18707]: I0320 08:43:30.270403 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-rfqnk" Mar 20 08:43:30.291312 master-0 kubenswrapper[18707]: I0320 08:43:30.291230 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 08:43:30.321179 master-0 kubenswrapper[18707]: I0320 08:43:30.321096 18707 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 08:43:30.357772 master-0 kubenswrapper[18707]: I0320 08:43:30.357702 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 08:43:30.558981 master-0 kubenswrapper[18707]: I0320 08:43:30.558870 18707 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 08:43:30.571375 master-0 kubenswrapper[18707]: I0320 08:43:30.571255 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 20 08:43:30.571375 master-0 kubenswrapper[18707]: I0320 08:43:30.571363 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 20 08:43:30.575990 master-0 kubenswrapper[18707]: I0320 08:43:30.575925 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 08:43:30.579908 master-0 kubenswrapper[18707]: I0320 08:43:30.579818 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:43:30.609381 master-0 kubenswrapper[18707]: I0320 08:43:30.609246 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=20.609223606 podStartE2EDuration="20.609223606s" podCreationTimestamp="2026-03-20 08:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:43:30.60662333 +0000 UTC m=+155.762803686" watchObservedRunningTime="2026-03-20 08:43:30.609223606 +0000 UTC m=+155.765403972" Mar 20 08:43:30.647293 master-0 kubenswrapper[18707]: I0320 08:43:30.647096 18707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 08:43:30.674509 master-0 kubenswrapper[18707]: I0320 08:43:30.674455 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 20 08:43:30.698907 master-0 kubenswrapper[18707]: I0320 08:43:30.698869 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 08:43:30.744336 master-0 kubenswrapper[18707]: I0320 08:43:30.744273 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 08:43:30.769248 master-0 kubenswrapper[18707]: I0320 08:43:30.769101 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 08:43:30.807363 master-0 kubenswrapper[18707]: I0320 08:43:30.807180 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 08:43:30.808817 master-0 kubenswrapper[18707]: I0320 08:43:30.808730 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 20 08:43:30.820915 master-0 kubenswrapper[18707]: I0320 08:43:30.820718 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 08:43:31.071319 master-0 kubenswrapper[18707]: I0320 08:43:31.071146 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 08:43:31.101362 master-0 kubenswrapper[18707]: I0320 08:43:31.101293 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 08:43:31.123175 master-0 kubenswrapper[18707]: I0320 08:43:31.123115 18707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 08:43:31.125973 master-0 kubenswrapper[18707]: I0320 08:43:31.125927 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 08:43:31.178673 master-0 kubenswrapper[18707]: I0320 08:43:31.178593 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 08:43:31.202769 master-0 kubenswrapper[18707]: I0320 08:43:31.202690 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 08:43:31.223754 master-0 kubenswrapper[18707]: I0320 08:43:31.223693 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 08:43:31.248834 master-0 kubenswrapper[18707]: I0320 08:43:31.248747 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-wftxn" Mar 20 08:43:31.254327 master-0 kubenswrapper[18707]: I0320 08:43:31.254287 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 08:43:31.316436 master-0 kubenswrapper[18707]: I0320 08:43:31.316368 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 20 08:43:31.342337 master-0 kubenswrapper[18707]: I0320 08:43:31.342167 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 08:43:31.370395 master-0 kubenswrapper[18707]: I0320 08:43:31.370339 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 08:43:31.629549 master-0 kubenswrapper[18707]: I0320 08:43:31.629396 18707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 08:43:31.700970 master-0 kubenswrapper[18707]: I0320 08:43:31.700909 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 08:43:31.710011 master-0 kubenswrapper[18707]: I0320 08:43:31.709965 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 20 08:43:31.793293 master-0 kubenswrapper[18707]: I0320 08:43:31.792178 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 08:43:31.831710 master-0 kubenswrapper[18707]: I0320 08:43:31.831575 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 20 08:43:31.863210 master-0 kubenswrapper[18707]: I0320 08:43:31.863131 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 20 08:43:31.901339 master-0 kubenswrapper[18707]: I0320 08:43:31.901162 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 08:43:31.972522 master-0 kubenswrapper[18707]: I0320 08:43:31.972450 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 20 08:43:32.020482 master-0 kubenswrapper[18707]: I0320 08:43:32.020397 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 08:43:32.036524 master-0 kubenswrapper[18707]: I0320 08:43:32.036325 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 08:43:32.079572 master-0 kubenswrapper[18707]: I0320 08:43:32.079454 
18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 08:43:32.080724 master-0 kubenswrapper[18707]: I0320 08:43:32.079709 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 08:43:32.115638 master-0 kubenswrapper[18707]: I0320 08:43:32.115547 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 08:43:32.129014 master-0 kubenswrapper[18707]: I0320 08:43:32.128912 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 08:43:32.157134 master-0 kubenswrapper[18707]: I0320 08:43:32.156931 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 08:43:32.261882 master-0 kubenswrapper[18707]: I0320 08:43:32.261792 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 08:43:32.334110 master-0 kubenswrapper[18707]: I0320 08:43:32.334044 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 20 08:43:32.390069 master-0 kubenswrapper[18707]: I0320 08:43:32.389974 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 08:43:32.402792 master-0 kubenswrapper[18707]: I0320 08:43:32.402728 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 08:43:32.445165 master-0 kubenswrapper[18707]: I0320 08:43:32.444950 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 20 08:43:32.445579 master-0 kubenswrapper[18707]: I0320 08:43:32.445356 18707 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 20 08:43:32.483249 master-0 kubenswrapper[18707]: I0320 08:43:32.483134 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 08:43:32.523224 master-0 kubenswrapper[18707]: I0320 08:43:32.523019 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 08:43:32.523224 master-0 kubenswrapper[18707]: I0320 08:43:32.523132 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 20 08:43:32.594226 master-0 kubenswrapper[18707]: I0320 08:43:32.594130 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 20 08:43:32.634830 master-0 kubenswrapper[18707]: I0320 08:43:32.634760 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 08:43:32.662737 master-0 kubenswrapper[18707]: I0320 08:43:32.662677 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 20 08:43:32.756135 master-0 kubenswrapper[18707]: I0320 08:43:32.755932 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 20 08:43:32.832601 master-0 kubenswrapper[18707]: I0320 08:43:32.832536 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 20 08:43:32.897243 master-0 kubenswrapper[18707]: I0320 08:43:32.897161 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 20 08:43:32.916994 master-0 kubenswrapper[18707]: I0320 08:43:32.916905 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 08:43:32.917327 master-0 kubenswrapper[18707]: I0320 08:43:32.917285 18707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 20 08:43:32.917678 master-0 kubenswrapper[18707]: I0320 08:43:32.917611 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" containerName="startup-monitor" containerID="cri-o://7becf1bd30be15317505d9b734fa3236c48dd5943a2c295e96cc2c154bf77f5e" gracePeriod=5
Mar 20 08:43:32.968791 master-0 kubenswrapper[18707]: I0320 08:43:32.968705 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 20 08:43:33.117114 master-0 kubenswrapper[18707]: I0320 08:43:33.117056 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 20 08:43:33.157667 master-0 kubenswrapper[18707]: I0320 08:43:33.157601 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 08:43:33.284821 master-0 kubenswrapper[18707]: I0320 08:43:33.284773 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 08:43:33.285977 master-0 kubenswrapper[18707]: I0320 08:43:33.285924 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 20 08:43:33.296564 master-0 kubenswrapper[18707]: I0320 08:43:33.296518 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 20 08:43:33.383434 master-0 kubenswrapper[18707]: I0320 08:43:33.383304 18707 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 20 08:43:33.484881 master-0 kubenswrapper[18707]: I0320 08:43:33.484812 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 20 08:43:33.527003 master-0 kubenswrapper[18707]: I0320 08:43:33.526942 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 08:43:33.568377 master-0 kubenswrapper[18707]: I0320 08:43:33.568283 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 20 08:43:33.637843 master-0 kubenswrapper[18707]: I0320 08:43:33.637701 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-n6dht"
Mar 20 08:43:33.712947 master-0 kubenswrapper[18707]: I0320 08:43:33.712891 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 08:43:33.717554 master-0 kubenswrapper[18707]: I0320 08:43:33.717503 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 20 08:43:33.807003 master-0 kubenswrapper[18707]: I0320 08:43:33.806922 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 20 08:43:33.975986 master-0 kubenswrapper[18707]: I0320 08:43:33.975843 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 08:43:34.014304 master-0 kubenswrapper[18707]: I0320 08:43:34.014169 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 08:43:34.093943 master-0 kubenswrapper[18707]: I0320 08:43:34.093891 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 20 08:43:34.129709 master-0 kubenswrapper[18707]: I0320 08:43:34.129646 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 08:43:34.218554 master-0 kubenswrapper[18707]: I0320 08:43:34.218480 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 08:43:34.251689 master-0 kubenswrapper[18707]: I0320 08:43:34.251543 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 20 08:43:34.290881 master-0 kubenswrapper[18707]: I0320 08:43:34.290781 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 08:43:34.324120 master-0 kubenswrapper[18707]: I0320 08:43:34.324070 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 08:43:34.374087 master-0 kubenswrapper[18707]: I0320 08:43:34.374038 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 20 08:43:34.444331 master-0 kubenswrapper[18707]: I0320 08:43:34.444277 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 20 08:43:34.485925 master-0 kubenswrapper[18707]: I0320 08:43:34.485853 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 20 08:43:34.537917 master-0 kubenswrapper[18707]: I0320 08:43:34.537784 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 20 08:43:34.659419 master-0 kubenswrapper[18707]: I0320 08:43:34.659358 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 20 08:43:34.812463 master-0 kubenswrapper[18707]: I0320 08:43:34.812387 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 20 08:43:34.836280 master-0 kubenswrapper[18707]: I0320 08:43:34.836226 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 20 08:43:34.958313 master-0 kubenswrapper[18707]: I0320 08:43:34.958215 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 08:43:35.169706 master-0 kubenswrapper[18707]: I0320 08:43:35.169482 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 20 08:43:35.196566 master-0 kubenswrapper[18707]: I0320 08:43:35.196488 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 20 08:43:35.255160 master-0 kubenswrapper[18707]: I0320 08:43:35.255093 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 20 08:43:35.305295 master-0 kubenswrapper[18707]: I0320 08:43:35.305211 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 20 08:43:35.404083 master-0 kubenswrapper[18707]: I0320 08:43:35.404009 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 20 08:43:35.406000 master-0 kubenswrapper[18707]: I0320 08:43:35.405969 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 20 08:43:35.453968 master-0 kubenswrapper[18707]: I0320 08:43:35.453770 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config"
Mar 20 08:43:35.759996 master-0 kubenswrapper[18707]: I0320 08:43:35.759816 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 20 08:43:35.888382 master-0 kubenswrapper[18707]: I0320 08:43:35.888302 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-d7bxn"
Mar 20 08:43:36.221685 master-0 kubenswrapper[18707]: I0320 08:43:36.221624 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 20 08:43:36.272279 master-0 kubenswrapper[18707]: I0320 08:43:36.272181 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 20 08:43:36.918821 master-0 kubenswrapper[18707]: I0320 08:43:36.918741 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 20 08:43:36.961293 master-0 kubenswrapper[18707]: I0320 08:43:36.961179 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 08:43:37.313132 master-0 kubenswrapper[18707]: I0320 08:43:37.313069 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-62tl6"
Mar 20 08:43:38.355437 master-0 kubenswrapper[18707]: I0320 08:43:38.355382 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_16fb4ea7f83036d9c6adf3454fc7e9db/startup-monitor/0.log"
Mar 20 08:43:38.355437 master-0 kubenswrapper[18707]: I0320 08:43:38.355457 18707 generic.go:334] "Generic (PLEG): container finished" podID="16fb4ea7f83036d9c6adf3454fc7e9db" containerID="7becf1bd30be15317505d9b734fa3236c48dd5943a2c295e96cc2c154bf77f5e" exitCode=137
Mar 20 08:43:38.511377 master-0 kubenswrapper[18707]: I0320 08:43:38.511320 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_16fb4ea7f83036d9c6adf3454fc7e9db/startup-monitor/0.log"
Mar 20 08:43:38.511773 master-0 kubenswrapper[18707]: I0320 08:43:38.511758 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:43:38.598780 master-0 kubenswrapper[18707]: I0320 08:43:38.598709 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") "
Mar 20 08:43:38.599381 master-0 kubenswrapper[18707]: I0320 08:43:38.598957 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock" (OuterVolumeSpecName: "var-lock") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:43:38.599701 master-0 kubenswrapper[18707]: I0320 08:43:38.599575 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") "
Mar 20 08:43:38.599701 master-0 kubenswrapper[18707]: I0320 08:43:38.599641 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") "
Mar 20 08:43:38.599701 master-0 kubenswrapper[18707]: I0320 08:43:38.599666 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") "
Mar 20 08:43:38.599873 master-0 kubenswrapper[18707]: I0320 08:43:38.599672 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:43:38.599873 master-0 kubenswrapper[18707]: I0320 08:43:38.599760 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log" (OuterVolumeSpecName: "var-log") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:43:38.600099 master-0 kubenswrapper[18707]: I0320 08:43:38.599994 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") "
Mar 20 08:43:38.600260 master-0 kubenswrapper[18707]: I0320 08:43:38.600176 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests" (OuterVolumeSpecName: "manifests") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:43:38.600831 master-0 kubenswrapper[18707]: I0320 08:43:38.600764 18707 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") on node \"master-0\" DevicePath \"\""
Mar 20 08:43:38.600831 master-0 kubenswrapper[18707]: I0320 08:43:38.600789 18707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 20 08:43:38.601043 master-0 kubenswrapper[18707]: I0320 08:43:38.600805 18707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:43:38.601043 master-0 kubenswrapper[18707]: I0320 08:43:38.601022 18707 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") on node \"master-0\" DevicePath \"\""
Mar 20 08:43:38.604658 master-0 kubenswrapper[18707]: I0320 08:43:38.604583 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:43:38.702492 master-0 kubenswrapper[18707]: I0320 08:43:38.702393 18707 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:43:39.102968 master-0 kubenswrapper[18707]: I0320 08:43:39.102898 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" path="/var/lib/kubelet/pods/16fb4ea7f83036d9c6adf3454fc7e9db/volumes"
Mar 20 08:43:39.365131 master-0 kubenswrapper[18707]: I0320 08:43:39.364990 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_16fb4ea7f83036d9c6adf3454fc7e9db/startup-monitor/0.log"
Mar 20 08:43:39.365131 master-0 kubenswrapper[18707]: I0320 08:43:39.365103 18707 scope.go:117] "RemoveContainer" containerID="7becf1bd30be15317505d9b734fa3236c48dd5943a2c295e96cc2c154bf77f5e"
Mar 20 08:43:39.365727 master-0 kubenswrapper[18707]: I0320 08:43:39.365195 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:43:51.674630 master-0 kubenswrapper[18707]: I0320 08:43:51.674543 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-x2bwv"]
Mar 20 08:43:51.677383 master-0 kubenswrapper[18707]: E0320 08:43:51.674913 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" containerName="startup-monitor"
Mar 20 08:43:51.677383 master-0 kubenswrapper[18707]: I0320 08:43:51.674932 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" containerName="startup-monitor"
Mar 20 08:43:51.677383 master-0 kubenswrapper[18707]: E0320 08:43:51.674962 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5248d42-4743-4a8f-a554-9ae427b73597" containerName="installer"
Mar 20 08:43:51.677383 master-0 kubenswrapper[18707]: I0320 08:43:51.674972 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5248d42-4743-4a8f-a554-9ae427b73597" containerName="installer"
Mar 20 08:43:51.677383 master-0 kubenswrapper[18707]: I0320 08:43:51.675245 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" containerName="startup-monitor"
Mar 20 08:43:51.677383 master-0 kubenswrapper[18707]: I0320 08:43:51.675295 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5248d42-4743-4a8f-a554-9ae427b73597" containerName="installer"
Mar 20 08:43:51.677383 master-0 kubenswrapper[18707]: I0320 08:43:51.675981 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:51.678920 master-0 kubenswrapper[18707]: I0320 08:43:51.678862 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-qjs5w"
Mar 20 08:43:51.679292 master-0 kubenswrapper[18707]: I0320 08:43:51.679238 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Mar 20 08:43:51.827894 master-0 kubenswrapper[18707]: I0320 08:43:51.827816 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dmxj\" (UniqueName: \"kubernetes.io/projected/e6807720-382e-4b04-ad82-35cb4c225138-kube-api-access-6dmxj\") pod \"cni-sysctl-allowlist-ds-x2bwv\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:51.828241 master-0 kubenswrapper[18707]: I0320 08:43:51.827988 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6807720-382e-4b04-ad82-35cb4c225138-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-x2bwv\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:51.828241 master-0 kubenswrapper[18707]: I0320 08:43:51.828032 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/e6807720-382e-4b04-ad82-35cb4c225138-ready\") pod \"cni-sysctl-allowlist-ds-x2bwv\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:51.828241 master-0 kubenswrapper[18707]: I0320 08:43:51.828068 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6807720-382e-4b04-ad82-35cb4c225138-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-x2bwv\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:51.940017 master-0 kubenswrapper[18707]: I0320 08:43:51.930012 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dmxj\" (UniqueName: \"kubernetes.io/projected/e6807720-382e-4b04-ad82-35cb4c225138-kube-api-access-6dmxj\") pod \"cni-sysctl-allowlist-ds-x2bwv\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:51.940017 master-0 kubenswrapper[18707]: I0320 08:43:51.930097 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6807720-382e-4b04-ad82-35cb4c225138-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-x2bwv\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:51.940017 master-0 kubenswrapper[18707]: I0320 08:43:51.930118 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/e6807720-382e-4b04-ad82-35cb4c225138-ready\") pod \"cni-sysctl-allowlist-ds-x2bwv\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:51.940017 master-0 kubenswrapper[18707]: I0320 08:43:51.930140 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6807720-382e-4b04-ad82-35cb4c225138-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-x2bwv\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:51.940017 master-0 kubenswrapper[18707]: I0320 08:43:51.930252 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6807720-382e-4b04-ad82-35cb4c225138-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-x2bwv\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:51.940017 master-0 kubenswrapper[18707]: I0320 08:43:51.931174 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6807720-382e-4b04-ad82-35cb4c225138-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-x2bwv\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:51.940017 master-0 kubenswrapper[18707]: I0320 08:43:51.931439 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/e6807720-382e-4b04-ad82-35cb4c225138-ready\") pod \"cni-sysctl-allowlist-ds-x2bwv\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:51.960418 master-0 kubenswrapper[18707]: I0320 08:43:51.960360 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dmxj\" (UniqueName: \"kubernetes.io/projected/e6807720-382e-4b04-ad82-35cb4c225138-kube-api-access-6dmxj\") pod \"cni-sysctl-allowlist-ds-x2bwv\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:51.997568 master-0 kubenswrapper[18707]: I0320 08:43:51.997409 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:52.018284 master-0 kubenswrapper[18707]: W0320 08:43:52.018223 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6807720_382e_4b04_ad82_35cb4c225138.slice/crio-c7d73961da64ca498ab44f81d8c88b31a8d73b74ca83992e05641fb80dbd5877 WatchSource:0}: Error finding container c7d73961da64ca498ab44f81d8c88b31a8d73b74ca83992e05641fb80dbd5877: Status 404 returned error can't find the container with id c7d73961da64ca498ab44f81d8c88b31a8d73b74ca83992e05641fb80dbd5877
Mar 20 08:43:52.469969 master-0 kubenswrapper[18707]: I0320 08:43:52.469774 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv" event={"ID":"e6807720-382e-4b04-ad82-35cb4c225138","Type":"ContainerStarted","Data":"7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09"}
Mar 20 08:43:52.469969 master-0 kubenswrapper[18707]: I0320 08:43:52.469845 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv" event={"ID":"e6807720-382e-4b04-ad82-35cb4c225138","Type":"ContainerStarted","Data":"c7d73961da64ca498ab44f81d8c88b31a8d73b74ca83992e05641fb80dbd5877"}
Mar 20 08:43:52.470411 master-0 kubenswrapper[18707]: I0320 08:43:52.470075 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:52.493865 master-0 kubenswrapper[18707]: I0320 08:43:52.493573 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv" podStartSLOduration=1.493545669 podStartE2EDuration="1.493545669s" podCreationTimestamp="2026-03-20 08:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:43:52.48843434 +0000 UTC m=+177.644614706" watchObservedRunningTime="2026-03-20 08:43:52.493545669 +0000 UTC m=+177.649726035"
Mar 20 08:43:53.145497 master-0 kubenswrapper[18707]: I0320 08:43:53.141966 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"]
Mar 20 08:43:53.145497 master-0 kubenswrapper[18707]: I0320 08:43:53.142041 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"]
Mar 20 08:43:53.145497 master-0 kubenswrapper[18707]: I0320 08:43:53.142331 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" podUID="a9a9ecf2-c476-4962-8333-21f242dbcb89" containerName="controller-manager" containerID="cri-o://c11d7390f44d6be5bc3fcba45513f31241f9e0af79908ed280e1f11a5db8a9db" gracePeriod=30
Mar 20 08:43:53.145497 master-0 kubenswrapper[18707]: I0320 08:43:53.143019 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" podUID="a638c468-010c-4da3-ad62-26f5f2bbdbb9" containerName="route-controller-manager" containerID="cri-o://ba86f095c655a63b2bbaa35e0d6ce4a38b529d13f07b295310237aeaa7d4bc4f" gracePeriod=30
Mar 20 08:43:53.482276 master-0 kubenswrapper[18707]: I0320 08:43:53.482102 18707 generic.go:334] "Generic (PLEG): container finished" podID="a638c468-010c-4da3-ad62-26f5f2bbdbb9" containerID="ba86f095c655a63b2bbaa35e0d6ce4a38b529d13f07b295310237aeaa7d4bc4f" exitCode=0
Mar 20 08:43:53.482472 master-0 kubenswrapper[18707]: I0320 08:43:53.482414 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" event={"ID":"a638c468-010c-4da3-ad62-26f5f2bbdbb9","Type":"ContainerDied","Data":"ba86f095c655a63b2bbaa35e0d6ce4a38b529d13f07b295310237aeaa7d4bc4f"}
Mar 20 08:43:53.484948 master-0 kubenswrapper[18707]: I0320 08:43:53.484914 18707 generic.go:334] "Generic (PLEG): container finished" podID="a9a9ecf2-c476-4962-8333-21f242dbcb89" containerID="c11d7390f44d6be5bc3fcba45513f31241f9e0af79908ed280e1f11a5db8a9db" exitCode=0
Mar 20 08:43:53.485378 master-0 kubenswrapper[18707]: I0320 08:43:53.485343 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" event={"ID":"a9a9ecf2-c476-4962-8333-21f242dbcb89","Type":"ContainerDied","Data":"c11d7390f44d6be5bc3fcba45513f31241f9e0af79908ed280e1f11a5db8a9db"}
Mar 20 08:43:53.537815 master-0 kubenswrapper[18707]: I0320 08:43:53.537759 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"
Mar 20 08:43:53.547406 master-0 kubenswrapper[18707]: I0320 08:43:53.547318 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv"
Mar 20 08:43:53.649178 master-0 kubenswrapper[18707]: I0320 08:43:53.649118 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-x2bwv"]
Mar 20 08:43:53.659709 master-0 kubenswrapper[18707]: I0320 08:43:53.659674 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"
Mar 20 08:43:53.661281 master-0 kubenswrapper[18707]: I0320 08:43:53.660316 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a638c468-010c-4da3-ad62-26f5f2bbdbb9-serving-cert\") pod \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") "
Mar 20 08:43:53.661281 master-0 kubenswrapper[18707]: I0320 08:43:53.660396 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-config\") pod \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") "
Mar 20 08:43:53.661281 master-0 kubenswrapper[18707]: I0320 08:43:53.660550 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdbds\" (UniqueName: \"kubernetes.io/projected/a638c468-010c-4da3-ad62-26f5f2bbdbb9-kube-api-access-sdbds\") pod \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") "
Mar 20 08:43:53.661281 master-0 kubenswrapper[18707]: I0320 08:43:53.660668 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-client-ca\") pod \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\" (UID: \"a638c468-010c-4da3-ad62-26f5f2bbdbb9\") "
Mar 20 08:43:53.661281 master-0 kubenswrapper[18707]: I0320 08:43:53.661002 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-config" (OuterVolumeSpecName: "config") pod "a638c468-010c-4da3-ad62-26f5f2bbdbb9" (UID: "a638c468-010c-4da3-ad62-26f5f2bbdbb9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:43:53.662351 master-0 kubenswrapper[18707]: I0320 08:43:53.661421 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-client-ca" (OuterVolumeSpecName: "client-ca") pod "a638c468-010c-4da3-ad62-26f5f2bbdbb9" (UID: "a638c468-010c-4da3-ad62-26f5f2bbdbb9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:43:53.662351 master-0 kubenswrapper[18707]: I0320 08:43:53.661469 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-config\") on node \"master-0\" DevicePath \"\""
Mar 20 08:43:53.663751 master-0 kubenswrapper[18707]: I0320 08:43:53.663707 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a638c468-010c-4da3-ad62-26f5f2bbdbb9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a638c468-010c-4da3-ad62-26f5f2bbdbb9" (UID: "a638c468-010c-4da3-ad62-26f5f2bbdbb9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:43:53.665178 master-0 kubenswrapper[18707]: I0320 08:43:53.665138 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a638c468-010c-4da3-ad62-26f5f2bbdbb9-kube-api-access-sdbds" (OuterVolumeSpecName: "kube-api-access-sdbds") pod "a638c468-010c-4da3-ad62-26f5f2bbdbb9" (UID: "a638c468-010c-4da3-ad62-26f5f2bbdbb9"). InnerVolumeSpecName "kube-api-access-sdbds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:43:53.762964 master-0 kubenswrapper[18707]: I0320 08:43:53.762797 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-proxy-ca-bundles\") pod \"a9a9ecf2-c476-4962-8333-21f242dbcb89\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") "
Mar 20 08:43:53.763223 master-0 kubenswrapper[18707]: I0320 08:43:53.763054 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-client-ca\") pod \"a9a9ecf2-c476-4962-8333-21f242dbcb89\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") "
Mar 20 08:43:53.763616 master-0 kubenswrapper[18707]: I0320 08:43:53.763544 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a9a9ecf2-c476-4962-8333-21f242dbcb89" (UID: "a9a9ecf2-c476-4962-8333-21f242dbcb89"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:43:53.763809 master-0 kubenswrapper[18707]: I0320 08:43:53.763584 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5v9f\" (UniqueName: \"kubernetes.io/projected/a9a9ecf2-c476-4962-8333-21f242dbcb89-kube-api-access-d5v9f\") pod \"a9a9ecf2-c476-4962-8333-21f242dbcb89\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") "
Mar 20 08:43:53.763928 master-0 kubenswrapper[18707]: I0320 08:43:53.763896 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9a9ecf2-c476-4962-8333-21f242dbcb89-serving-cert\") pod \"a9a9ecf2-c476-4962-8333-21f242dbcb89\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") "
Mar 20 08:43:53.763973 master-0 kubenswrapper[18707]: I0320 08:43:53.763901 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-client-ca" (OuterVolumeSpecName: "client-ca") pod "a9a9ecf2-c476-4962-8333-21f242dbcb89" (UID: "a9a9ecf2-c476-4962-8333-21f242dbcb89"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:43:53.764007 master-0 kubenswrapper[18707]: I0320 08:43:53.763983 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-config\") pod \"a9a9ecf2-c476-4962-8333-21f242dbcb89\" (UID: \"a9a9ecf2-c476-4962-8333-21f242dbcb89\") "
Mar 20 08:43:53.764820 master-0 kubenswrapper[18707]: I0320 08:43:53.764769 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-config" (OuterVolumeSpecName: "config") pod "a9a9ecf2-c476-4962-8333-21f242dbcb89" (UID: "a9a9ecf2-c476-4962-8333-21f242dbcb89"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:43:53.765491 master-0 kubenswrapper[18707]: I0320 08:43:53.765448 18707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 20 08:43:53.765541 master-0 kubenswrapper[18707]: I0320 08:43:53.765501 18707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a638c468-010c-4da3-ad62-26f5f2bbdbb9-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:43:53.765541 master-0 kubenswrapper[18707]: I0320 08:43:53.765521 18707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:43:53.765541 master-0 kubenswrapper[18707]: I0320 08:43:53.765534 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdbds\" (UniqueName: \"kubernetes.io/projected/a638c468-010c-4da3-ad62-26f5f2bbdbb9-kube-api-access-sdbds\") on node \"master-0\" DevicePath \"\"" Mar 20 08:43:53.765681 master-0 kubenswrapper[18707]: I0320 08:43:53.765549 18707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a638c468-010c-4da3-ad62-26f5f2bbdbb9-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:43:53.765681 master-0 kubenswrapper[18707]: I0320 08:43:53.765568 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9a9ecf2-c476-4962-8333-21f242dbcb89-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:43:53.767243 master-0 kubenswrapper[18707]: I0320 08:43:53.767147 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9a9ecf2-c476-4962-8333-21f242dbcb89-serving-cert" 
(OuterVolumeSpecName: "serving-cert") pod "a9a9ecf2-c476-4962-8333-21f242dbcb89" (UID: "a9a9ecf2-c476-4962-8333-21f242dbcb89"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:43:53.768692 master-0 kubenswrapper[18707]: I0320 08:43:53.768630 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9a9ecf2-c476-4962-8333-21f242dbcb89-kube-api-access-d5v9f" (OuterVolumeSpecName: "kube-api-access-d5v9f") pod "a9a9ecf2-c476-4962-8333-21f242dbcb89" (UID: "a9a9ecf2-c476-4962-8333-21f242dbcb89"). InnerVolumeSpecName "kube-api-access-d5v9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:43:53.868114 master-0 kubenswrapper[18707]: I0320 08:43:53.868006 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5v9f\" (UniqueName: \"kubernetes.io/projected/a9a9ecf2-c476-4962-8333-21f242dbcb89-kube-api-access-d5v9f\") on node \"master-0\" DevicePath \"\"" Mar 20 08:43:53.868114 master-0 kubenswrapper[18707]: I0320 08:43:53.868068 18707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9a9ecf2-c476-4962-8333-21f242dbcb89-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:43:54.494676 master-0 kubenswrapper[18707]: I0320 08:43:54.494385 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" event={"ID":"a638c468-010c-4da3-ad62-26f5f2bbdbb9","Type":"ContainerDied","Data":"e9d1c009ab8bfc1b5d9e5ffc8e765a713719b93c5edffb019bac7a72776dcb9e"} Mar 20 08:43:54.494676 master-0 kubenswrapper[18707]: I0320 08:43:54.494507 18707 scope.go:117] "RemoveContainer" containerID="ba86f095c655a63b2bbaa35e0d6ce4a38b529d13f07b295310237aeaa7d4bc4f" Mar 20 08:43:54.494676 master-0 kubenswrapper[18707]: I0320 08:43:54.494427 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5" Mar 20 08:43:54.497325 master-0 kubenswrapper[18707]: I0320 08:43:54.497272 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" Mar 20 08:43:54.497417 master-0 kubenswrapper[18707]: I0320 08:43:54.497371 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fc56bb77c-qd4sn" event={"ID":"a9a9ecf2-c476-4962-8333-21f242dbcb89","Type":"ContainerDied","Data":"0885038f20b8301926b3fbf1704c402c0cf75d3a9c64af977d4466bc2f1fe1b6"} Mar 20 08:43:54.513096 master-0 kubenswrapper[18707]: I0320 08:43:54.513039 18707 scope.go:117] "RemoveContainer" containerID="c11d7390f44d6be5bc3fcba45513f31241f9e0af79908ed280e1f11a5db8a9db" Mar 20 08:43:54.552132 master-0 kubenswrapper[18707]: I0320 08:43:54.552073 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"] Mar 20 08:43:54.556767 master-0 kubenswrapper[18707]: I0320 08:43:54.556715 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-fc56bb77c-qd4sn"] Mar 20 08:43:54.569582 master-0 kubenswrapper[18707]: I0320 08:43:54.569383 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"] Mar 20 08:43:54.573639 master-0 kubenswrapper[18707]: I0320 08:43:54.573590 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56f686584b-fdcx5"] Mar 20 08:43:54.646735 master-0 kubenswrapper[18707]: I0320 08:43:54.646508 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bfcf4df58-l6xz7"] Mar 20 08:43:54.652276 master-0 kubenswrapper[18707]: E0320 08:43:54.650686 
18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a638c468-010c-4da3-ad62-26f5f2bbdbb9" containerName="route-controller-manager" Mar 20 08:43:54.652276 master-0 kubenswrapper[18707]: I0320 08:43:54.650746 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a638c468-010c-4da3-ad62-26f5f2bbdbb9" containerName="route-controller-manager" Mar 20 08:43:54.652276 master-0 kubenswrapper[18707]: E0320 08:43:54.650796 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a9ecf2-c476-4962-8333-21f242dbcb89" containerName="controller-manager" Mar 20 08:43:54.652276 master-0 kubenswrapper[18707]: I0320 08:43:54.650813 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a9ecf2-c476-4962-8333-21f242dbcb89" containerName="controller-manager" Mar 20 08:43:54.652276 master-0 kubenswrapper[18707]: I0320 08:43:54.650988 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9a9ecf2-c476-4962-8333-21f242dbcb89" containerName="controller-manager" Mar 20 08:43:54.652276 master-0 kubenswrapper[18707]: I0320 08:43:54.651038 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a638c468-010c-4da3-ad62-26f5f2bbdbb9" containerName="route-controller-manager" Mar 20 08:43:54.652276 master-0 kubenswrapper[18707]: I0320 08:43:54.651681 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.659828 master-0 kubenswrapper[18707]: I0320 08:43:54.659743 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx"] Mar 20 08:43:54.660775 master-0 kubenswrapper[18707]: I0320 08:43:54.660706 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 08:43:54.661055 master-0 kubenswrapper[18707]: I0320 08:43:54.660986 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 08:43:54.661286 master-0 kubenswrapper[18707]: I0320 08:43:54.661249 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 08:43:54.661409 master-0 kubenswrapper[18707]: I0320 08:43:54.661389 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 08:43:54.661606 master-0 kubenswrapper[18707]: I0320 08:43:54.661553 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:43:54.661724 master-0 kubenswrapper[18707]: I0320 08:43:54.661690 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-7d577" Mar 20 08:43:54.663620 master-0 kubenswrapper[18707]: I0320 08:43:54.663520 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 08:43:54.668426 master-0 kubenswrapper[18707]: I0320 08:43:54.668217 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-4smb9" Mar 20 08:43:54.668426 master-0 kubenswrapper[18707]: I0320 08:43:54.668241 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 08:43:54.668701 master-0 kubenswrapper[18707]: I0320 08:43:54.668628 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 08:43:54.668755 master-0 kubenswrapper[18707]: I0320 08:43:54.668637 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 08:43:54.668801 master-0 kubenswrapper[18707]: I0320 08:43:54.668755 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 08:43:54.669536 master-0 kubenswrapper[18707]: I0320 08:43:54.668874 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 08:43:54.672946 master-0 kubenswrapper[18707]: I0320 08:43:54.672858 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bfcf4df58-l6xz7"] Mar 20 08:43:54.682407 master-0 
kubenswrapper[18707]: I0320 08:43:54.682352 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx"] Mar 20 08:43:54.689779 master-0 kubenswrapper[18707]: I0320 08:43:54.689708 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 08:43:54.786614 master-0 kubenswrapper[18707]: I0320 08:43:54.786419 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6d2e841b-2070-42a9-b9c1-74411ddebee4-proxy-ca-bundles\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.786872 master-0 kubenswrapper[18707]: I0320 08:43:54.786748 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp6tc\" (UniqueName: \"kubernetes.io/projected/5e3b82e6-25e8-49f6-bbe7-1365425c4b7f-kube-api-access-cp6tc\") pod \"route-controller-manager-ffdd4b479-rhmfx\" (UID: \"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f\") " pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:43:54.786872 master-0 kubenswrapper[18707]: I0320 08:43:54.786839 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grfwn\" (UniqueName: \"kubernetes.io/projected/6d2e841b-2070-42a9-b9c1-74411ddebee4-kube-api-access-grfwn\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.786987 master-0 kubenswrapper[18707]: I0320 08:43:54.786955 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6d2e841b-2070-42a9-b9c1-74411ddebee4-serving-cert\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.787036 master-0 kubenswrapper[18707]: I0320 08:43:54.786997 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e3b82e6-25e8-49f6-bbe7-1365425c4b7f-serving-cert\") pod \"route-controller-manager-ffdd4b479-rhmfx\" (UID: \"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f\") " pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:43:54.787084 master-0 kubenswrapper[18707]: I0320 08:43:54.787068 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d2e841b-2070-42a9-b9c1-74411ddebee4-client-ca\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.787163 master-0 kubenswrapper[18707]: I0320 08:43:54.787138 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3b82e6-25e8-49f6-bbe7-1365425c4b7f-config\") pod \"route-controller-manager-ffdd4b479-rhmfx\" (UID: \"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f\") " pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:43:54.787267 master-0 kubenswrapper[18707]: I0320 08:43:54.787164 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e3b82e6-25e8-49f6-bbe7-1365425c4b7f-client-ca\") pod \"route-controller-manager-ffdd4b479-rhmfx\" (UID: \"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f\") " 
pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:43:54.787267 master-0 kubenswrapper[18707]: I0320 08:43:54.787248 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d2e841b-2070-42a9-b9c1-74411ddebee4-config\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.890162 master-0 kubenswrapper[18707]: I0320 08:43:54.890072 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp6tc\" (UniqueName: \"kubernetes.io/projected/5e3b82e6-25e8-49f6-bbe7-1365425c4b7f-kube-api-access-cp6tc\") pod \"route-controller-manager-ffdd4b479-rhmfx\" (UID: \"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f\") " pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:43:54.890162 master-0 kubenswrapper[18707]: I0320 08:43:54.890156 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grfwn\" (UniqueName: \"kubernetes.io/projected/6d2e841b-2070-42a9-b9c1-74411ddebee4-kube-api-access-grfwn\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.890555 master-0 kubenswrapper[18707]: I0320 08:43:54.890409 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d2e841b-2070-42a9-b9c1-74411ddebee4-serving-cert\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.890555 master-0 kubenswrapper[18707]: I0320 08:43:54.890449 18707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e3b82e6-25e8-49f6-bbe7-1365425c4b7f-serving-cert\") pod \"route-controller-manager-ffdd4b479-rhmfx\" (UID: \"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f\") " pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:43:54.890692 master-0 kubenswrapper[18707]: I0320 08:43:54.890628 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d2e841b-2070-42a9-b9c1-74411ddebee4-client-ca\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.890773 master-0 kubenswrapper[18707]: I0320 08:43:54.890688 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3b82e6-25e8-49f6-bbe7-1365425c4b7f-config\") pod \"route-controller-manager-ffdd4b479-rhmfx\" (UID: \"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f\") " pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:43:54.891345 master-0 kubenswrapper[18707]: I0320 08:43:54.891303 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e3b82e6-25e8-49f6-bbe7-1365425c4b7f-client-ca\") pod \"route-controller-manager-ffdd4b479-rhmfx\" (UID: \"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f\") " pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:43:54.891478 master-0 kubenswrapper[18707]: I0320 08:43:54.891390 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d2e841b-2070-42a9-b9c1-74411ddebee4-config\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " 
pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.891478 master-0 kubenswrapper[18707]: I0320 08:43:54.891446 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6d2e841b-2070-42a9-b9c1-74411ddebee4-proxy-ca-bundles\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.892119 master-0 kubenswrapper[18707]: I0320 08:43:54.892057 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6d2e841b-2070-42a9-b9c1-74411ddebee4-client-ca\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.892315 master-0 kubenswrapper[18707]: I0320 08:43:54.892271 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3b82e6-25e8-49f6-bbe7-1365425c4b7f-config\") pod \"route-controller-manager-ffdd4b479-rhmfx\" (UID: \"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f\") " pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:43:54.893264 master-0 kubenswrapper[18707]: I0320 08:43:54.892555 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6d2e841b-2070-42a9-b9c1-74411ddebee4-proxy-ca-bundles\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.893264 master-0 kubenswrapper[18707]: I0320 08:43:54.893009 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5e3b82e6-25e8-49f6-bbe7-1365425c4b7f-client-ca\") pod \"route-controller-manager-ffdd4b479-rhmfx\" (UID: \"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f\") " pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:43:54.893264 master-0 kubenswrapper[18707]: I0320 08:43:54.893216 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d2e841b-2070-42a9-b9c1-74411ddebee4-config\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.895215 master-0 kubenswrapper[18707]: I0320 08:43:54.895156 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d2e841b-2070-42a9-b9c1-74411ddebee4-serving-cert\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:54.897171 master-0 kubenswrapper[18707]: I0320 08:43:54.897112 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e3b82e6-25e8-49f6-bbe7-1365425c4b7f-serving-cert\") pod \"route-controller-manager-ffdd4b479-rhmfx\" (UID: \"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f\") " pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:43:54.914153 master-0 kubenswrapper[18707]: I0320 08:43:54.914105 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp6tc\" (UniqueName: \"kubernetes.io/projected/5e3b82e6-25e8-49f6-bbe7-1365425c4b7f-kube-api-access-cp6tc\") pod \"route-controller-manager-ffdd4b479-rhmfx\" (UID: \"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f\") " pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 
08:43:54.923578 master-0 kubenswrapper[18707]: I0320 08:43:54.923513 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grfwn\" (UniqueName: \"kubernetes.io/projected/6d2e841b-2070-42a9-b9c1-74411ddebee4-kube-api-access-grfwn\") pod \"controller-manager-bfcf4df58-l6xz7\" (UID: \"6d2e841b-2070-42a9-b9c1-74411ddebee4\") " pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:55.007311 master-0 kubenswrapper[18707]: I0320 08:43:55.007178 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:43:55.032457 master-0 kubenswrapper[18707]: I0320 08:43:55.032413 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-4smb9" Mar 20 08:43:55.040921 master-0 kubenswrapper[18707]: I0320 08:43:55.040714 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:43:55.110411 master-0 kubenswrapper[18707]: I0320 08:43:55.110349 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a638c468-010c-4da3-ad62-26f5f2bbdbb9" path="/var/lib/kubelet/pods/a638c468-010c-4da3-ad62-26f5f2bbdbb9/volumes" Mar 20 08:43:55.111178 master-0 kubenswrapper[18707]: I0320 08:43:55.111140 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9a9ecf2-c476-4962-8333-21f242dbcb89" path="/var/lib/kubelet/pods/a9a9ecf2-c476-4962-8333-21f242dbcb89/volumes" Mar 20 08:43:55.473864 master-0 kubenswrapper[18707]: I0320 08:43:55.473799 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bfcf4df58-l6xz7"] Mar 20 08:43:55.504217 master-0 kubenswrapper[18707]: I0320 08:43:55.504101 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv" podUID="e6807720-382e-4b04-ad82-35cb4c225138" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" gracePeriod=30 Mar 20 08:43:55.561756 master-0 kubenswrapper[18707]: I0320 08:43:55.561697 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx"] Mar 20 08:43:55.566267 master-0 kubenswrapper[18707]: W0320 08:43:55.566157 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3b82e6_25e8_49f6_bbe7_1365425c4b7f.slice/crio-11d4ac0042bbce233ce5de1e1f61d7ea5464463cd44991960274885392d66df9 WatchSource:0}: Error finding container 11d4ac0042bbce233ce5de1e1f61d7ea5464463cd44991960274885392d66df9: Status 404 returned error can't find the container with id 
11d4ac0042bbce233ce5de1e1f61d7ea5464463cd44991960274885392d66df9
Mar 20 08:43:56.513383 master-0 kubenswrapper[18707]: I0320 08:43:56.513312 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" event={"ID":"6d2e841b-2070-42a9-b9c1-74411ddebee4","Type":"ContainerStarted","Data":"9bde44a84ae0d5d3ed2a7201c7b97eb3264198bec510b101380ad7b4c98aa7b7"}
Mar 20 08:43:56.513383 master-0 kubenswrapper[18707]: I0320 08:43:56.513375 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" event={"ID":"6d2e841b-2070-42a9-b9c1-74411ddebee4","Type":"ContainerStarted","Data":"08536f61b17c787f91abe689cbe986178625dea9dc295b50fa754eddcb9767d8"}
Mar 20 08:43:56.514267 master-0 kubenswrapper[18707]: I0320 08:43:56.513713 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7"
Mar 20 08:43:56.517342 master-0 kubenswrapper[18707]: I0320 08:43:56.517292 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" event={"ID":"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f","Type":"ContainerStarted","Data":"3229c4e0fdc2d8c60bad75326ff1f2872340ec6b674781d5e8f4649fb7a07f12"}
Mar 20 08:43:56.517883 master-0 kubenswrapper[18707]: I0320 08:43:56.517865 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx"
Mar 20 08:43:56.517970 master-0 kubenswrapper[18707]: I0320 08:43:56.517954 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" event={"ID":"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f","Type":"ContainerStarted","Data":"11d4ac0042bbce233ce5de1e1f61d7ea5464463cd44991960274885392d66df9"}
Mar 20 08:43:56.520379 master-0 kubenswrapper[18707]: I0320 08:43:56.520359 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7"
Mar 20 08:43:56.526012 master-0 kubenswrapper[18707]: I0320 08:43:56.522326 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx"
Mar 20 08:43:56.542404 master-0 kubenswrapper[18707]: I0320 08:43:56.541441 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" podStartSLOduration=3.541417987 podStartE2EDuration="3.541417987s" podCreationTimestamp="2026-03-20 08:43:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:43:56.539645655 +0000 UTC m=+181.695826011" watchObservedRunningTime="2026-03-20 08:43:56.541417987 +0000 UTC m=+181.697598343"
Mar 20 08:43:56.601670 master-0 kubenswrapper[18707]: I0320 08:43:56.601577 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" podStartSLOduration=3.601550042 podStartE2EDuration="3.601550042s" podCreationTimestamp="2026-03-20 08:43:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:43:56.600565153 +0000 UTC m=+181.756745529" watchObservedRunningTime="2026-03-20 08:43:56.601550042 +0000 UTC m=+181.757730398"
Mar 20 08:43:56.688920 master-0 kubenswrapper[18707]: I0320 08:43:56.688833 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-54684fb888-p62jk"]
Mar 20 08:43:56.691860 master-0 kubenswrapper[18707]: I0320 08:43:56.691769 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.698606 master-0 kubenswrapper[18707]: I0320 08:43:56.697783 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client"
Mar 20 08:43:56.698606 master-0 kubenswrapper[18707]: I0320 08:43:56.698080 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config"
Mar 20 08:43:56.698606 master-0 kubenswrapper[18707]: I0320 08:43:56.698342 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle"
Mar 20 08:43:56.698606 master-0 kubenswrapper[18707]: I0320 08:43:56.698361 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-54684fb888-p62jk"]
Mar 20 08:43:56.698606 master-0 kubenswrapper[18707]: I0320 08:43:56.698510 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-v9mgl"
Mar 20 08:43:56.698956 master-0 kubenswrapper[18707]: I0320 08:43:56.698652 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 20 08:43:56.698990 master-0 kubenswrapper[18707]: I0320 08:43:56.698963 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs"
Mar 20 08:43:56.702678 master-0 kubenswrapper[18707]: I0320 08:43:56.702493 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38"
Mar 20 08:43:56.831918 master-0 kubenswrapper[18707]: I0320 08:43:56.831836 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.832308 master-0 kubenswrapper[18707]: I0320 08:43:56.831949 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-telemeter-client-tls\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.832308 master-0 kubenswrapper[18707]: I0320 08:43:56.831999 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d7nm\" (UniqueName: \"kubernetes.io/projected/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-kube-api-access-8d7nm\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.832308 master-0 kubenswrapper[18707]: I0320 08:43:56.832171 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-serving-certs-ca-bundle\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.832308 master-0 kubenswrapper[18707]: I0320 08:43:56.832264 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.832509 master-0 kubenswrapper[18707]: I0320 08:43:56.832341 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-metrics-client-ca\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.832509 master-0 kubenswrapper[18707]: I0320 08:43:56.832432 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-federate-client-tls\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.832614 master-0 kubenswrapper[18707]: I0320 08:43:56.832560 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-secret-telemeter-client\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.934527 master-0 kubenswrapper[18707]: I0320 08:43:56.934452 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.934865 master-0 kubenswrapper[18707]: I0320 08:43:56.934584 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-telemeter-client-tls\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.934865 master-0 kubenswrapper[18707]: I0320 08:43:56.934644 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d7nm\" (UniqueName: \"kubernetes.io/projected/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-kube-api-access-8d7nm\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.935235 master-0 kubenswrapper[18707]: I0320 08:43:56.935136 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-serving-certs-ca-bundle\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.935612 master-0 kubenswrapper[18707]: I0320 08:43:56.935546 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.935737 master-0 kubenswrapper[18707]: I0320 08:43:56.935704 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-metrics-client-ca\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.935855 master-0 kubenswrapper[18707]: I0320 08:43:56.935797 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-federate-client-tls\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.935918 master-0 kubenswrapper[18707]: I0320 08:43:56.935883 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-secret-telemeter-client\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.935918 master-0 kubenswrapper[18707]: I0320 08:43:56.935887 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.936175 master-0 kubenswrapper[18707]: I0320 08:43:56.936147 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-serving-certs-ca-bundle\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.936726 master-0 kubenswrapper[18707]: I0320 08:43:56.936693 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-metrics-client-ca\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.938468 master-0 kubenswrapper[18707]: I0320 08:43:56.938417 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-telemeter-client-tls\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.940283 master-0 kubenswrapper[18707]: I0320 08:43:56.940017 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.941287 master-0 kubenswrapper[18707]: I0320 08:43:56.941232 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-secret-telemeter-client\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.944157 master-0 kubenswrapper[18707]: I0320 08:43:56.944117 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-federate-client-tls\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:56.956505 master-0 kubenswrapper[18707]: I0320 08:43:56.956466 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d7nm\" (UniqueName: \"kubernetes.io/projected/1ccc9acf-1a3e-4303-8b0e-c610eb9ce529-kube-api-access-8d7nm\") pod \"telemeter-client-54684fb888-p62jk\" (UID: \"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529\") " pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:57.020766 master-0 kubenswrapper[18707]: I0320 08:43:57.020689 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-54684fb888-p62jk"
Mar 20 08:43:57.489747 master-0 kubenswrapper[18707]: I0320 08:43:57.489647 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-54684fb888-p62jk"]
Mar 20 08:43:57.529619 master-0 kubenswrapper[18707]: I0320 08:43:57.529533 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54684fb888-p62jk" event={"ID":"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529","Type":"ContainerStarted","Data":"ef32225945ba8c167b6c8ae98da593aae8121ff193bf19dea143d0e29af044fc"}
Mar 20 08:44:00.567827 master-0 kubenswrapper[18707]: I0320 08:44:00.567087 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54684fb888-p62jk" event={"ID":"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529","Type":"ContainerStarted","Data":"38b3f291d969d7f7fdae83f65df16381bdad86e5a1ff9a2f229c9a0f57258dc1"}
Mar 20 08:44:01.237379 master-0 kubenswrapper[18707]: I0320 08:44:01.236541 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-649577484c-p72cd"]
Mar 20 08:44:01.238138 master-0 kubenswrapper[18707]: I0320 08:44:01.238060 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-649577484c-p72cd"
Mar 20 08:44:01.241586 master-0 kubenswrapper[18707]: I0320 08:44:01.241521 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-kv9b9"
Mar 20 08:44:01.248558 master-0 kubenswrapper[18707]: I0320 08:44:01.246800 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-649577484c-p72cd"]
Mar 20 08:44:01.424685 master-0 kubenswrapper[18707]: I0320 08:44:01.424615 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546-webhook-certs\") pod \"multus-admission-controller-649577484c-p72cd\" (UID: \"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546\") " pod="openshift-multus/multus-admission-controller-649577484c-p72cd"
Mar 20 08:44:01.424867 master-0 kubenswrapper[18707]: I0320 08:44:01.424710 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sffd6\" (UniqueName: \"kubernetes.io/projected/9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546-kube-api-access-sffd6\") pod \"multus-admission-controller-649577484c-p72cd\" (UID: \"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546\") " pod="openshift-multus/multus-admission-controller-649577484c-p72cd"
Mar 20 08:44:01.526819 master-0 kubenswrapper[18707]: I0320 08:44:01.526752 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sffd6\" (UniqueName: \"kubernetes.io/projected/9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546-kube-api-access-sffd6\") pod \"multus-admission-controller-649577484c-p72cd\" (UID: \"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546\") " pod="openshift-multus/multus-admission-controller-649577484c-p72cd"
Mar 20 08:44:01.527062 master-0 kubenswrapper[18707]: I0320 08:44:01.527025 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546-webhook-certs\") pod \"multus-admission-controller-649577484c-p72cd\" (UID: \"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546\") " pod="openshift-multus/multus-admission-controller-649577484c-p72cd"
Mar 20 08:44:01.531416 master-0 kubenswrapper[18707]: I0320 08:44:01.531372 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546-webhook-certs\") pod \"multus-admission-controller-649577484c-p72cd\" (UID: \"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546\") " pod="openshift-multus/multus-admission-controller-649577484c-p72cd"
Mar 20 08:44:01.547129 master-0 kubenswrapper[18707]: I0320 08:44:01.547069 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sffd6\" (UniqueName: \"kubernetes.io/projected/9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546-kube-api-access-sffd6\") pod \"multus-admission-controller-649577484c-p72cd\" (UID: \"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546\") " pod="openshift-multus/multus-admission-controller-649577484c-p72cd"
Mar 20 08:44:01.577100 master-0 kubenswrapper[18707]: I0320 08:44:01.577009 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-649577484c-p72cd"
Mar 20 08:44:01.606822 master-0 kubenswrapper[18707]: I0320 08:44:01.606336 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54684fb888-p62jk" event={"ID":"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529","Type":"ContainerStarted","Data":"e9e300efcdf0a4ad9c2e06ffa57cd633c3b20e510b0af6df21adddd4f6906a44"}
Mar 20 08:44:01.607847 master-0 kubenswrapper[18707]: I0320 08:44:01.607709 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54684fb888-p62jk" event={"ID":"1ccc9acf-1a3e-4303-8b0e-c610eb9ce529","Type":"ContainerStarted","Data":"cb76b6a7768780264bb5d08b21c0d3f27de5e0f7fd61bf898f5f55540a348667"}
Mar 20 08:44:01.653950 master-0 kubenswrapper[18707]: I0320 08:44:01.650116 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-54684fb888-p62jk" podStartSLOduration=1.9095953030000001 podStartE2EDuration="5.650024329s" podCreationTimestamp="2026-03-20 08:43:56 +0000 UTC" firstStartedPulling="2026-03-20 08:43:57.498146847 +0000 UTC m=+182.654327213" lastFinishedPulling="2026-03-20 08:44:01.238575883 +0000 UTC m=+186.394756239" observedRunningTime="2026-03-20 08:44:01.640200353 +0000 UTC m=+186.796380709" watchObservedRunningTime="2026-03-20 08:44:01.650024329 +0000 UTC m=+186.806204685"
Mar 20 08:44:02.001436 master-0 kubenswrapper[18707]: E0320 08:44:02.001324 18707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 20 08:44:02.003279 master-0 kubenswrapper[18707]: E0320 08:44:02.003213 18707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 20 08:44:02.004836 master-0 kubenswrapper[18707]: E0320 08:44:02.004758 18707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 20 08:44:02.004946 master-0 kubenswrapper[18707]: E0320 08:44:02.004847 18707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv" podUID="e6807720-382e-4b04-ad82-35cb4c225138" containerName="kube-multus-additional-cni-plugins"
Mar 20 08:44:02.037936 master-0 kubenswrapper[18707]: I0320 08:44:02.037874 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-649577484c-p72cd"]
Mar 20 08:44:02.615347 master-0 kubenswrapper[18707]: I0320 08:44:02.615272 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-649577484c-p72cd" event={"ID":"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546","Type":"ContainerStarted","Data":"2e6d9acc962059a71851afc29bfffa1cd8b01261ca8f4794a3d9883228efc778"}
Mar 20 08:44:02.615347 master-0 kubenswrapper[18707]: I0320 08:44:02.615320 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-649577484c-p72cd" event={"ID":"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546","Type":"ContainerStarted","Data":"70c28deb955d7c56353c6f5f88a7741aa086cb22ed0b1cbdc8f73512b3523f0a"}
Mar 20 08:44:03.630208 master-0 kubenswrapper[18707]: I0320 08:44:03.630119 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-649577484c-p72cd" event={"ID":"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546","Type":"ContainerStarted","Data":"ac514f213147fbb98ac1f2791391ac7cc4bbf81039d4cbb379d8da7b1cb3d38b"}
Mar 20 08:44:03.671018 master-0 kubenswrapper[18707]: I0320 08:44:03.670324 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-649577484c-p72cd" podStartSLOduration=2.670300317 podStartE2EDuration="2.670300317s" podCreationTimestamp="2026-03-20 08:44:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:44:03.663173969 +0000 UTC m=+188.819354335" watchObservedRunningTime="2026-03-20 08:44:03.670300317 +0000 UTC m=+188.826480683"
Mar 20 08:44:03.760459 master-0 kubenswrapper[18707]: I0320 08:44:03.760378 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"]
Mar 20 08:44:03.760768 master-0 kubenswrapper[18707]: I0320 08:44:03.760725 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" podUID="6a80bd6f-2263-4251-8197-5173193f8afc" containerName="multus-admission-controller" containerID="cri-o://199be5e4e7ebbf5d512a41b53ac83599e0d0e19351865228cfbac1cc0f5d6aa8" gracePeriod=30
Mar 20 08:44:03.760943 master-0 kubenswrapper[18707]: I0320 08:44:03.760915 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" podUID="6a80bd6f-2263-4251-8197-5173193f8afc" containerName="kube-rbac-proxy" containerID="cri-o://d02eea2683ac6efae0f70fd8f717e97bc121a80e06da8071bf5f8c5bd619e9f0" gracePeriod=30
Mar 20 08:44:04.642842 master-0 kubenswrapper[18707]: I0320 08:44:04.642753 18707 generic.go:334] "Generic (PLEG): container finished" podID="6a80bd6f-2263-4251-8197-5173193f8afc" containerID="d02eea2683ac6efae0f70fd8f717e97bc121a80e06da8071bf5f8c5bd619e9f0" exitCode=0
Mar 20 08:44:04.643770 master-0 kubenswrapper[18707]: I0320 08:44:04.642863 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" event={"ID":"6a80bd6f-2263-4251-8197-5173193f8afc","Type":"ContainerDied","Data":"d02eea2683ac6efae0f70fd8f717e97bc121a80e06da8071bf5f8c5bd619e9f0"}
Mar 20 08:44:07.457053 master-0 kubenswrapper[18707]: I0320 08:44:07.456962 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"]
Mar 20 08:44:07.458359 master-0 kubenswrapper[18707]: I0320 08:44:07.458325 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Mar 20 08:44:07.461678 master-0 kubenswrapper[18707]: I0320 08:44:07.461607 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt"
Mar 20 08:44:07.461901 master-0 kubenswrapper[18707]: I0320 08:44:07.461871 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-x5lzc"
Mar 20 08:44:07.471967 master-0 kubenswrapper[18707]: I0320 08:44:07.471923 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"]
Mar 20 08:44:07.647749 master-0 kubenswrapper[18707]: I0320 08:44:07.647586 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3eda9567-712b-4541-9344-a333e7734fed-var-lock\") pod \"installer-2-master-0\" (UID: \"3eda9567-712b-4541-9344-a333e7734fed\") " pod="openshift-etcd/installer-2-master-0"
Mar 20 08:44:07.647749 master-0 kubenswrapper[18707]: I0320 08:44:07.647729 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eda9567-712b-4541-9344-a333e7734fed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"3eda9567-712b-4541-9344-a333e7734fed\") " pod="openshift-etcd/installer-2-master-0"
Mar 20 08:44:07.648103 master-0 kubenswrapper[18707]: I0320 08:44:07.647899 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eda9567-712b-4541-9344-a333e7734fed-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"3eda9567-712b-4541-9344-a333e7734fed\") " pod="openshift-etcd/installer-2-master-0"
Mar 20 08:44:07.749464 master-0 kubenswrapper[18707]: I0320 08:44:07.749269 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eda9567-712b-4541-9344-a333e7734fed-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"3eda9567-712b-4541-9344-a333e7734fed\") " pod="openshift-etcd/installer-2-master-0"
Mar 20 08:44:07.749464 master-0 kubenswrapper[18707]: I0320 08:44:07.749393 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3eda9567-712b-4541-9344-a333e7734fed-var-lock\") pod \"installer-2-master-0\" (UID: \"3eda9567-712b-4541-9344-a333e7734fed\") " pod="openshift-etcd/installer-2-master-0"
Mar 20 08:44:07.749464 master-0 kubenswrapper[18707]: I0320 08:44:07.749423 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eda9567-712b-4541-9344-a333e7734fed-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"3eda9567-712b-4541-9344-a333e7734fed\") " pod="openshift-etcd/installer-2-master-0"
Mar 20 08:44:07.749464 master-0 kubenswrapper[18707]: I0320 08:44:07.749441 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eda9567-712b-4541-9344-a333e7734fed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"3eda9567-712b-4541-9344-a333e7734fed\") " pod="openshift-etcd/installer-2-master-0"
Mar 20 08:44:07.750096 master-0 kubenswrapper[18707]: I0320 08:44:07.749571 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3eda9567-712b-4541-9344-a333e7734fed-var-lock\") pod \"installer-2-master-0\" (UID: \"3eda9567-712b-4541-9344-a333e7734fed\") " pod="openshift-etcd/installer-2-master-0"
Mar 20 08:44:07.785416 master-0 kubenswrapper[18707]: I0320 08:44:07.785310 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eda9567-712b-4541-9344-a333e7734fed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"3eda9567-712b-4541-9344-a333e7734fed\") " pod="openshift-etcd/installer-2-master-0"
Mar 20 08:44:07.825254 master-0 kubenswrapper[18707]: I0320 08:44:07.825143 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Mar 20 08:44:08.253089 master-0 kubenswrapper[18707]: I0320 08:44:08.253021 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"]
Mar 20 08:44:08.679424 master-0 kubenswrapper[18707]: I0320 08:44:08.679119 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"3eda9567-712b-4541-9344-a333e7734fed","Type":"ContainerStarted","Data":"6a965a004a799a7227efab7ec68c05de7401e629ea703cff56c404bc2ecb8d83"}
Mar 20 08:44:09.688701 master-0 kubenswrapper[18707]: I0320 08:44:09.688502 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"3eda9567-712b-4541-9344-a333e7734fed","Type":"ContainerStarted","Data":"6f05dd6a29969585010f10aa13bff6ed73728734772cadb9dab99cd2906be079"}
Mar 20 08:44:09.706383 master-0 kubenswrapper[18707]: I0320 08:44:09.706285 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=2.706255902 podStartE2EDuration="2.706255902s" podCreationTimestamp="2026-03-20 08:44:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:44:09.705389557 +0000 UTC m=+194.861569953" watchObservedRunningTime="2026-03-20 08:44:09.706255902 +0000 UTC m=+194.862436268"
Mar 20 08:44:12.001106 master-0 kubenswrapper[18707]: E0320 08:44:12.001016 18707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 20 08:44:12.002806 master-0 kubenswrapper[18707]: E0320 08:44:12.002710 18707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 20 08:44:12.004109 master-0 kubenswrapper[18707]: E0320 08:44:12.004070 18707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 20 08:44:12.004294 master-0 kubenswrapper[18707]: E0320 08:44:12.004261 18707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv" podUID="e6807720-382e-4b04-ad82-35cb4c225138" containerName="kube-multus-additional-cni-plugins"
Mar 20 08:44:18.436293 master-0 kubenswrapper[18707]: I0320 08:44:18.436182 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"]
Mar 20 08:44:18.437683 master-0 kubenswrapper[18707]: I0320 08:44:18.437646 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"
Mar 20 08:44:18.440289 master-0 kubenswrapper[18707]: I0320 08:44:18.440241 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 20 08:44:18.440495 master-0 kubenswrapper[18707]: I0320 08:44:18.440454 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 08:44:18.452959 master-0 kubenswrapper[18707]: I0320 08:44:18.452862 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"]
Mar 20 08:44:18.553124 master-0 kubenswrapper[18707]: I0320 08:44:18.553056 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/044870dd-540a-402e-84cb-fa1bf3d6a318-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-zbpk4\" (UID: \"044870dd-540a-402e-84cb-fa1bf3d6a318\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"
Mar 20 08:44:18.553618 master-0 kubenswrapper[18707]: I0320 08:44:18.553591 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-zbpk4\" (UID: \"044870dd-540a-402e-84cb-fa1bf3d6a318\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"
Mar 20 08:44:18.658636 master-0 kubenswrapper[18707]: I0320 08:44:18.658563 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/044870dd-540a-402e-84cb-fa1bf3d6a318-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-zbpk4\" (UID: \"044870dd-540a-402e-84cb-fa1bf3d6a318\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"
Mar 20 08:44:18.659111 master-0 kubenswrapper[18707]: I0320 08:44:18.659082 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-zbpk4\" (UID: \"044870dd-540a-402e-84cb-fa1bf3d6a318\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"
Mar 20 08:44:18.659360 master-0 kubenswrapper[18707]: E0320 08:44:18.659311 18707 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 20 08:44:18.659453 master-0 kubenswrapper[18707]: E0320 08:44:18.659395 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert podName:044870dd-540a-402e-84cb-fa1bf3d6a318 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:19.159375009 +0000 UTC m=+204.315555365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert") pod "networking-console-plugin-7c6b76c555-zbpk4" (UID: "044870dd-540a-402e-84cb-fa1bf3d6a318") : secret "networking-console-plugin-cert" not found
Mar 20 08:44:18.659877 master-0 kubenswrapper[18707]: I0320 08:44:18.659802 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/044870dd-540a-402e-84cb-fa1bf3d6a318-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-zbpk4\" (UID: \"044870dd-540a-402e-84cb-fa1bf3d6a318\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"
Mar 20 08:44:19.165962 master-0 kubenswrapper[18707]: I0320 08:44:19.165847 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-zbpk4\" (UID: \"044870dd-540a-402e-84cb-fa1bf3d6a318\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"
Mar 20 08:44:19.166359 master-0 kubenswrapper[18707]: E0320 08:44:19.166042 18707 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 20 08:44:19.166359 master-0 kubenswrapper[18707]: E0320 08:44:19.166135 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert podName:044870dd-540a-402e-84cb-fa1bf3d6a318 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:20.166103716 +0000 UTC m=+205.322284082 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert") pod "networking-console-plugin-7c6b76c555-zbpk4" (UID: "044870dd-540a-402e-84cb-fa1bf3d6a318") : secret "networking-console-plugin-cert" not found Mar 20 08:44:20.188918 master-0 kubenswrapper[18707]: I0320 08:44:20.188820 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-zbpk4\" (UID: \"044870dd-540a-402e-84cb-fa1bf3d6a318\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:44:20.189970 master-0 kubenswrapper[18707]: E0320 08:44:20.189212 18707 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 20 08:44:20.189970 master-0 kubenswrapper[18707]: E0320 08:44:20.189349 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert podName:044870dd-540a-402e-84cb-fa1bf3d6a318 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:22.189308656 +0000 UTC m=+207.345489312 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert") pod "networking-console-plugin-7c6b76c555-zbpk4" (UID: "044870dd-540a-402e-84cb-fa1bf3d6a318") : secret "networking-console-plugin-cert" not found Mar 20 08:44:21.695587 master-0 kubenswrapper[18707]: I0320 08:44:21.695504 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5dc898db58-5m6qg"] Mar 20 08:44:21.697762 master-0 kubenswrapper[18707]: I0320 08:44:21.697714 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.701323 master-0 kubenswrapper[18707]: I0320 08:44:21.701277 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 20 08:44:21.701530 master-0 kubenswrapper[18707]: I0320 08:44:21.701489 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 20 08:44:21.703707 master-0 kubenswrapper[18707]: I0320 08:44:21.703658 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 20 08:44:21.704110 master-0 kubenswrapper[18707]: I0320 08:44:21.704073 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-anne24rv49795" Mar 20 08:44:21.707784 master-0 kubenswrapper[18707]: I0320 08:44:21.707736 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 20 08:44:21.709502 master-0 kubenswrapper[18707]: I0320 08:44:21.709431 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 20 08:44:21.716056 master-0 kubenswrapper[18707]: 
I0320 08:44:21.715984 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn99k\" (UniqueName: \"kubernetes.io/projected/cbcefc08-9a02-4ab3-86be-cad330c447b8-kube-api-access-hn99k\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.716283 master-0 kubenswrapper[18707]: I0320 08:44:21.716234 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.716401 master-0 kubenswrapper[18707]: I0320 08:44:21.716348 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.716486 master-0 kubenswrapper[18707]: I0320 08:44:21.716427 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-grpc-tls\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.716565 master-0 kubenswrapper[18707]: I0320 08:44:21.716488 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.716637 master-0 kubenswrapper[18707]: I0320 08:44:21.716572 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-tls\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.716637 master-0 kubenswrapper[18707]: I0320 08:44:21.716624 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.716769 master-0 kubenswrapper[18707]: I0320 08:44:21.716749 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cbcefc08-9a02-4ab3-86be-cad330c447b8-metrics-client-ca\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.728997 master-0 kubenswrapper[18707]: I0320 08:44:21.728940 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5dc898db58-5m6qg"] Mar 20 08:44:21.818749 master-0 kubenswrapper[18707]: I0320 08:44:21.818681 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.819021 master-0 kubenswrapper[18707]: I0320 08:44:21.818768 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cbcefc08-9a02-4ab3-86be-cad330c447b8-metrics-client-ca\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.819217 master-0 kubenswrapper[18707]: I0320 08:44:21.819135 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn99k\" (UniqueName: \"kubernetes.io/projected/cbcefc08-9a02-4ab3-86be-cad330c447b8-kube-api-access-hn99k\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.819314 master-0 kubenswrapper[18707]: I0320 08:44:21.819284 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.819374 master-0 kubenswrapper[18707]: I0320 08:44:21.819344 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.819421 master-0 kubenswrapper[18707]: I0320 08:44:21.819375 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-grpc-tls\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.819421 master-0 kubenswrapper[18707]: I0320 08:44:21.819412 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.819905 master-0 kubenswrapper[18707]: I0320 08:44:21.819872 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cbcefc08-9a02-4ab3-86be-cad330c447b8-metrics-client-ca\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.821065 master-0 kubenswrapper[18707]: I0320 08:44:21.819992 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-tls\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " 
pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.822423 master-0 kubenswrapper[18707]: I0320 08:44:21.822390 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.823974 master-0 kubenswrapper[18707]: I0320 08:44:21.823901 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.824055 master-0 kubenswrapper[18707]: I0320 08:44:21.823919 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-grpc-tls\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.828047 master-0 kubenswrapper[18707]: I0320 08:44:21.828001 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-tls\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.828466 master-0 kubenswrapper[18707]: I0320 08:44:21.828429 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.828556 master-0 kubenswrapper[18707]: I0320 08:44:21.828475 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cbcefc08-9a02-4ab3-86be-cad330c447b8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:21.839528 master-0 kubenswrapper[18707]: I0320 08:44:21.839481 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn99k\" (UniqueName: \"kubernetes.io/projected/cbcefc08-9a02-4ab3-86be-cad330c447b8-kube-api-access-hn99k\") pod \"thanos-querier-5dc898db58-5m6qg\" (UID: \"cbcefc08-9a02-4ab3-86be-cad330c447b8\") " pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:22.000925 master-0 kubenswrapper[18707]: E0320 08:44:22.000706 18707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 08:44:22.002664 master-0 kubenswrapper[18707]: E0320 08:44:22.002547 18707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" cmd=["/bin/bash","-c","test -f 
/ready/ready"] Mar 20 08:44:22.004125 master-0 kubenswrapper[18707]: E0320 08:44:22.004047 18707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 08:44:22.004216 master-0 kubenswrapper[18707]: E0320 08:44:22.004153 18707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv" podUID="e6807720-382e-4b04-ad82-35cb4c225138" containerName="kube-multus-additional-cni-plugins" Mar 20 08:44:22.026445 master-0 kubenswrapper[18707]: I0320 08:44:22.026329 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:22.227926 master-0 kubenswrapper[18707]: I0320 08:44:22.227843 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-zbpk4\" (UID: \"044870dd-540a-402e-84cb-fa1bf3d6a318\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:44:22.228286 master-0 kubenswrapper[18707]: E0320 08:44:22.228107 18707 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 20 08:44:22.228286 master-0 kubenswrapper[18707]: E0320 08:44:22.228271 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert 
podName:044870dd-540a-402e-84cb-fa1bf3d6a318 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:26.228223097 +0000 UTC m=+211.384403453 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert") pod "networking-console-plugin-7c6b76c555-zbpk4" (UID: "044870dd-540a-402e-84cb-fa1bf3d6a318") : secret "networking-console-plugin-cert" not found Mar 20 08:44:22.465352 master-0 kubenswrapper[18707]: I0320 08:44:22.465306 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5dc898db58-5m6qg"] Mar 20 08:44:22.792382 master-0 kubenswrapper[18707]: I0320 08:44:22.792214 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" event={"ID":"cbcefc08-9a02-4ab3-86be-cad330c447b8","Type":"ContainerStarted","Data":"56ca4af1b577cf525b50a46a2ea1a7791a9feae8897ebb8903edee645584d8f5"} Mar 20 08:44:24.438122 master-0 kubenswrapper[18707]: I0320 08:44:24.437990 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-646d59cb8b-622bl"] Mar 20 08:44:24.447048 master-0 kubenswrapper[18707]: I0320 08:44:24.439329 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.447048 master-0 kubenswrapper[18707]: I0320 08:44:24.441980 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-15j6gm4rb8461" Mar 20 08:44:24.456481 master-0 kubenswrapper[18707]: I0320 08:44:24.456387 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-64c67d44c4-s7vfs"] Mar 20 08:44:24.456958 master-0 kubenswrapper[18707]: I0320 08:44:24.456828 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" podUID="a69e8d3a-a0b1-4688-8631-d9f265aa4c69" containerName="metrics-server" containerID="cri-o://3fbc2ee15fb564cff30cd6bea616239c5ddcc5c59f79a881b3626f244e98915b" gracePeriod=170 Mar 20 08:44:24.469053 master-0 kubenswrapper[18707]: I0320 08:44:24.468979 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-646d59cb8b-622bl"] Mar 20 08:44:24.479293 master-0 kubenswrapper[18707]: I0320 08:44:24.479251 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztdp4\" (UniqueName: \"kubernetes.io/projected/415d7f64-0fdd-4d6f-af37-fc928f35dde8-kube-api-access-ztdp4\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.479583 master-0 kubenswrapper[18707]: I0320 08:44:24.479373 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415d7f64-0fdd-4d6f-af37-fc928f35dde8-client-ca-bundle\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.479583 master-0 
kubenswrapper[18707]: I0320 08:44:24.479484 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/415d7f64-0fdd-4d6f-af37-fc928f35dde8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.479583 master-0 kubenswrapper[18707]: I0320 08:44:24.479518 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/415d7f64-0fdd-4d6f-af37-fc928f35dde8-audit-log\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.479583 master-0 kubenswrapper[18707]: I0320 08:44:24.479572 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/415d7f64-0fdd-4d6f-af37-fc928f35dde8-secret-metrics-server-tls\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.479725 master-0 kubenswrapper[18707]: I0320 08:44:24.479689 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/415d7f64-0fdd-4d6f-af37-fc928f35dde8-metrics-server-audit-profiles\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.479850 master-0 kubenswrapper[18707]: I0320 08:44:24.479830 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/415d7f64-0fdd-4d6f-af37-fc928f35dde8-secret-metrics-client-certs\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.581710 master-0 kubenswrapper[18707]: I0320 08:44:24.581603 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/415d7f64-0fdd-4d6f-af37-fc928f35dde8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.581710 master-0 kubenswrapper[18707]: I0320 08:44:24.581723 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/415d7f64-0fdd-4d6f-af37-fc928f35dde8-audit-log\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.582079 master-0 kubenswrapper[18707]: I0320 08:44:24.581770 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/415d7f64-0fdd-4d6f-af37-fc928f35dde8-secret-metrics-server-tls\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.582079 master-0 kubenswrapper[18707]: I0320 08:44:24.581815 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/415d7f64-0fdd-4d6f-af37-fc928f35dde8-metrics-server-audit-profiles\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " 
pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.582804 master-0 kubenswrapper[18707]: I0320 08:44:24.582743 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/415d7f64-0fdd-4d6f-af37-fc928f35dde8-secret-metrics-client-certs\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.583168 master-0 kubenswrapper[18707]: I0320 08:44:24.583137 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/415d7f64-0fdd-4d6f-af37-fc928f35dde8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.583387 master-0 kubenswrapper[18707]: I0320 08:44:24.583326 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztdp4\" (UniqueName: \"kubernetes.io/projected/415d7f64-0fdd-4d6f-af37-fc928f35dde8-kube-api-access-ztdp4\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.583487 master-0 kubenswrapper[18707]: I0320 08:44:24.583471 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415d7f64-0fdd-4d6f-af37-fc928f35dde8-client-ca-bundle\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.584742 master-0 kubenswrapper[18707]: I0320 08:44:24.584718 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/415d7f64-0fdd-4d6f-af37-fc928f35dde8-audit-log\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.586785 master-0 kubenswrapper[18707]: I0320 08:44:24.586741 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/415d7f64-0fdd-4d6f-af37-fc928f35dde8-metrics-server-audit-profiles\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.589496 master-0 kubenswrapper[18707]: I0320 08:44:24.589445 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/415d7f64-0fdd-4d6f-af37-fc928f35dde8-secret-metrics-client-certs\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.591875 master-0 kubenswrapper[18707]: I0320 08:44:24.591731 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415d7f64-0fdd-4d6f-af37-fc928f35dde8-client-ca-bundle\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.597559 master-0 kubenswrapper[18707]: I0320 08:44:24.595846 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/415d7f64-0fdd-4d6f-af37-fc928f35dde8-secret-metrics-server-tls\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.604658 master-0 kubenswrapper[18707]: 
I0320 08:44:24.604612 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztdp4\" (UniqueName: \"kubernetes.io/projected/415d7f64-0fdd-4d6f-af37-fc928f35dde8-kube-api-access-ztdp4\") pod \"metrics-server-646d59cb8b-622bl\" (UID: \"415d7f64-0fdd-4d6f-af37-fc928f35dde8\") " pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:24.786801 master-0 kubenswrapper[18707]: I0320 08:44:24.786621 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" Mar 20 08:44:25.648885 master-0 kubenswrapper[18707]: E0320 08:44:25.648799 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a80bd6f_2263_4251_8197_5173193f8afc.slice/crio-conmon-d02eea2683ac6efae0f70fd8f717e97bc121a80e06da8071bf5f8c5bd619e9f0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6807720_382e_4b04_ad82_35cb4c225138.slice/crio-7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09.scope\": RecentStats: unable to find data in memory cache]" Mar 20 08:44:25.648885 master-0 kubenswrapper[18707]: E0320 08:44:25.648862 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a80bd6f_2263_4251_8197_5173193f8afc.slice/crio-conmon-d02eea2683ac6efae0f70fd8f717e97bc121a80e06da8071bf5f8c5bd619e9f0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6807720_382e_4b04_ad82_35cb4c225138.slice/crio-conmon-7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a80bd6f_2263_4251_8197_5173193f8afc.slice/crio-d02eea2683ac6efae0f70fd8f717e97bc121a80e06da8071bf5f8c5bd619e9f0.scope\": RecentStats: unable to find data in memory cache]" Mar 20 08:44:25.649900 master-0 kubenswrapper[18707]: E0320 08:44:25.649840 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a80bd6f_2263_4251_8197_5173193f8afc.slice/crio-d02eea2683ac6efae0f70fd8f717e97bc121a80e06da8071bf5f8c5bd619e9f0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a80bd6f_2263_4251_8197_5173193f8afc.slice/crio-conmon-d02eea2683ac6efae0f70fd8f717e97bc121a80e06da8071bf5f8c5bd619e9f0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6807720_382e_4b04_ad82_35cb4c225138.slice/crio-conmon-7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09.scope\": RecentStats: unable to find data in memory cache]" Mar 20 08:44:25.676795 master-0 kubenswrapper[18707]: I0320 08:44:25.675106 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-x2bwv_e6807720-382e-4b04-ad82-35cb4c225138/kube-multus-additional-cni-plugins/0.log" Mar 20 08:44:25.676795 master-0 kubenswrapper[18707]: I0320 08:44:25.675227 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv" Mar 20 08:44:25.703705 master-0 kubenswrapper[18707]: I0320 08:44:25.703653 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/e6807720-382e-4b04-ad82-35cb4c225138-ready\") pod \"e6807720-382e-4b04-ad82-35cb4c225138\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " Mar 20 08:44:25.703848 master-0 kubenswrapper[18707]: I0320 08:44:25.703835 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6807720-382e-4b04-ad82-35cb4c225138-tuning-conf-dir\") pod \"e6807720-382e-4b04-ad82-35cb4c225138\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " Mar 20 08:44:25.703997 master-0 kubenswrapper[18707]: I0320 08:44:25.703969 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6807720-382e-4b04-ad82-35cb4c225138-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "e6807720-382e-4b04-ad82-35cb4c225138" (UID: "e6807720-382e-4b04-ad82-35cb4c225138"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:44:25.704092 master-0 kubenswrapper[18707]: I0320 08:44:25.704048 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6807720-382e-4b04-ad82-35cb4c225138-ready" (OuterVolumeSpecName: "ready") pod "e6807720-382e-4b04-ad82-35cb4c225138" (UID: "e6807720-382e-4b04-ad82-35cb4c225138"). InnerVolumeSpecName "ready". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:44:25.704092 master-0 kubenswrapper[18707]: I0320 08:44:25.704052 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6807720-382e-4b04-ad82-35cb4c225138-cni-sysctl-allowlist\") pod \"e6807720-382e-4b04-ad82-35cb4c225138\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " Mar 20 08:44:25.704242 master-0 kubenswrapper[18707]: I0320 08:44:25.704218 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dmxj\" (UniqueName: \"kubernetes.io/projected/e6807720-382e-4b04-ad82-35cb4c225138-kube-api-access-6dmxj\") pod \"e6807720-382e-4b04-ad82-35cb4c225138\" (UID: \"e6807720-382e-4b04-ad82-35cb4c225138\") " Mar 20 08:44:25.704673 master-0 kubenswrapper[18707]: I0320 08:44:25.704654 18707 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/e6807720-382e-4b04-ad82-35cb4c225138-ready\") on node \"master-0\" DevicePath \"\"" Mar 20 08:44:25.704673 master-0 kubenswrapper[18707]: I0320 08:44:25.704671 18707 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6807720-382e-4b04-ad82-35cb4c225138-tuning-conf-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:44:25.704983 master-0 kubenswrapper[18707]: I0320 08:44:25.704959 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6807720-382e-4b04-ad82-35cb4c225138-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "e6807720-382e-4b04-ad82-35cb4c225138" (UID: "e6807720-382e-4b04-ad82-35cb4c225138"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:25.705328 master-0 kubenswrapper[18707]: I0320 08:44:25.705260 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-646d59cb8b-622bl"] Mar 20 08:44:25.710228 master-0 kubenswrapper[18707]: I0320 08:44:25.708664 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6807720-382e-4b04-ad82-35cb4c225138-kube-api-access-6dmxj" (OuterVolumeSpecName: "kube-api-access-6dmxj") pod "e6807720-382e-4b04-ad82-35cb4c225138" (UID: "e6807720-382e-4b04-ad82-35cb4c225138"). InnerVolumeSpecName "kube-api-access-6dmxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:25.717452 master-0 kubenswrapper[18707]: W0320 08:44:25.717377 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod415d7f64_0fdd_4d6f_af37_fc928f35dde8.slice/crio-2707ecb6596e0df093eafbc8f1e974282b0b39c49e2c432f0a772b7a6ae3086b WatchSource:0}: Error finding container 2707ecb6596e0df093eafbc8f1e974282b0b39c49e2c432f0a772b7a6ae3086b: Status 404 returned error can't find the container with id 2707ecb6596e0df093eafbc8f1e974282b0b39c49e2c432f0a772b7a6ae3086b Mar 20 08:44:25.805864 master-0 kubenswrapper[18707]: I0320 08:44:25.805828 18707 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6807720-382e-4b04-ad82-35cb4c225138-cni-sysctl-allowlist\") on node \"master-0\" DevicePath \"\"" Mar 20 08:44:25.806026 master-0 kubenswrapper[18707]: I0320 08:44:25.805869 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dmxj\" (UniqueName: \"kubernetes.io/projected/e6807720-382e-4b04-ad82-35cb4c225138-kube-api-access-6dmxj\") on node \"master-0\" DevicePath \"\"" Mar 20 08:44:25.819369 master-0 kubenswrapper[18707]: I0320 08:44:25.818671 18707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-x2bwv_e6807720-382e-4b04-ad82-35cb4c225138/kube-multus-additional-cni-plugins/0.log" Mar 20 08:44:25.819369 master-0 kubenswrapper[18707]: I0320 08:44:25.818727 18707 generic.go:334] "Generic (PLEG): container finished" podID="e6807720-382e-4b04-ad82-35cb4c225138" containerID="7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" exitCode=137 Mar 20 08:44:25.819369 master-0 kubenswrapper[18707]: I0320 08:44:25.818795 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv" event={"ID":"e6807720-382e-4b04-ad82-35cb4c225138","Type":"ContainerDied","Data":"7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09"} Mar 20 08:44:25.819369 master-0 kubenswrapper[18707]: I0320 08:44:25.818829 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv" event={"ID":"e6807720-382e-4b04-ad82-35cb4c225138","Type":"ContainerDied","Data":"c7d73961da64ca498ab44f81d8c88b31a8d73b74ca83992e05641fb80dbd5877"} Mar 20 08:44:25.819369 master-0 kubenswrapper[18707]: I0320 08:44:25.818855 18707 scope.go:117] "RemoveContainer" containerID="7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" Mar 20 08:44:25.819369 master-0 kubenswrapper[18707]: I0320 08:44:25.818985 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-x2bwv" Mar 20 08:44:25.840169 master-0 kubenswrapper[18707]: I0320 08:44:25.840082 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" event={"ID":"415d7f64-0fdd-4d6f-af37-fc928f35dde8","Type":"ContainerStarted","Data":"2707ecb6596e0df093eafbc8f1e974282b0b39c49e2c432f0a772b7a6ae3086b"} Mar 20 08:44:25.851470 master-0 kubenswrapper[18707]: I0320 08:44:25.851404 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" event={"ID":"cbcefc08-9a02-4ab3-86be-cad330c447b8","Type":"ContainerStarted","Data":"981c21e706a72bd390c67bdee8795c250e4c0d6d0e75fb6d4a025983851b379a"} Mar 20 08:44:25.851470 master-0 kubenswrapper[18707]: I0320 08:44:25.851469 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" event={"ID":"cbcefc08-9a02-4ab3-86be-cad330c447b8","Type":"ContainerStarted","Data":"3045603f32f3c1b377053442a8a88d0387cc2dbbce0f8217ba090bd8a78ce231"} Mar 20 08:44:25.865268 master-0 kubenswrapper[18707]: I0320 08:44:25.865115 18707 scope.go:117] "RemoveContainer" containerID="7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" Mar 20 08:44:25.865852 master-0 kubenswrapper[18707]: E0320 08:44:25.865789 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09\": container with ID starting with 7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09 not found: ID does not exist" containerID="7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09" Mar 20 08:44:25.865852 master-0 kubenswrapper[18707]: I0320 08:44:25.865820 18707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09"} err="failed to get container status \"7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09\": rpc error: code = NotFound desc = could not find container \"7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09\": container with ID starting with 7ea5019eb61f348ed5c35f82d4e9a010be6a7ea875dfde6b0e5c7f0ba1d78d09 not found: ID does not exist" Mar 20 08:44:25.879176 master-0 kubenswrapper[18707]: I0320 08:44:25.879095 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-x2bwv"] Mar 20 08:44:25.885112 master-0 kubenswrapper[18707]: I0320 08:44:25.885022 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-x2bwv"] Mar 20 08:44:26.314290 master-0 kubenswrapper[18707]: I0320 08:44:26.314165 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-zbpk4\" (UID: \"044870dd-540a-402e-84cb-fa1bf3d6a318\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:44:26.314581 master-0 kubenswrapper[18707]: E0320 08:44:26.314325 18707 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 20 08:44:26.314581 master-0 kubenswrapper[18707]: E0320 08:44:26.314400 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert podName:044870dd-540a-402e-84cb-fa1bf3d6a318 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:34.314381222 +0000 UTC m=+219.470561578 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert") pod "networking-console-plugin-7c6b76c555-zbpk4" (UID: "044870dd-540a-402e-84cb-fa1bf3d6a318") : secret "networking-console-plugin-cert" not found Mar 20 08:44:26.864037 master-0 kubenswrapper[18707]: I0320 08:44:26.863809 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" event={"ID":"415d7f64-0fdd-4d6f-af37-fc928f35dde8","Type":"ContainerStarted","Data":"275edc12a5fe2ecf8f280ea6e7afb59326e0184c2f1b60ad23899f5464792f0b"} Mar 20 08:44:26.868164 master-0 kubenswrapper[18707]: I0320 08:44:26.868124 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" event={"ID":"cbcefc08-9a02-4ab3-86be-cad330c447b8","Type":"ContainerStarted","Data":"dfbac50b053a2398baab28c95eb2b32d308cc9f3479de6dc2679d4de4b9e4aba"} Mar 20 08:44:26.894688 master-0 kubenswrapper[18707]: I0320 08:44:26.893955 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-646d59cb8b-622bl" podStartSLOduration=2.893926204 podStartE2EDuration="2.893926204s" podCreationTimestamp="2026-03-20 08:44:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:44:26.884756566 +0000 UTC m=+212.040936922" watchObservedRunningTime="2026-03-20 08:44:26.893926204 +0000 UTC m=+212.050106560" Mar 20 08:44:27.109378 master-0 kubenswrapper[18707]: I0320 08:44:27.109319 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6807720-382e-4b04-ad82-35cb4c225138" path="/var/lib/kubelet/pods/e6807720-382e-4b04-ad82-35cb4c225138/volumes" Mar 20 08:44:27.879592 master-0 kubenswrapper[18707]: I0320 08:44:27.879515 18707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" event={"ID":"cbcefc08-9a02-4ab3-86be-cad330c447b8","Type":"ContainerStarted","Data":"fc9b3bb523649f52646edbf915083a367e9a53d5c714db2433c2f3ed33c7b012"} Mar 20 08:44:27.879592 master-0 kubenswrapper[18707]: I0320 08:44:27.879598 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" event={"ID":"cbcefc08-9a02-4ab3-86be-cad330c447b8","Type":"ContainerStarted","Data":"44faffa6e170f723af27aecb21883a57c2b152d9d747fa6772548e009cdc5bd0"} Mar 20 08:44:27.880314 master-0 kubenswrapper[18707]: I0320 08:44:27.879617 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" event={"ID":"cbcefc08-9a02-4ab3-86be-cad330c447b8","Type":"ContainerStarted","Data":"23d48fba56776d9a1637e760a5a632bdf32ecd7b19d18de79351e71d5f19bb24"} Mar 20 08:44:27.914885 master-0 kubenswrapper[18707]: I0320 08:44:27.914758 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" podStartSLOduration=2.380286958 podStartE2EDuration="6.914732874s" podCreationTimestamp="2026-03-20 08:44:21 +0000 UTC" firstStartedPulling="2026-03-20 08:44:22.465909324 +0000 UTC m=+207.622089680" lastFinishedPulling="2026-03-20 08:44:27.00035524 +0000 UTC m=+212.156535596" observedRunningTime="2026-03-20 08:44:27.910953794 +0000 UTC m=+213.067134170" watchObservedRunningTime="2026-03-20 08:44:27.914732874 +0000 UTC m=+213.070913230" Mar 20 08:44:28.890029 master-0 kubenswrapper[18707]: I0320 08:44:28.889950 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg" Mar 20 08:44:30.645180 master-0 kubenswrapper[18707]: I0320 08:44:30.645084 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 20 08:44:30.645836 master-0 kubenswrapper[18707]: E0320 
08:44:30.645511 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6807720-382e-4b04-ad82-35cb4c225138" containerName="kube-multus-additional-cni-plugins" Mar 20 08:44:30.645836 master-0 kubenswrapper[18707]: I0320 08:44:30.645528 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6807720-382e-4b04-ad82-35cb4c225138" containerName="kube-multus-additional-cni-plugins" Mar 20 08:44:30.645836 master-0 kubenswrapper[18707]: I0320 08:44:30.645673 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6807720-382e-4b04-ad82-35cb4c225138" containerName="kube-multus-additional-cni-plugins" Mar 20 08:44:30.647784 master-0 kubenswrapper[18707]: I0320 08:44:30.647757 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.650242 master-0 kubenswrapper[18707]: I0320 08:44:30.650205 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 20 08:44:30.650672 master-0 kubenswrapper[18707]: I0320 08:44:30.650649 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 20 08:44:30.651054 master-0 kubenswrapper[18707]: I0320 08:44:30.651030 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 20 08:44:30.651157 master-0 kubenswrapper[18707]: I0320 08:44:30.651138 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 20 08:44:30.651682 master-0 kubenswrapper[18707]: I0320 08:44:30.651640 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 20 08:44:30.652166 master-0 kubenswrapper[18707]: I0320 08:44:30.652142 18707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"alertmanager-main-tls" Mar 20 08:44:30.656956 master-0 kubenswrapper[18707]: I0320 08:44:30.656876 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 20 08:44:30.673693 master-0 kubenswrapper[18707]: I0320 08:44:30.673651 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 20 08:44:30.682772 master-0 kubenswrapper[18707]: I0320 08:44:30.682717 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 20 08:44:30.719116 master-0 kubenswrapper[18707]: I0320 08:44:30.719059 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.719498 master-0 kubenswrapper[18707]: I0320 08:44:30.719476 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-config-volume\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.719602 master-0 kubenswrapper[18707]: I0320 08:44:30.719588 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.719691 master-0 
kubenswrapper[18707]: I0320 08:44:30.719677 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-config-out\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.719797 master-0 kubenswrapper[18707]: I0320 08:44:30.719781 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvshj\" (UniqueName: \"kubernetes.io/projected/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-kube-api-access-gvshj\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.719882 master-0 kubenswrapper[18707]: I0320 08:44:30.719869 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.721157 master-0 kubenswrapper[18707]: I0320 08:44:30.720027 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.721157 master-0 kubenswrapper[18707]: I0320 08:44:30.720166 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-alertmanager-trusted-ca-bundle\") pod 
\"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.721157 master-0 kubenswrapper[18707]: I0320 08:44:30.720227 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.721157 master-0 kubenswrapper[18707]: I0320 08:44:30.720328 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.721157 master-0 kubenswrapper[18707]: I0320 08:44:30.720358 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-web-config\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.721157 master-0 kubenswrapper[18707]: I0320 08:44:30.720374 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.821913 master-0 kubenswrapper[18707]: I0320 08:44:30.821752 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.821913 master-0 kubenswrapper[18707]: I0320 08:44:30.821902 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-config-out\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.822288 master-0 kubenswrapper[18707]: I0320 08:44:30.821953 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvshj\" (UniqueName: \"kubernetes.io/projected/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-kube-api-access-gvshj\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.822288 master-0 kubenswrapper[18707]: I0320 08:44:30.822214 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.822392 master-0 kubenswrapper[18707]: I0320 08:44:30.822355 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.822529 master-0 kubenswrapper[18707]: I0320 08:44:30.822499 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.822587 master-0 kubenswrapper[18707]: I0320 08:44:30.822559 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.822785 master-0 kubenswrapper[18707]: I0320 08:44:30.822746 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.822846 master-0 kubenswrapper[18707]: I0320 08:44:30.822795 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-web-config\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.822846 master-0 kubenswrapper[18707]: I0320 08:44:30.822822 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.822938 
master-0 kubenswrapper[18707]: I0320 08:44:30.822914 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.823088 master-0 kubenswrapper[18707]: I0320 08:44:30.823062 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-config-volume\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:30.825659 master-0 kubenswrapper[18707]: E0320 08:44:30.825130 18707 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Mar 20 08:44:30.825659 master-0 kubenswrapper[18707]: E0320 08:44:30.825287 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls podName:14d29bfa-a0cf-43bd-a3b8-052c1a224fc9 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:31.32525591 +0000 UTC m=+216.481436276 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9") : secret "alertmanager-main-tls" not found
Mar 20 08:44:30.825659 master-0 kubenswrapper[18707]: I0320 08:44:30.825583 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:30.826591 master-0 kubenswrapper[18707]: I0320 08:44:30.826552 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:30.826684 master-0 kubenswrapper[18707]: I0320 08:44:30.826614 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-config-out\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:30.826821 master-0 kubenswrapper[18707]: I0320 08:44:30.826774 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:30.827744 master-0 kubenswrapper[18707]: I0320 08:44:30.827702 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:30.828044 master-0 kubenswrapper[18707]: I0320 08:44:30.827961 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-config-volume\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:30.828394 master-0 kubenswrapper[18707]: I0320 08:44:30.828362 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:30.828801 master-0 kubenswrapper[18707]: I0320 08:44:30.828760 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:30.829015 master-0 kubenswrapper[18707]: I0320 08:44:30.828987 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:30.830056 master-0 kubenswrapper[18707]: I0320 08:44:30.829983 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-web-config\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:30.845405 master-0 kubenswrapper[18707]: I0320 08:44:30.845333 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvshj\" (UniqueName: \"kubernetes.io/projected/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-kube-api-access-gvshj\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:31.330907 master-0 kubenswrapper[18707]: I0320 08:44:31.330831 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:31.331208 master-0 kubenswrapper[18707]: E0320 08:44:31.331157 18707 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Mar 20 08:44:31.331271 master-0 kubenswrapper[18707]: E0320 08:44:31.331246 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls podName:14d29bfa-a0cf-43bd-a3b8-052c1a224fc9 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:32.331223246 +0000 UTC m=+217.487403602 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9") : secret "alertmanager-main-tls" not found
Mar 20 08:44:32.038009 master-0 kubenswrapper[18707]: I0320 08:44:32.037934 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5dc898db58-5m6qg"
Mar 20 08:44:32.351930 master-0 kubenswrapper[18707]: I0320 08:44:32.351857 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:32.352362 master-0 kubenswrapper[18707]: E0320 08:44:32.352132 18707 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Mar 20 08:44:32.352629 master-0 kubenswrapper[18707]: E0320 08:44:32.352602 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls podName:14d29bfa-a0cf-43bd-a3b8-052c1a224fc9 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:34.352569362 +0000 UTC m=+219.508749748 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9") : secret "alertmanager-main-tls" not found
Mar 20 08:44:33.932841 master-0 kubenswrapper[18707]: I0320 08:44:33.932757 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5dbbb8b86f-5rrrh_6a80bd6f-2263-4251-8197-5173193f8afc/multus-admission-controller/0.log"
Mar 20 08:44:33.933583 master-0 kubenswrapper[18707]: I0320 08:44:33.932853 18707 generic.go:334] "Generic (PLEG): container finished" podID="6a80bd6f-2263-4251-8197-5173193f8afc" containerID="199be5e4e7ebbf5d512a41b53ac83599e0d0e19351865228cfbac1cc0f5d6aa8" exitCode=137
Mar 20 08:44:33.933583 master-0 kubenswrapper[18707]: I0320 08:44:33.932903 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" event={"ID":"6a80bd6f-2263-4251-8197-5173193f8afc","Type":"ContainerDied","Data":"199be5e4e7ebbf5d512a41b53ac83599e0d0e19351865228cfbac1cc0f5d6aa8"}
Mar 20 08:44:34.386913 master-0 kubenswrapper[18707]: I0320 08:44:34.386825 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:44:34.387247 master-0 kubenswrapper[18707]: I0320 08:44:34.387012 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-zbpk4\" (UID: \"044870dd-540a-402e-84cb-fa1bf3d6a318\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"
Mar 20 08:44:34.387591 master-0 kubenswrapper[18707]: E0320 08:44:34.387540 18707 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Mar 20 08:44:34.387770 master-0 kubenswrapper[18707]: E0320 08:44:34.387751 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls podName:14d29bfa-a0cf-43bd-a3b8-052c1a224fc9 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:38.387722033 +0000 UTC m=+223.543902399 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9") : secret "alertmanager-main-tls" not found
Mar 20 08:44:34.388391 master-0 kubenswrapper[18707]: E0320 08:44:34.388369 18707 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 20 08:44:34.388554 master-0 kubenswrapper[18707]: E0320 08:44:34.388538 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert podName:044870dd-540a-402e-84cb-fa1bf3d6a318 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:50.388522056 +0000 UTC m=+235.544702432 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert") pod "networking-console-plugin-7c6b76c555-zbpk4" (UID: "044870dd-540a-402e-84cb-fa1bf3d6a318") : secret "networking-console-plugin-cert" not found
Mar 20 08:44:34.667867 master-0 kubenswrapper[18707]: I0320 08:44:34.667805 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5dbbb8b86f-5rrrh_6a80bd6f-2263-4251-8197-5173193f8afc/multus-admission-controller/0.log"
Mar 20 08:44:34.668229 master-0 kubenswrapper[18707]: I0320 08:44:34.667918 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"
Mar 20 08:44:34.793781 master-0 kubenswrapper[18707]: I0320 08:44:34.793700 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqd2v\" (UniqueName: \"kubernetes.io/projected/6a80bd6f-2263-4251-8197-5173193f8afc-kube-api-access-tqd2v\") pod \"6a80bd6f-2263-4251-8197-5173193f8afc\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") "
Mar 20 08:44:34.794160 master-0 kubenswrapper[18707]: I0320 08:44:34.793956 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") pod \"6a80bd6f-2263-4251-8197-5173193f8afc\" (UID: \"6a80bd6f-2263-4251-8197-5173193f8afc\") "
Mar 20 08:44:34.800515 master-0 kubenswrapper[18707]: I0320 08:44:34.800386 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a80bd6f-2263-4251-8197-5173193f8afc-kube-api-access-tqd2v" (OuterVolumeSpecName: "kube-api-access-tqd2v") pod "6a80bd6f-2263-4251-8197-5173193f8afc" (UID: "6a80bd6f-2263-4251-8197-5173193f8afc"). InnerVolumeSpecName "kube-api-access-tqd2v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:44:34.802616 master-0 kubenswrapper[18707]: I0320 08:44:34.802532 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "6a80bd6f-2263-4251-8197-5173193f8afc" (UID: "6a80bd6f-2263-4251-8197-5173193f8afc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:44:34.896689 master-0 kubenswrapper[18707]: I0320 08:44:34.896489 18707 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6a80bd6f-2263-4251-8197-5173193f8afc-webhook-certs\") on node \"master-0\" DevicePath \"\""
Mar 20 08:44:34.896689 master-0 kubenswrapper[18707]: I0320 08:44:34.896552 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqd2v\" (UniqueName: \"kubernetes.io/projected/6a80bd6f-2263-4251-8197-5173193f8afc-kube-api-access-tqd2v\") on node \"master-0\" DevicePath \"\""
Mar 20 08:44:34.944390 master-0 kubenswrapper[18707]: I0320 08:44:34.944327 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5dbbb8b86f-5rrrh_6a80bd6f-2263-4251-8197-5173193f8afc/multus-admission-controller/0.log"
Mar 20 08:44:34.945354 master-0 kubenswrapper[18707]: I0320 08:44:34.944415 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh" event={"ID":"6a80bd6f-2263-4251-8197-5173193f8afc","Type":"ContainerDied","Data":"4d57d4740bcb6c82be46e50156208d72579c0c7e7b87226d99fe9e3e9e1ff1bb"}
Mar 20 08:44:34.945354 master-0 kubenswrapper[18707]: I0320 08:44:34.944467 18707 scope.go:117] "RemoveContainer" containerID="d02eea2683ac6efae0f70fd8f717e97bc121a80e06da8071bf5f8c5bd619e9f0"
Mar 20 08:44:34.945354 master-0 kubenswrapper[18707]: I0320 08:44:34.944574 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"
Mar 20 08:44:34.964379 master-0 kubenswrapper[18707]: I0320 08:44:34.964248 18707 scope.go:117] "RemoveContainer" containerID="199be5e4e7ebbf5d512a41b53ac83599e0d0e19351865228cfbac1cc0f5d6aa8"
Mar 20 08:44:35.028009 master-0 kubenswrapper[18707]: I0320 08:44:35.027865 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"]
Mar 20 08:44:35.042246 master-0 kubenswrapper[18707]: I0320 08:44:35.042161 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-5rrrh"]
Mar 20 08:44:35.105241 master-0 kubenswrapper[18707]: I0320 08:44:35.105159 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a80bd6f-2263-4251-8197-5173193f8afc" path="/var/lib/kubelet/pods/6a80bd6f-2263-4251-8197-5173193f8afc/volumes"
Mar 20 08:44:37.199490 master-0 kubenswrapper[18707]: I0320 08:44:37.199413 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 20 08:44:37.200219 master-0 kubenswrapper[18707]: E0320 08:44:37.199757 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a80bd6f-2263-4251-8197-5173193f8afc" containerName="multus-admission-controller"
Mar 20 08:44:37.200219 master-0 kubenswrapper[18707]: I0320 08:44:37.199772 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a80bd6f-2263-4251-8197-5173193f8afc" containerName="multus-admission-controller"
Mar 20 08:44:37.200219 master-0 kubenswrapper[18707]: E0320 08:44:37.199795 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a80bd6f-2263-4251-8197-5173193f8afc" containerName="kube-rbac-proxy"
Mar 20 08:44:37.200219 master-0 kubenswrapper[18707]: I0320 08:44:37.199803 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a80bd6f-2263-4251-8197-5173193f8afc" containerName="kube-rbac-proxy"
Mar 20 08:44:37.200219 master-0 kubenswrapper[18707]: I0320 08:44:37.199936 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a80bd6f-2263-4251-8197-5173193f8afc" containerName="multus-admission-controller"
Mar 20 08:44:37.200219 master-0 kubenswrapper[18707]: I0320 08:44:37.199968 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a80bd6f-2263-4251-8197-5173193f8afc" containerName="kube-rbac-proxy"
Mar 20 08:44:37.201858 master-0 kubenswrapper[18707]: I0320 08:44:37.201826 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.210717 master-0 kubenswrapper[18707]: I0320 08:44:37.210657 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 20 08:44:37.214906 master-0 kubenswrapper[18707]: I0320 08:44:37.214851 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 20 08:44:37.219493 master-0 kubenswrapper[18707]: I0320 08:44:37.219435 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Mar 20 08:44:37.219748 master-0 kubenswrapper[18707]: I0320 08:44:37.219439 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 20 08:44:37.220138 master-0 kubenswrapper[18707]: I0320 08:44:37.220107 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 20 08:44:37.220138 master-0 kubenswrapper[18707]: I0320 08:44:37.220108 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 20 08:44:37.220718 master-0 kubenswrapper[18707]: I0320 08:44:37.220662 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 20 08:44:37.221312 master-0 kubenswrapper[18707]: I0320 08:44:37.221262 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Mar 20 08:44:37.221422 master-0 kubenswrapper[18707]: I0320 08:44:37.221388 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-fofds5kbvc1lq"
Mar 20 08:44:37.226423 master-0 kubenswrapper[18707]: I0320 08:44:37.226358 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 20 08:44:37.227157 master-0 kubenswrapper[18707]: I0320 08:44:37.226953 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 20 08:44:37.227157 master-0 kubenswrapper[18707]: I0320 08:44:37.226969 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 20 08:44:37.228941 master-0 kubenswrapper[18707]: I0320 08:44:37.228902 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258525 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsh7c\" (UniqueName: \"kubernetes.io/projected/77ddbb16-b96e-4717-9786-2feae0d0cc3f-kube-api-access-bsh7c\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258602 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258645 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77ddbb16-b96e-4717-9786-2feae0d0cc3f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258666 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258688 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258717 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258738 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258778 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77ddbb16-b96e-4717-9786-2feae0d0cc3f-config-out\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258810 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258834 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258857 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258900 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258917 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-config\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258945 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258965 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-web-config\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.258986 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.259007 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.259305 master-0 kubenswrapper[18707]: I0320 08:44:37.259026 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.361425 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77ddbb16-b96e-4717-9786-2feae0d0cc3f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.361493 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.361519 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.361553 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.361584 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.361624 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77ddbb16-b96e-4717-9786-2feae0d0cc3f-config-out\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.361654 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.361673 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.361701 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.361932 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-config\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.361948 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.361979 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-web-config\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.361998 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.362020 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.362043 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.362063 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.362083 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsh7c\" (UniqueName: \"kubernetes.io/projected/77ddbb16-b96e-4717-9786-2feae0d0cc3f-kube-api-access-bsh7c\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.362213 master-0 kubenswrapper[18707]: I0320 08:44:37.362103 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.367349 master-0 kubenswrapper[18707]: I0320 08:44:37.366835 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.367349 master-0 kubenswrapper[18707]: I0320 08:44:37.367000 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-web-config\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.367349 master-0 kubenswrapper[18707]: I0320 08:44:37.367233 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.367498 master-0 kubenswrapper[18707]: E0320 08:44:37.367363 18707 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found
Mar 20 08:44:37.367498 master-0 kubenswrapper[18707]: E0320 08:44:37.367443 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-thanos-sidecar-tls podName:77ddbb16-b96e-4717-9786-2feae0d0cc3f nodeName:}" failed. No retries permitted until 2026-03-20 08:44:37.867418788 +0000 UTC m=+223.023599334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f") : secret "prometheus-k8s-thanos-sidecar-tls" not found
Mar 20 08:44:37.373239 master-0 kubenswrapper[18707]: I0320 08:44:37.371155 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.373908 master-0 kubenswrapper[18707]: I0320 08:44:37.373855 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77ddbb16-b96e-4717-9786-2feae0d0cc3f-config-out\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.374563 master-0 kubenswrapper[18707]: I0320 08:44:37.374531 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:37.376511 master-0 kubenswrapper[18707]: I0320 08:44:37.376454
18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:37.377039 master-0 kubenswrapper[18707]: I0320 08:44:37.376988 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:37.377290 master-0 kubenswrapper[18707]: E0320 08:44:37.377261 18707 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found Mar 20 08:44:37.377376 master-0 kubenswrapper[18707]: E0320 08:44:37.377354 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-tls podName:77ddbb16-b96e-4717-9786-2feae0d0cc3f nodeName:}" failed. No retries permitted until 2026-03-20 08:44:37.877329967 +0000 UTC m=+223.033510323 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f") : secret "prometheus-k8s-tls" not found Mar 20 08:44:37.385210 master-0 kubenswrapper[18707]: I0320 08:44:37.378726 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:37.385210 master-0 kubenswrapper[18707]: I0320 08:44:37.378971 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:37.385481 master-0 kubenswrapper[18707]: I0320 08:44:37.385422 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:37.387894 master-0 kubenswrapper[18707]: I0320 08:44:37.385512 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:37.387894 master-0 kubenswrapper[18707]: I0320 08:44:37.386223 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:37.388793 master-0 kubenswrapper[18707]: I0320 08:44:37.388758 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77ddbb16-b96e-4717-9786-2feae0d0cc3f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:37.411711 master-0 kubenswrapper[18707]: I0320 08:44:37.411630 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsh7c\" (UniqueName: \"kubernetes.io/projected/77ddbb16-b96e-4717-9786-2feae0d0cc3f-kube-api-access-bsh7c\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:37.424914 master-0 kubenswrapper[18707]: I0320 08:44:37.424823 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-config\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:37.870530 master-0 kubenswrapper[18707]: I0320 08:44:37.870435 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:37.870808 master-0 kubenswrapper[18707]: E0320 08:44:37.870702 18707 
secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found Mar 20 08:44:37.870876 master-0 kubenswrapper[18707]: E0320 08:44:37.870844 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-thanos-sidecar-tls podName:77ddbb16-b96e-4717-9786-2feae0d0cc3f nodeName:}" failed. No retries permitted until 2026-03-20 08:44:38.870812139 +0000 UTC m=+224.026992655 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f") : secret "prometheus-k8s-thanos-sidecar-tls" not found Mar 20 08:44:37.972848 master-0 kubenswrapper[18707]: I0320 08:44:37.972592 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:37.972848 master-0 kubenswrapper[18707]: E0320 08:44:37.972847 18707 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found Mar 20 08:44:37.973136 master-0 kubenswrapper[18707]: E0320 08:44:37.972917 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-tls podName:77ddbb16-b96e-4717-9786-2feae0d0cc3f nodeName:}" failed. No retries permitted until 2026-03-20 08:44:38.972894258 +0000 UTC m=+224.129074614 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f") : secret "prometheus-k8s-tls" not found Mar 20 08:44:38.481164 master-0 kubenswrapper[18707]: I0320 08:44:38.481063 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:38.485769 master-0 kubenswrapper[18707]: I0320 08:44:38.485709 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:38.492300 master-0 kubenswrapper[18707]: I0320 08:44:38.492252 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:44:38.888024 master-0 kubenswrapper[18707]: I0320 08:44:38.887936 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:38.892086 master-0 kubenswrapper[18707]: I0320 08:44:38.892037 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:38.959637 master-0 kubenswrapper[18707]: I0320 08:44:38.959579 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 20 08:44:38.976536 master-0 kubenswrapper[18707]: W0320 08:44:38.976472 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14d29bfa_a0cf_43bd_a3b8_052c1a224fc9.slice/crio-a41dc620a626e76b0ea35ff7c7456036843137618f3c1bc68202d11850160449 WatchSource:0}: Error finding container a41dc620a626e76b0ea35ff7c7456036843137618f3c1bc68202d11850160449: Status 404 returned error can't find the container with id a41dc620a626e76b0ea35ff7c7456036843137618f3c1bc68202d11850160449 Mar 20 08:44:38.983254 master-0 kubenswrapper[18707]: I0320 08:44:38.983169 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerStarted","Data":"a41dc620a626e76b0ea35ff7c7456036843137618f3c1bc68202d11850160449"} Mar 20 08:44:38.990035 master-0 kubenswrapper[18707]: I0320 08:44:38.989502 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:38.993741 master-0 kubenswrapper[18707]: I0320 08:44:38.992962 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:39.019062 master-0 kubenswrapper[18707]: I0320 08:44:39.018969 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:44:39.486663 master-0 kubenswrapper[18707]: I0320 08:44:39.486489 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 20 08:44:39.497020 master-0 kubenswrapper[18707]: W0320 08:44:39.496953 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77ddbb16_b96e_4717_9786_2feae0d0cc3f.slice/crio-1f3f6e5cd3c4ec0aac0308f00b6155dfd226f4c87924acc6f28fc32640f5a3c1 WatchSource:0}: Error finding container 1f3f6e5cd3c4ec0aac0308f00b6155dfd226f4c87924acc6f28fc32640f5a3c1: Status 404 returned error can't find the container with id 1f3f6e5cd3c4ec0aac0308f00b6155dfd226f4c87924acc6f28fc32640f5a3c1 Mar 20 08:44:39.886978 master-0 kubenswrapper[18707]: E0320 08:44:39.886921 18707 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/etcd-pod.yaml\": /etc/kubernetes/manifests/etcd-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Mar 20 08:44:39.887407 master-0 kubenswrapper[18707]: I0320 08:44:39.887365 18707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 20 08:44:39.887893 master-0 kubenswrapper[18707]: I0320 08:44:39.887861 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl" containerID="cri-o://c0c0eafff8c825fc9c4a32593e8d54d61ac68f27a7fde59d8dfb857aeb1580f0" gracePeriod=30 Mar 20 08:44:39.888108 master-0 kubenswrapper[18707]: I0320 08:44:39.887979 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics" containerID="cri-o://5de11809fbb3db5b6981fddb634a5dbf7f162fcbe9eede8cb63026b2ff7e2a3e" 
gracePeriod=30 Mar 20 08:44:39.888210 master-0 kubenswrapper[18707]: I0320 08:44:39.887897 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz" containerID="cri-o://267f2d3e5624276bc815692d6f63750c35fab88bf4fad9637c60210f294ab470" gracePeriod=30 Mar 20 08:44:39.888322 master-0 kubenswrapper[18707]: I0320 08:44:39.888002 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev" containerID="cri-o://336ee2eca239c702b23f8eadc224486f445c6fd4853f373a73d423d9f64cfcac" gracePeriod=30 Mar 20 08:44:39.888406 master-0 kubenswrapper[18707]: I0320 08:44:39.887903 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd" containerID="cri-o://91928dd4bf037a74fc3110c950269e5b4ae8998e3616107aa1170ce1d3fede55" gracePeriod=30 Mar 20 08:44:39.890265 master-0 kubenswrapper[18707]: I0320 08:44:39.890209 18707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 20 08:44:39.890667 master-0 kubenswrapper[18707]: E0320 08:44:39.890640 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl" Mar 20 08:44:39.890667 master-0 kubenswrapper[18707]: I0320 08:44:39.890662 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl" Mar 20 08:44:39.890758 master-0 kubenswrapper[18707]: E0320 08:44:39.890693 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics" Mar 20 08:44:39.890758 master-0 kubenswrapper[18707]: I0320 08:44:39.890708 18707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics" Mar 20 08:44:39.890758 master-0 kubenswrapper[18707]: E0320 08:44:39.890726 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup" Mar 20 08:44:39.890758 master-0 kubenswrapper[18707]: I0320 08:44:39.890736 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup" Mar 20 08:44:39.890758 master-0 kubenswrapper[18707]: E0320 08:44:39.890750 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev" Mar 20 08:44:39.890758 master-0 kubenswrapper[18707]: I0320 08:44:39.890758 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev" Mar 20 08:44:39.890926 master-0 kubenswrapper[18707]: E0320 08:44:39.890775 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy" Mar 20 08:44:39.890926 master-0 kubenswrapper[18707]: I0320 08:44:39.890783 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy" Mar 20 08:44:39.890926 master-0 kubenswrapper[18707]: E0320 08:44:39.890795 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd" Mar 20 08:44:39.890926 master-0 kubenswrapper[18707]: I0320 08:44:39.890801 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd" Mar 20 08:44:39.890926 master-0 kubenswrapper[18707]: E0320 08:44:39.890809 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars" Mar 20 08:44:39.890926 master-0 kubenswrapper[18707]: I0320 08:44:39.890816 18707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars" Mar 20 08:44:39.890926 master-0 kubenswrapper[18707]: E0320 08:44:39.890834 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz" Mar 20 08:44:39.890926 master-0 kubenswrapper[18707]: I0320 08:44:39.890840 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz" Mar 20 08:44:39.891158 master-0 kubenswrapper[18707]: I0320 08:44:39.890968 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars" Mar 20 08:44:39.891158 master-0 kubenswrapper[18707]: I0320 08:44:39.890993 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl" Mar 20 08:44:39.891158 master-0 kubenswrapper[18707]: I0320 08:44:39.891020 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics" Mar 20 08:44:39.891158 master-0 kubenswrapper[18707]: I0320 08:44:39.891032 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup" Mar 20 08:44:39.891158 master-0 kubenswrapper[18707]: I0320 08:44:39.891044 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd" Mar 20 08:44:39.891158 master-0 kubenswrapper[18707]: I0320 08:44:39.891056 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy" Mar 20 08:44:39.891158 master-0 kubenswrapper[18707]: I0320 08:44:39.891070 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz" 
Mar 20 08:44:39.891158 master-0 kubenswrapper[18707]: I0320 08:44:39.891081 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev" Mar 20 08:44:39.993153 master-0 kubenswrapper[18707]: I0320 08:44:39.993084 18707 generic.go:334] "Generic (PLEG): container finished" podID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerID="c71b79252c7c80b19481c7db2c5281ad6c66edb0a67d74d3f77f82c8ec887429" exitCode=0 Mar 20 08:44:39.993466 master-0 kubenswrapper[18707]: I0320 08:44:39.993257 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerDied","Data":"c71b79252c7c80b19481c7db2c5281ad6c66edb0a67d74d3f77f82c8ec887429"} Mar 20 08:44:39.997935 master-0 kubenswrapper[18707]: I0320 08:44:39.997895 18707 generic.go:334] "Generic (PLEG): container finished" podID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerID="784d0ad28f7b67339145bc04f7762b9f24ab7f8ba996e7b718878c32534ac025" exitCode=0 Mar 20 08:44:39.998025 master-0 kubenswrapper[18707]: I0320 08:44:39.997944 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerDied","Data":"784d0ad28f7b67339145bc04f7762b9f24ab7f8ba996e7b718878c32534ac025"} Mar 20 08:44:39.998025 master-0 kubenswrapper[18707]: I0320 08:44:39.997974 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerStarted","Data":"1f3f6e5cd3c4ec0aac0308f00b6155dfd226f4c87924acc6f28fc32640f5a3c1"} Mar 20 08:44:40.011230 master-0 kubenswrapper[18707]: I0320 08:44:40.008656 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.011230 master-0 kubenswrapper[18707]: I0320 08:44:40.008711 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.011230 master-0 kubenswrapper[18707]: I0320 08:44:40.008747 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.011230 master-0 kubenswrapper[18707]: I0320 08:44:40.008842 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.011230 master-0 kubenswrapper[18707]: I0320 08:44:40.009003 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.011230 master-0 kubenswrapper[18707]: I0320 08:44:40.009086 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod 
\"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.112245 master-0 kubenswrapper[18707]: I0320 08:44:40.112111 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.112245 master-0 kubenswrapper[18707]: I0320 08:44:40.112230 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.112785 master-0 kubenswrapper[18707]: I0320 08:44:40.112417 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.113109 master-0 kubenswrapper[18707]: I0320 08:44:40.113038 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.113178 master-0 kubenswrapper[18707]: I0320 08:44:40.113118 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.113178 master-0 kubenswrapper[18707]: I0320 08:44:40.113068 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.113314 master-0 kubenswrapper[18707]: I0320 08:44:40.113250 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.113443 master-0 kubenswrapper[18707]: I0320 08:44:40.113381 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.113443 master-0 kubenswrapper[18707]: I0320 08:44:40.113435 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.113649 master-0 kubenswrapper[18707]: I0320 08:44:40.113614 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 20 08:44:40.113719 master-0 kubenswrapper[18707]: I0320 08:44:40.113668 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" 
(UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 20 08:44:40.113942 master-0 kubenswrapper[18707]: I0320 08:44:40.113684 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 20 08:44:41.011071 master-0 kubenswrapper[18707]: I0320 08:44:41.010995 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log"
Mar 20 08:44:41.012390 master-0 kubenswrapper[18707]: I0320 08:44:41.012367 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log"
Mar 20 08:44:41.015019 master-0 kubenswrapper[18707]: I0320 08:44:41.014977 18707 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="336ee2eca239c702b23f8eadc224486f445c6fd4853f373a73d423d9f64cfcac" exitCode=2
Mar 20 08:44:41.015019 master-0 kubenswrapper[18707]: I0320 08:44:41.015011 18707 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="267f2d3e5624276bc815692d6f63750c35fab88bf4fad9637c60210f294ab470" exitCode=0
Mar 20 08:44:41.015019 master-0 kubenswrapper[18707]: I0320 08:44:41.015022 18707 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="5de11809fbb3db5b6981fddb634a5dbf7f162fcbe9eede8cb63026b2ff7e2a3e" exitCode=2
Mar 20 08:44:44.070484 master-0 kubenswrapper[18707]: I0320 08:44:44.070422 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerStarted","Data":"f242c0b5af48bb6305b45e0ed0a80c9b6a7707b2746522996f3c16ec946731bb"}
Mar 20 08:44:44.070484 master-0 kubenswrapper[18707]: I0320 08:44:44.070480 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerStarted","Data":"c19e6fb31c3a39d696985e46f9a3295932d6dbe5fa5c9082521702bd7df5a351"}
Mar 20 08:44:44.076393 master-0 kubenswrapper[18707]: I0320 08:44:44.076330 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerStarted","Data":"1e47e53c73caf9294d7cc552d435de2bba3335889632bb5e47c9437fae1ad38a"}
Mar 20 08:44:44.076464 master-0 kubenswrapper[18707]: I0320 08:44:44.076399 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerStarted","Data":"cbc84cc16fdf36c19a2ec8ddb8d8c567dd3a2aefe284ff96c39ef8fc1be8eb8d"}
Mar 20 08:44:44.787447 master-0 kubenswrapper[18707]: I0320 08:44:44.787357 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-646d59cb8b-622bl"
Mar 20 08:44:44.787819 master-0 kubenswrapper[18707]: I0320 08:44:44.787470 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-646d59cb8b-622bl"
Mar 20 08:44:45.092010 master-0 kubenswrapper[18707]: I0320 08:44:45.091906 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerStarted","Data":"9109a59c4f2c870fa24f5897d11d6ce0d82c7037a7c29d3008301ef58dbfb30e"}
Mar 20 08:44:45.092010 master-0 kubenswrapper[18707]: I0320 08:44:45.091980 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerStarted","Data":"ef595b86616a9d59758e3bb52cb2c40900df30c7a5c7962dd70066b8351abfcc"}
Mar 20 08:44:45.092010 master-0 kubenswrapper[18707]: I0320 08:44:45.091992 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerStarted","Data":"a1bf05287a33269822d24c09f055433696b780b4f7f403c232dda246f6cacc28"}
Mar 20 08:44:45.092010 master-0 kubenswrapper[18707]: I0320 08:44:45.092002 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerStarted","Data":"bf080bd2aa0228ff67560c7f40f2f04126f362d98ad14ce5d016639c7975d354"}
Mar 20 08:44:45.103736 master-0 kubenswrapper[18707]: I0320 08:44:45.103664 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerStarted","Data":"27cbcc417567b245b2900410177a926ad00e08c6422a5beeefb377e30ef77b61"}
Mar 20 08:44:45.103736 master-0 kubenswrapper[18707]: I0320 08:44:45.103728 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerStarted","Data":"7c9e4dd3063d39c7cb9b18d7a354e3435c2d2d5ec1986c53b7b85c908a8d1153"}
Mar 20 08:44:45.103736 master-0 kubenswrapper[18707]: I0320 08:44:45.103747 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerStarted","Data":"44284a8b77568b923b8f2d7a551bb6b8408c54a850ac6d39be44e107e4b4a043"}
Mar 20 08:44:45.104123 master-0 kubenswrapper[18707]: I0320 08:44:45.103764 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerStarted","Data":"87d8402979c175f019cec0c6457191a23d5e33345827dd54236af9fd1ae9b380"}
Mar 20 08:44:49.029784 master-0 kubenswrapper[18707]: I0320 08:44:49.029644 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:44:50.449404 master-0 kubenswrapper[18707]: I0320 08:44:50.449276 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-zbpk4\" (UID: \"044870dd-540a-402e-84cb-fa1bf3d6a318\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"
Mar 20 08:44:50.455513 master-0 kubenswrapper[18707]: I0320 08:44:50.455439 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/044870dd-540a-402e-84cb-fa1bf3d6a318-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-zbpk4\" (UID: \"044870dd-540a-402e-84cb-fa1bf3d6a318\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"
Mar 20 08:44:50.557773 master-0 kubenswrapper[18707]: I0320 08:44:50.557679 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"
Mar 20 08:44:52.360819 master-0 kubenswrapper[18707]: E0320 08:44:52.360661 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 08:44:54.188214 master-0 kubenswrapper[18707]: I0320 08:44:54.188101 18707 generic.go:334] "Generic (PLEG): container finished" podID="3eda9567-712b-4541-9344-a333e7734fed" containerID="6f05dd6a29969585010f10aa13bff6ed73728734772cadb9dab99cd2906be079" exitCode=0
Mar 20 08:44:54.189011 master-0 kubenswrapper[18707]: I0320 08:44:54.188291 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"3eda9567-712b-4541-9344-a333e7734fed","Type":"ContainerDied","Data":"6f05dd6a29969585010f10aa13bff6ed73728734772cadb9dab99cd2906be079"}
Mar 20 08:44:54.192598 master-0 kubenswrapper[18707]: I0320 08:44:54.192531 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log"
Mar 20 08:44:54.195484 master-0 kubenswrapper[18707]: I0320 08:44:54.195424 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/0.log"
Mar 20 08:44:54.195658 master-0 kubenswrapper[18707]: I0320 08:44:54.195509 18707 generic.go:334] "Generic (PLEG): container finished" podID="2028761b8522f874dcebf13c4683d033" containerID="a800488fb62a072a6848e70ff9d43658046c14f0dd3e6ca8078acf4e79046779" exitCode=1
Mar 20 08:44:54.195658 master-0 kubenswrapper[18707]: I0320 08:44:54.195566 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerDied","Data":"a800488fb62a072a6848e70ff9d43658046c14f0dd3e6ca8078acf4e79046779"}
Mar 20 08:44:54.195803 master-0 kubenswrapper[18707]: I0320 08:44:54.195696 18707 scope.go:117] "RemoveContainer" containerID="fe1c54a0e0873f0780c7faf59fe96cab185ee928520426321b3ddc92f25a0204"
Mar 20 08:44:54.196557 master-0 kubenswrapper[18707]: I0320 08:44:54.196508 18707 scope.go:117] "RemoveContainer" containerID="a800488fb62a072a6848e70ff9d43658046c14f0dd3e6ca8078acf4e79046779"
Mar 20 08:44:54.196933 master-0 kubenswrapper[18707]: E0320 08:44:54.196879 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2028761b8522f874dcebf13c4683d033)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033"
Mar 20 08:44:55.217811 master-0 kubenswrapper[18707]: I0320 08:44:55.217701 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log"
Mar 20 08:44:55.232636 master-0 kubenswrapper[18707]: I0320 08:44:55.232554 18707 scope.go:117] "RemoveContainer" containerID="8cdf7ffa9625537bd484b3cd72f3ca62a1fbd66303b800564461ec0e3e2735c7"
Mar 20 08:44:55.636058 master-0 kubenswrapper[18707]: I0320 08:44:55.635995 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Mar 20 08:44:55.773249 master-0 kubenswrapper[18707]: I0320 08:44:55.769038 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eda9567-712b-4541-9344-a333e7734fed-kubelet-dir\") pod \"3eda9567-712b-4541-9344-a333e7734fed\" (UID: \"3eda9567-712b-4541-9344-a333e7734fed\") "
Mar 20 08:44:55.773249 master-0 kubenswrapper[18707]: I0320 08:44:55.769203 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eda9567-712b-4541-9344-a333e7734fed-kube-api-access\") pod \"3eda9567-712b-4541-9344-a333e7734fed\" (UID: \"3eda9567-712b-4541-9344-a333e7734fed\") "
Mar 20 08:44:55.773249 master-0 kubenswrapper[18707]: I0320 08:44:55.769229 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3eda9567-712b-4541-9344-a333e7734fed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3eda9567-712b-4541-9344-a333e7734fed" (UID: "3eda9567-712b-4541-9344-a333e7734fed"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:44:55.773249 master-0 kubenswrapper[18707]: I0320 08:44:55.769360 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3eda9567-712b-4541-9344-a333e7734fed-var-lock\") pod \"3eda9567-712b-4541-9344-a333e7734fed\" (UID: \"3eda9567-712b-4541-9344-a333e7734fed\") "
Mar 20 08:44:55.773249 master-0 kubenswrapper[18707]: I0320 08:44:55.769480 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3eda9567-712b-4541-9344-a333e7734fed-var-lock" (OuterVolumeSpecName: "var-lock") pod "3eda9567-712b-4541-9344-a333e7734fed" (UID: "3eda9567-712b-4541-9344-a333e7734fed"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:44:55.773249 master-0 kubenswrapper[18707]: I0320 08:44:55.769926 18707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eda9567-712b-4541-9344-a333e7734fed-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:44:55.773249 master-0 kubenswrapper[18707]: I0320 08:44:55.769953 18707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3eda9567-712b-4541-9344-a333e7734fed-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 20 08:44:55.774817 master-0 kubenswrapper[18707]: I0320 08:44:55.774731 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eda9567-712b-4541-9344-a333e7734fed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3eda9567-712b-4541-9344-a333e7734fed" (UID: "3eda9567-712b-4541-9344-a333e7734fed"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:44:55.872307 master-0 kubenswrapper[18707]: I0320 08:44:55.871700 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eda9567-712b-4541-9344-a333e7734fed-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 20 08:44:56.235083 master-0 kubenswrapper[18707]: I0320 08:44:56.234890 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"3eda9567-712b-4541-9344-a333e7734fed","Type":"ContainerDied","Data":"6a965a004a799a7227efab7ec68c05de7401e629ea703cff56c404bc2ecb8d83"}
Mar 20 08:44:56.235083 master-0 kubenswrapper[18707]: I0320 08:44:56.234959 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a965a004a799a7227efab7ec68c05de7401e629ea703cff56c404bc2ecb8d83"
Mar 20 08:44:56.235083 master-0 kubenswrapper[18707]: I0320 08:44:56.235006 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Mar 20 08:44:59.511000 master-0 kubenswrapper[18707]: I0320 08:44:59.510893 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:44:59.512094 master-0 kubenswrapper[18707]: I0320 08:44:59.512050 18707 scope.go:117] "RemoveContainer" containerID="a800488fb62a072a6848e70ff9d43658046c14f0dd3e6ca8078acf4e79046779"
Mar 20 08:44:59.512504 master-0 kubenswrapper[18707]: E0320 08:44:59.512454 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2028761b8522f874dcebf13c4683d033)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033"
Mar 20 08:45:01.290576 master-0 kubenswrapper[18707]: I0320 08:45:01.290463 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:45:01.291694 master-0 kubenswrapper[18707]: I0320 08:45:01.291649 18707 scope.go:117] "RemoveContainer" containerID="a800488fb62a072a6848e70ff9d43658046c14f0dd3e6ca8078acf4e79046779"
Mar 20 08:45:01.292294 master-0 kubenswrapper[18707]: E0320 08:45:01.292244 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2028761b8522f874dcebf13c4683d033)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033"
Mar 20 08:45:02.361994 master-0 kubenswrapper[18707]: E0320 08:45:02.361825 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 08:45:02.759879 master-0 kubenswrapper[18707]: I0320 08:45:02.759634 18707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:45:02.760558 master-0 kubenswrapper[18707]: I0320 08:45:02.760503 18707 scope.go:117] "RemoveContainer" containerID="a800488fb62a072a6848e70ff9d43658046c14f0dd3e6ca8078acf4e79046779"
Mar 20 08:45:02.761060 master-0 kubenswrapper[18707]: E0320 08:45:02.761001 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2028761b8522f874dcebf13c4683d033)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033"
Mar 20 08:45:04.800402 master-0 kubenswrapper[18707]: I0320 08:45:04.799337 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-646d59cb8b-622bl"
Mar 20 08:45:04.810422 master-0 kubenswrapper[18707]: I0320 08:45:04.810368 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-646d59cb8b-622bl"
Mar 20 08:45:05.392031 master-0 kubenswrapper[18707]: E0320 08:45:05.391732 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:44:55Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:44:55Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:44:55Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:44:55Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:6bec6e4ce9b3ff60658829df2f5980cf947458d49b97476cee1ff01ec638d309\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:b259d760a14ca994ed34d4cfc901758f180cc8d1de7f3c427baa68030e06b7c7\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252654287},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketpl
ace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:542b8eae269892bc6f0f9d5ab808afe26119a72e430870d81f59faf93c3f2f18\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d18953a151bb71fe6585017054b939c4062435242069a2601fe668009e6e5087\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223740630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f3038df8df25746bb5095296d4e5740f2356f85c1ed8d43f1b3d281e294826e5\\\"],\\\"sizeBytes\\\":605698193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc1
41ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a746a87b784ea1caa278fd0e012554f9df520b6fff665ea0bc4c83f487fed113\\\"],\\\"sizeBytes\\\":484450894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\
"],\\\"sizeBytes\\\":484187929}]}}\" for node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (patch nodes master-0)"
Mar 20 08:45:10.411452 master-0 kubenswrapper[18707]: I0320 08:45:10.411384 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log"
Mar 20 08:45:10.412296 master-0 kubenswrapper[18707]: I0320 08:45:10.412242 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log"
Mar 20 08:45:10.413054 master-0 kubenswrapper[18707]: I0320 08:45:10.413023 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd/0.log"
Mar 20 08:45:10.413427 master-0 kubenswrapper[18707]: I0320 08:45:10.413399 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcdctl/0.log"
Mar 20 08:45:10.414377 master-0 kubenswrapper[18707]: I0320 08:45:10.414338 18707 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="91928dd4bf037a74fc3110c950269e5b4ae8998e3616107aa1170ce1d3fede55" exitCode=137
Mar 20 08:45:10.414377 master-0 kubenswrapper[18707]: I0320 08:45:10.414368 18707 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="c0c0eafff8c825fc9c4a32593e8d54d61ac68f27a7fde59d8dfb857aeb1580f0" exitCode=137
Mar 20 08:45:10.498359 master-0 kubenswrapper[18707]: I0320 08:45:10.498266 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log"
Mar 20 08:45:10.499561 master-0 kubenswrapper[18707]: I0320 08:45:10.499503 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log"
Mar 20 08:45:10.501073 master-0 kubenswrapper[18707]: I0320 08:45:10.501027 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd/0.log"
Mar 20 08:45:10.501686 master-0 kubenswrapper[18707]: I0320 08:45:10.501665 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcdctl/0.log"
Mar 20 08:45:10.502982 master-0 kubenswrapper[18707]: I0320 08:45:10.502952 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0"
Mar 20 08:45:10.524849 master-0 kubenswrapper[18707]: I0320 08:45:10.524785 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") "
Mar 20 08:45:10.525147 master-0 kubenswrapper[18707]: I0320 08:45:10.524867 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") "
Mar 20 08:45:10.525147 master-0 kubenswrapper[18707]: I0320 08:45:10.524933 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") "
Mar 20 08:45:10.525147 master-0 kubenswrapper[18707]: I0320 08:45:10.525000 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") "
Mar 20 08:45:10.525147 master-0 kubenswrapper[18707]: I0320 08:45:10.525004 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "static-pod-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:45:10.525147 master-0 kubenswrapper[18707]: I0320 08:45:10.525057 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") "
Mar 20 08:45:10.525147 master-0 kubenswrapper[18707]: I0320 08:45:10.525082 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "usr-local-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:45:10.525147 master-0 kubenswrapper[18707]: I0320 08:45:10.525120 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") "
Mar 20 08:45:10.525375 master-0 kubenswrapper[18707]: I0320 08:45:10.525099 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir" (OuterVolumeSpecName: "data-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:45:10.525375 master-0 kubenswrapper[18707]: I0320 08:45:10.525099 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:45:10.525375 master-0 kubenswrapper[18707]: I0320 08:45:10.525144 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir" (OuterVolumeSpecName: "log-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:45:10.525375 master-0 kubenswrapper[18707]: I0320 08:45:10.525198 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:45:10.525942 master-0 kubenswrapper[18707]: I0320 08:45:10.525909 18707 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:45:10.525942 master-0 kubenswrapper[18707]: I0320 08:45:10.525935 18707 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:45:10.525942 master-0 kubenswrapper[18707]: I0320 08:45:10.525944 18707 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:45:10.526058 master-0 kubenswrapper[18707]: I0320 08:45:10.525954 18707 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") on node \"master-0\" DevicePath \"\""
Mar 20 08:45:10.526058 master-0 kubenswrapper[18707]: I0320 08:45:10.525964 18707 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:45:10.526058 master-0 kubenswrapper[18707]: I0320 08:45:10.525973 18707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:45:10.835063 master-0 kubenswrapper[18707]: I0320 08:45:10.834965 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 20 08:45:10.835485 master-0 kubenswrapper[18707]: E0320 08:45:10.835443 18707 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:45:10.835568 master-0 kubenswrapper[18707]: E0320 08:45:10.835490 18707 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:45:10.835646 master-0 kubenswrapper[18707]: E0320 08:45:10.835568 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access podName:d245e5b2-a30d-45c8-9b79-6e8096765c14 nodeName:}" failed. No retries permitted until 2026-03-20 08:47:12.835540646 +0000 UTC m=+377.991721002 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access") pod "installer-3-master-0" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 20 08:45:11.104861 master-0 kubenswrapper[18707]: I0320 08:45:11.104654 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b4ed170d527099878cb5fdd508a2fb" path="/var/lib/kubelet/pods/24b4ed170d527099878cb5fdd508a2fb/volumes"
Mar 20 08:45:11.424241 master-0 kubenswrapper[18707]: I0320 08:45:11.424058 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log"
Mar 20 08:45:11.425492 master-0 kubenswrapper[18707]: I0320 08:45:11.425442 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log"
Mar 20 08:45:11.426248 master-0 kubenswrapper[18707]: I0320 08:45:11.426217 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd/0.log"
Mar 20 08:45:11.426860 master-0 kubenswrapper[18707]: I0320 08:45:11.426837 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcdctl/0.log"
Mar 20 08:45:11.428058 master-0 kubenswrapper[18707]: I0320 08:45:11.428031 18707 scope.go:117] "RemoveContainer" containerID="336ee2eca239c702b23f8eadc224486f445c6fd4853f373a73d423d9f64cfcac"
Mar 20 08:45:11.428306 master-0 kubenswrapper[18707]: I0320 08:45:11.428278 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 20 08:45:11.452671 master-0 kubenswrapper[18707]: I0320 08:45:11.452603 18707 scope.go:117] "RemoveContainer" containerID="267f2d3e5624276bc815692d6f63750c35fab88bf4fad9637c60210f294ab470" Mar 20 08:45:11.470272 master-0 kubenswrapper[18707]: I0320 08:45:11.470219 18707 scope.go:117] "RemoveContainer" containerID="5de11809fbb3db5b6981fddb634a5dbf7f162fcbe9eede8cb63026b2ff7e2a3e" Mar 20 08:45:11.483769 master-0 kubenswrapper[18707]: I0320 08:45:11.483730 18707 scope.go:117] "RemoveContainer" containerID="91928dd4bf037a74fc3110c950269e5b4ae8998e3616107aa1170ce1d3fede55" Mar 20 08:45:11.508690 master-0 kubenswrapper[18707]: I0320 08:45:11.508633 18707 scope.go:117] "RemoveContainer" containerID="c0c0eafff8c825fc9c4a32593e8d54d61ac68f27a7fde59d8dfb857aeb1580f0" Mar 20 08:45:11.528847 master-0 kubenswrapper[18707]: I0320 08:45:11.528798 18707 scope.go:117] "RemoveContainer" containerID="a2139218314ea5d5d1e04c37be758e7a9f90c106dd3c470737be6550fb6322a9" Mar 20 08:45:11.544696 master-0 kubenswrapper[18707]: I0320 08:45:11.544652 18707 scope.go:117] "RemoveContainer" containerID="31b5815996c66a028a6e102943aed8dd0cbf1cb918ec3a5b728d9fb0cb098506" Mar 20 08:45:11.561741 master-0 kubenswrapper[18707]: I0320 08:45:11.561681 18707 scope.go:117] "RemoveContainer" containerID="91045cb8c13e35ca1f0bfb21ba636da24cd41b91eea8db817a9a5a02317192b3" Mar 20 08:45:12.367761 master-0 kubenswrapper[18707]: E0320 08:45:12.362535 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:45:13.925672 master-0 kubenswrapper[18707]: E0320 08:45:13.925473 18707 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" 
event="&Event{ObjectMeta:{etcd-master-0.189e8037af94ce31 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Killing,Message:Stopping container etcd-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:44:39.887875633 +0000 UTC m=+225.044055999,LastTimestamp:2026-03-20 08:44:39.887875633 +0000 UTC m=+225.044055999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:45:15.393636 master-0 kubenswrapper[18707]: E0320 08:45:15.393455 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:45:16.094038 master-0 kubenswrapper[18707]: I0320 08:45:16.093942 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 20 08:45:16.094979 master-0 kubenswrapper[18707]: I0320 08:45:16.094915 18707 scope.go:117] "RemoveContainer" containerID="a800488fb62a072a6848e70ff9d43658046c14f0dd3e6ca8078acf4e79046779" Mar 20 08:45:16.119504 master-0 kubenswrapper[18707]: I0320 08:45:16.119448 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:45:16.119504 master-0 kubenswrapper[18707]: I0320 08:45:16.119499 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:45:17.485787 master-0 kubenswrapper[18707]: I0320 08:45:17.485653 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log" Mar 20 08:45:17.487492 master-0 kubenswrapper[18707]: I0320 08:45:17.487414 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"871676bb611c7b17e30bacdec9ff5c25cf4c40cdada2c8d2a4f54b77ebf11820"} Mar 20 08:45:19.511114 master-0 kubenswrapper[18707]: I0320 08:45:19.510960 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:45:21.290099 master-0 kubenswrapper[18707]: I0320 08:45:21.289988 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:45:21.297439 master-0 kubenswrapper[18707]: I0320 08:45:21.297373 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:45:22.200436 master-0 
kubenswrapper[18707]: E0320 08:45:22.200291 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[trusted-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" podUID="348f3880-793f-43e4-9de1-8511626d2552" Mar 20 08:45:22.369234 master-0 kubenswrapper[18707]: E0320 08:45:22.368043 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:45:22.548463 master-0 kubenswrapper[18707]: I0320 08:45:22.548412 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:45:25.394307 master-0 kubenswrapper[18707]: E0320 08:45:25.394093 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:45:25.540772 master-0 kubenswrapper[18707]: I0320 08:45:25.540657 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca\") pod \"console-operator-76b6568d85-8b8gv\" (UID: \"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:45:25.543829 master-0 kubenswrapper[18707]: I0320 08:45:25.543759 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/348f3880-793f-43e4-9de1-8511626d2552-trusted-ca\") pod \"console-operator-76b6568d85-8b8gv\" (UID: 
\"348f3880-793f-43e4-9de1-8511626d2552\") " pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:45:25.552326 master-0 kubenswrapper[18707]: I0320 08:45:25.552261 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-whhgj" Mar 20 08:45:25.560259 master-0 kubenswrapper[18707]: I0320 08:45:25.560152 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:45:29.527223 master-0 kubenswrapper[18707]: I0320 08:45:29.527142 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:45:32.372551 master-0 kubenswrapper[18707]: E0320 08:45:32.372234 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:45:32.372551 master-0 kubenswrapper[18707]: I0320 08:45:32.372341 18707 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 08:45:35.395128 master-0 kubenswrapper[18707]: E0320 08:45:35.394980 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:45:35.683546 master-0 kubenswrapper[18707]: I0320 08:45:35.683338 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-6t5vb_fb0fc10f-5796-4cd5-b8f5-72d678054c24/approver/1.log" Mar 20 08:45:35.684522 master-0 kubenswrapper[18707]: I0320 08:45:35.684462 18707 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-6t5vb_fb0fc10f-5796-4cd5-b8f5-72d678054c24/approver/0.log" Mar 20 08:45:35.685174 master-0 kubenswrapper[18707]: I0320 08:45:35.685094 18707 generic.go:334] "Generic (PLEG): container finished" podID="fb0fc10f-5796-4cd5-b8f5-72d678054c24" containerID="b38f5242ee8fec0a4fb77638e3088bf483c8bc44e65e9e4b954af76f0ae77a90" exitCode=1 Mar 20 08:45:35.685317 master-0 kubenswrapper[18707]: I0320 08:45:35.685178 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-6t5vb" event={"ID":"fb0fc10f-5796-4cd5-b8f5-72d678054c24","Type":"ContainerDied","Data":"b38f5242ee8fec0a4fb77638e3088bf483c8bc44e65e9e4b954af76f0ae77a90"} Mar 20 08:45:35.685387 master-0 kubenswrapper[18707]: I0320 08:45:35.685365 18707 scope.go:117] "RemoveContainer" containerID="11fb9901a3c95280d4ada671324191ca15e473462e8fdb0e196a3a680e2d44a1" Mar 20 08:45:35.686463 master-0 kubenswrapper[18707]: I0320 08:45:35.686430 18707 scope.go:117] "RemoveContainer" containerID="b38f5242ee8fec0a4fb77638e3088bf483c8bc44e65e9e4b954af76f0ae77a90" Mar 20 08:45:36.697096 master-0 kubenswrapper[18707]: I0320 08:45:36.697030 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-6t5vb_fb0fc10f-5796-4cd5-b8f5-72d678054c24/approver/1.log" Mar 20 08:45:36.697768 master-0 kubenswrapper[18707]: I0320 08:45:36.697626 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-6t5vb" event={"ID":"fb0fc10f-5796-4cd5-b8f5-72d678054c24","Type":"ContainerStarted","Data":"3d5da0c7abbc20eb8566ee511210d40c05a21752a54c1073c35f19c94e5da815"} Mar 20 08:45:39.028032 master-0 kubenswrapper[18707]: I0320 08:45:39.027908 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:45:39.062541 
master-0 kubenswrapper[18707]: I0320 08:45:39.062493 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:45:39.751843 master-0 kubenswrapper[18707]: I0320 08:45:39.751759 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:45:39.995145 master-0 kubenswrapper[18707]: I0320 08:45:39.995043 18707 status_manager.go:851] "Failed to get status for pod" podUID="24b4ed170d527099878cb5fdd508a2fb" pod="openshift-etcd/etcd-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods etcd-master-0)" Mar 20 08:45:42.374387 master-0 kubenswrapper[18707]: E0320 08:45:42.373481 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 20 08:45:45.396312 master-0 kubenswrapper[18707]: E0320 08:45:45.396068 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:45:45.396312 master-0 kubenswrapper[18707]: E0320 08:45:45.396174 18707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 08:45:47.929438 master-0 kubenswrapper[18707]: E0320 08:45:47.929159 18707 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e8037af94f1f9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:44:39.887884793 +0000 UTC m=+225.044065159,LastTimestamp:2026-03-20 08:44:39.887884793 +0000 UTC m=+225.044065159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:45:50.123094 master-0 kubenswrapper[18707]: E0320 08:45:50.123012 18707 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 20 08:45:50.124815 master-0 kubenswrapper[18707]: I0320 08:45:50.124783 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 20 08:45:50.163862 master-0 kubenswrapper[18707]: W0320 08:45:50.163672 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod094204df314fe45bd5af12ca1b4622bb.slice/crio-4aa4a4d72199d3bbe04246ec7a6b9f4339ed584cbe48bc063ddae99b9e414f1c WatchSource:0}: Error finding container 4aa4a4d72199d3bbe04246ec7a6b9f4339ed584cbe48bc063ddae99b9e414f1c: Status 404 returned error can't find the container with id 4aa4a4d72199d3bbe04246ec7a6b9f4339ed584cbe48bc063ddae99b9e414f1c Mar 20 08:45:50.825251 master-0 kubenswrapper[18707]: I0320 08:45:50.825136 18707 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="b520dba1277bed7af89823ffe67cdbf645d8d190ce4d6a9e23dddfbeb2824653" exitCode=0 Mar 20 08:45:50.825562 master-0 kubenswrapper[18707]: I0320 08:45:50.825256 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"b520dba1277bed7af89823ffe67cdbf645d8d190ce4d6a9e23dddfbeb2824653"} Mar 20 08:45:50.825562 master-0 kubenswrapper[18707]: I0320 08:45:50.825392 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"4aa4a4d72199d3bbe04246ec7a6b9f4339ed584cbe48bc063ddae99b9e414f1c"} Mar 20 08:45:50.825935 master-0 kubenswrapper[18707]: I0320 08:45:50.825879 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:45:50.825935 master-0 kubenswrapper[18707]: I0320 08:45:50.825921 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:45:51.369279 master-0 kubenswrapper[18707]: E0320 08:45:51.369118 18707 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 08:45:51.369279 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(e31283cad84b23fb487e1c8a841943f36f80924391a43b0ed4cdfedfd609699e): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e31283cad84b23fb487e1c8a841943f36f80924391a43b0ed4cdfedfd609699e" Netns:"/var/run/netns/a27c34f3-2026-4282-ad33-a758b28fd8b5" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=e31283cad84b23fb487e1c8a841943f36f80924391a43b0ed4cdfedfd609699e;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:45:51.369279 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:45:51.369279 master-0 kubenswrapper[18707]: > Mar 20 08:45:51.370030 master-0 kubenswrapper[18707]: E0320 08:45:51.369311 18707 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 08:45:51.370030 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(e31283cad84b23fb487e1c8a841943f36f80924391a43b0ed4cdfedfd609699e): error adding pod 
openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e31283cad84b23fb487e1c8a841943f36f80924391a43b0ed4cdfedfd609699e" Netns:"/var/run/netns/a27c34f3-2026-4282-ad33-a758b28fd8b5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=e31283cad84b23fb487e1c8a841943f36f80924391a43b0ed4cdfedfd609699e;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:45:51.370030 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:45:51.370030 master-0 kubenswrapper[18707]: > pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:45:51.370030 
master-0 kubenswrapper[18707]: E0320 08:45:51.369343 18707 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 08:45:51.370030 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(e31283cad84b23fb487e1c8a841943f36f80924391a43b0ed4cdfedfd609699e): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e31283cad84b23fb487e1c8a841943f36f80924391a43b0ed4cdfedfd609699e" Netns:"/var/run/netns/a27c34f3-2026-4282-ad33-a758b28fd8b5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=e31283cad84b23fb487e1c8a841943f36f80924391a43b0ed4cdfedfd609699e;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:45:51.370030 master-0 kubenswrapper[18707]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:45:51.370030 master-0 kubenswrapper[18707]: > pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:45:51.370030 master-0 kubenswrapper[18707]: E0320 08:45:51.369422 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console(044870dd-540a-402e-84cb-fa1bf3d6a318)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console(044870dd-540a-402e-84cb-fa1bf3d6a318)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(e31283cad84b23fb487e1c8a841943f36f80924391a43b0ed4cdfedfd609699e): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"e31283cad84b23fb487e1c8a841943f36f80924391a43b0ed4cdfedfd609699e\\\" Netns:\\\"/var/run/netns/a27c34f3-2026-4282-ad33-a758b28fd8b5\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=e31283cad84b23fb487e1c8a841943f36f80924391a43b0ed4cdfedfd609699e;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318\\\" Path:\\\"\\\" ERRORED: error configuring pod 
[openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" podUID="044870dd-540a-402e-84cb-fa1bf3d6a318" Mar 20 08:45:51.839583 master-0 kubenswrapper[18707]: I0320 08:45:51.839484 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:45:51.840451 master-0 kubenswrapper[18707]: I0320 08:45:51.840405 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:45:52.575146 master-0 kubenswrapper[18707]: E0320 08:45:52.575031 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Mar 20 08:45:55.106739 master-0 kubenswrapper[18707]: I0320 08:45:55.106645 18707 kubelet.go:1505] "Image garbage collection succeeded" Mar 20 08:45:55.318535 master-0 kubenswrapper[18707]: I0320 08:45:55.318426 18707 scope.go:117] "RemoveContainer" containerID="6e31068727643e077f1c9461b5883b919e163a79d9088735e4c5d39688c47867" Mar 20 08:45:55.346068 master-0 kubenswrapper[18707]: I0320 08:45:55.345995 18707 scope.go:117] "RemoveContainer" containerID="73a7f9993a52ad274232661b06d25f3b18e0675faed4b301aeb4072dcc7cfa79" Mar 20 08:45:55.373827 master-0 kubenswrapper[18707]: I0320 08:45:55.373750 18707 scope.go:117] "RemoveContainer" containerID="b43c5f4dbc5493b32b9934371a1875a8e1d7c69940c30587bfa291adee73b603" Mar 20 08:45:55.397526 master-0 kubenswrapper[18707]: I0320 08:45:55.397483 18707 scope.go:117] "RemoveContainer" containerID="230d37232882904f1764e96ed6057bf568baed29ff892b892e215ce87e945710" Mar 20 08:46:02.977202 master-0 kubenswrapper[18707]: E0320 08:46:02.977087 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 20 08:46:05.506641 master-0 kubenswrapper[18707]: E0320 08:46:05.506325 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:45:55Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:45:55Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:45:55Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:45:55Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:6bec6e4ce9b3ff60658829df2f5980cf947458d49b97476cee1ff01ec638d309\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:b259d760a14ca994ed34d4cfc901758f180cc8d1de7f3c427baa68030e06b7c7\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252654287},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketpl
ace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:542b8eae269892bc6f0f9d5ab808afe26119a72e430870d81f59faf93c3f2f18\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d18953a151bb71fe6585017054b939c4062435242069a2601fe668009e6e5087\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223740630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f3038df8df25746bb5095296d4e5740f2356f85c1ed8d43f1b3d281e294826e5\\\"],\\\"sizeBytes\\\":605698193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc1
41ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a746a87b784ea1caa278fd0e012554f9df520b6fff665ea0bc4c83f487fed113\\\"],\\\"sizeBytes\\\":484450894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\
"],\\\"sizeBytes\\\":484187929}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:46:13.779238 master-0 kubenswrapper[18707]: E0320 08:46:13.779063 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 20 08:46:15.507497 master-0 kubenswrapper[18707]: E0320 08:46:15.507408 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded" Mar 20 08:46:17.090344 master-0 kubenswrapper[18707]: I0320 08:46:17.090249 18707 generic.go:334] "Generic (PLEG): container finished" podID="acb704a9-6c8d-4378-ae93-e7095b1fce85" containerID="4edf8229d20f78af8d6c9bd895781409ea67ff831e7f63eef258c5dd26b7d38d" exitCode=0 Mar 20 08:46:17.091246 master-0 kubenswrapper[18707]: I0320 08:46:17.090348 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" event={"ID":"acb704a9-6c8d-4378-ae93-e7095b1fce85","Type":"ContainerDied","Data":"4edf8229d20f78af8d6c9bd895781409ea67ff831e7f63eef258c5dd26b7d38d"} Mar 20 08:46:17.091246 master-0 kubenswrapper[18707]: I0320 08:46:17.091130 18707 scope.go:117] "RemoveContainer" 
containerID="4edf8229d20f78af8d6c9bd895781409ea67ff831e7f63eef258c5dd26b7d38d" Mar 20 08:46:18.103996 master-0 kubenswrapper[18707]: I0320 08:46:18.103903 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" event={"ID":"acb704a9-6c8d-4378-ae93-e7095b1fce85","Type":"ContainerStarted","Data":"d5730ce99ce84daa469646e6254d205d4c489fad363bcb80a9e011d6d98bb833"} Mar 20 08:46:18.105293 master-0 kubenswrapper[18707]: I0320 08:46:18.104498 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:46:18.106735 master-0 kubenswrapper[18707]: I0320 08:46:18.106652 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:46:21.934125 master-0 kubenswrapper[18707]: E0320 08:46:21.933854 18707 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e8037af95ac94 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Killing,Message:Stopping container etcd-metrics,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:44:39.887932564 +0000 UTC m=+225.044112990,LastTimestamp:2026-03-20 08:44:39.887932564 +0000 UTC m=+225.044112990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:46:24.828910 master-0 kubenswrapper[18707]: E0320 08:46:24.828823 18707 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested 
timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 20 08:46:25.177609 master-0 kubenswrapper[18707]: I0320 08:46:25.177534 18707 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="54cc77388445ddd21bedcde65cfa120cebfccbcf552eb48d113f3355a5725832" exitCode=0 Mar 20 08:46:25.177863 master-0 kubenswrapper[18707]: I0320 08:46:25.177622 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"54cc77388445ddd21bedcde65cfa120cebfccbcf552eb48d113f3355a5725832"} Mar 20 08:46:25.178370 master-0 kubenswrapper[18707]: I0320 08:46:25.178336 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:46:25.178421 master-0 kubenswrapper[18707]: I0320 08:46:25.178370 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:46:25.381020 master-0 kubenswrapper[18707]: E0320 08:46:25.380720 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 20 08:46:25.508388 master-0 kubenswrapper[18707]: E0320 08:46:25.507921 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:46:26.449263 master-0 kubenswrapper[18707]: E0320 08:46:26.449150 18707 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 08:46:26.449263 master-0 kubenswrapper[18707]: rpc error: code = 
Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(ca8576e8d30903a0ab02eabeb68a25c4d1075706f6eae8f9d699f9069d1ab96b): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ca8576e8d30903a0ab02eabeb68a25c4d1075706f6eae8f9d699f9069d1ab96b" Netns:"/var/run/netns/9c7752cf-7a04-422a-882d-f3da023fd279" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=ca8576e8d30903a0ab02eabeb68a25c4d1075706f6eae8f9d699f9069d1ab96b;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552" Path:"" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:46:26.449263 master-0 kubenswrapper[18707]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:46:26.449263 master-0 kubenswrapper[18707]: > Mar 20 08:46:26.450114 master-0 kubenswrapper[18707]: E0320 08:46:26.449319 18707 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 08:46:26.450114 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(ca8576e8d30903a0ab02eabeb68a25c4d1075706f6eae8f9d699f9069d1ab96b): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ca8576e8d30903a0ab02eabeb68a25c4d1075706f6eae8f9d699f9069d1ab96b" Netns:"/var/run/netns/9c7752cf-7a04-422a-882d-f3da023fd279" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=ca8576e8d30903a0ab02eabeb68a25c4d1075706f6eae8f9d699f9069d1ab96b;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552" Path:"" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update 
failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:46:26.450114 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:46:26.450114 master-0 kubenswrapper[18707]: > pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:46:26.450114 master-0 kubenswrapper[18707]: E0320 08:46:26.449359 18707 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 08:46:26.450114 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(ca8576e8d30903a0ab02eabeb68a25c4d1075706f6eae8f9d699f9069d1ab96b): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ca8576e8d30903a0ab02eabeb68a25c4d1075706f6eae8f9d699f9069d1ab96b" Netns:"/var/run/netns/9c7752cf-7a04-422a-882d-f3da023fd279" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=ca8576e8d30903a0ab02eabeb68a25c4d1075706f6eae8f9d699f9069d1ab96b;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552" Path:"" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: 
Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:46:26.450114 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:46:26.450114 master-0 kubenswrapper[18707]: > pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:46:26.450114 master-0 kubenswrapper[18707]: E0320 08:46:26.449476 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"console-operator-76b6568d85-8b8gv_openshift-console-operator(348f3880-793f-43e4-9de1-8511626d2552)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"console-operator-76b6568d85-8b8gv_openshift-console-operator(348f3880-793f-43e4-9de1-8511626d2552)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(ca8576e8d30903a0ab02eabeb68a25c4d1075706f6eae8f9d699f9069d1ab96b): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network \\\"multus-cni-network\\\": plugin 
type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"ca8576e8d30903a0ab02eabeb68a25c4d1075706f6eae8f9d699f9069d1ab96b\\\" Netns:\\\"/var/run/netns/9c7752cf-7a04-422a-882d-f3da023fd279\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=ca8576e8d30903a0ab02eabeb68a25c4d1075706f6eae8f9d699f9069d1ab96b;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" podUID="348f3880-793f-43e4-9de1-8511626d2552" Mar 20 08:46:27.196266 master-0 kubenswrapper[18707]: I0320 08:46:27.196168 18707 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:46:27.196834 master-0 kubenswrapper[18707]: I0320 08:46:27.196721 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:46:30.220822 master-0 kubenswrapper[18707]: I0320 08:46:30.220739 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-7vxxr_d88ba8e1-ee42-423f-9839-e71cb0265c6c/config-sync-controllers/0.log" Mar 20 08:46:30.221641 master-0 kubenswrapper[18707]: I0320 08:46:30.221585 18707 generic.go:334] "Generic (PLEG): container finished" podID="d88ba8e1-ee42-423f-9839-e71cb0265c6c" containerID="8849c0e374773ff413e6a07005d70c646b3dbfad2bb39cd593ab7f09dab9e689" exitCode=1 Mar 20 08:46:30.221686 master-0 kubenswrapper[18707]: I0320 08:46:30.221647 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" event={"ID":"d88ba8e1-ee42-423f-9839-e71cb0265c6c","Type":"ContainerDied","Data":"8849c0e374773ff413e6a07005d70c646b3dbfad2bb39cd593ab7f09dab9e689"} Mar 20 08:46:30.222653 master-0 kubenswrapper[18707]: I0320 08:46:30.222604 18707 scope.go:117] "RemoveContainer" containerID="8849c0e374773ff413e6a07005d70c646b3dbfad2bb39cd593ab7f09dab9e689" Mar 20 08:46:31.241673 master-0 kubenswrapper[18707]: I0320 08:46:31.241573 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-7vxxr_d88ba8e1-ee42-423f-9839-e71cb0265c6c/config-sync-controllers/0.log" Mar 20 08:46:31.242711 master-0 kubenswrapper[18707]: I0320 08:46:31.242297 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" event={"ID":"d88ba8e1-ee42-423f-9839-e71cb0265c6c","Type":"ContainerStarted","Data":"025e195895cf008961b708242fe128eb68d4a72344d62de4645948babb88337a"} Mar 20 08:46:33.846637 master-0 kubenswrapper[18707]: I0320 08:46:33.846503 18707 patch_prober.go:28] interesting pod/catalogd-controller-manager-6864dc98f7-74mgr container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.39:8081/healthz\": dial tcp 10.128.0.39:8081: connect: connection refused" start-of-body= Mar 20 08:46:33.847664 master-0 kubenswrapper[18707]: I0320 08:46:33.846639 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" podUID="4fea9b05-222e-4b58-95c8-735fc1cf3a8b" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.39:8081/healthz\": dial tcp 10.128.0.39:8081: connect: connection refused" Mar 20 08:46:34.274690 master-0 kubenswrapper[18707]: I0320 08:46:34.274517 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/0.log" Mar 20 08:46:34.274978 master-0 kubenswrapper[18707]: I0320 08:46:34.274661 18707 generic.go:334] "Generic (PLEG): container finished" podID="96de6024-e20f-4b52-9294-b330d65e4153" containerID="08ddad0132f1249a0926348017f97a2ace0dd09f6ef4f6ff1405a91cd7359f8f" exitCode=1 Mar 20 08:46:34.274978 master-0 kubenswrapper[18707]: I0320 08:46:34.274779 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" event={"ID":"96de6024-e20f-4b52-9294-b330d65e4153","Type":"ContainerDied","Data":"08ddad0132f1249a0926348017f97a2ace0dd09f6ef4f6ff1405a91cd7359f8f"} Mar 20 08:46:34.276100 master-0 kubenswrapper[18707]: I0320 
08:46:34.276052 18707 scope.go:117] "RemoveContainer" containerID="08ddad0132f1249a0926348017f97a2ace0dd09f6ef4f6ff1405a91cd7359f8f" Mar 20 08:46:34.278670 master-0 kubenswrapper[18707]: I0320 08:46:34.278549 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-74mgr_4fea9b05-222e-4b58-95c8-735fc1cf3a8b/manager/0.log" Mar 20 08:46:34.279399 master-0 kubenswrapper[18707]: I0320 08:46:34.279299 18707 generic.go:334] "Generic (PLEG): container finished" podID="4fea9b05-222e-4b58-95c8-735fc1cf3a8b" containerID="92b79031f76eecd271206da72b0a3408ff8ea5659094905a6bd063d6847591cb" exitCode=1 Mar 20 08:46:34.279485 master-0 kubenswrapper[18707]: I0320 08:46:34.279400 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" event={"ID":"4fea9b05-222e-4b58-95c8-735fc1cf3a8b","Type":"ContainerDied","Data":"92b79031f76eecd271206da72b0a3408ff8ea5659094905a6bd063d6847591cb"} Mar 20 08:46:34.281130 master-0 kubenswrapper[18707]: I0320 08:46:34.280378 18707 scope.go:117] "RemoveContainer" containerID="92b79031f76eecd271206da72b0a3408ff8ea5659094905a6bd063d6847591cb" Mar 20 08:46:34.282660 master-0 kubenswrapper[18707]: I0320 08:46:34.282622 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-nk2rf_7e451189-850e-4d19-a40c-40f642d08511/manager/0.log" Mar 20 08:46:34.282715 master-0 kubenswrapper[18707]: I0320 08:46:34.282681 18707 generic.go:334] "Generic (PLEG): container finished" podID="7e451189-850e-4d19-a40c-40f642d08511" containerID="270e7ed792fece0ff9d9a6dbda1ff1ab238d9c5aab177de687ac26e9f4d69fcc" exitCode=1 Mar 20 08:46:34.282768 master-0 kubenswrapper[18707]: I0320 08:46:34.282726 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" 
event={"ID":"7e451189-850e-4d19-a40c-40f642d08511","Type":"ContainerDied","Data":"270e7ed792fece0ff9d9a6dbda1ff1ab238d9c5aab177de687ac26e9f4d69fcc"} Mar 20 08:46:34.283293 master-0 kubenswrapper[18707]: I0320 08:46:34.283179 18707 scope.go:117] "RemoveContainer" containerID="270e7ed792fece0ff9d9a6dbda1ff1ab238d9c5aab177de687ac26e9f4d69fcc" Mar 20 08:46:35.297730 master-0 kubenswrapper[18707]: I0320 08:46:35.297633 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/0.log" Mar 20 08:46:35.298861 master-0 kubenswrapper[18707]: I0320 08:46:35.297797 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" event={"ID":"96de6024-e20f-4b52-9294-b330d65e4153","Type":"ContainerStarted","Data":"cdbe05a6f4402cb697000fd0767650f0ef06aefaed19227820f9e4a92d912881"} Mar 20 08:46:35.301101 master-0 kubenswrapper[18707]: I0320 08:46:35.301021 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-74mgr_4fea9b05-222e-4b58-95c8-735fc1cf3a8b/manager/0.log" Mar 20 08:46:35.301712 master-0 kubenswrapper[18707]: I0320 08:46:35.301648 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" event={"ID":"4fea9b05-222e-4b58-95c8-735fc1cf3a8b","Type":"ContainerStarted","Data":"de9d9afbf098cb9be979968baaf1c43d9196565b38962be92c3212044a002b4a"} Mar 20 08:46:35.301999 master-0 kubenswrapper[18707]: I0320 08:46:35.301945 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:46:35.305316 master-0 kubenswrapper[18707]: I0320 08:46:35.305268 18707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-nk2rf_7e451189-850e-4d19-a40c-40f642d08511/manager/0.log" Mar 20 08:46:35.305468 master-0 kubenswrapper[18707]: I0320 08:46:35.305338 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" event={"ID":"7e451189-850e-4d19-a40c-40f642d08511","Type":"ContainerStarted","Data":"b4f44e0353a8105ed18a82a1c05add6f2a49658b3707d7679a45667cddb4980b"} Mar 20 08:46:35.306306 master-0 kubenswrapper[18707]: I0320 08:46:35.306260 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:46:35.509149 master-0 kubenswrapper[18707]: E0320 08:46:35.509033 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:46:38.581764 master-0 kubenswrapper[18707]: E0320 08:46:38.581609 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Mar 20 08:46:39.996897 master-0 kubenswrapper[18707]: I0320 08:46:39.996768 18707 status_manager.go:851] "Failed to get status for pod" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" pod="openshift-monitoring/alertmanager-main-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods alertmanager-main-0)" Mar 20 08:46:45.510334 master-0 kubenswrapper[18707]: E0320 08:46:45.510178 18707 kubelet_node_status.go:585] "Error updating node 
status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:46:45.510334 master-0 kubenswrapper[18707]: E0320 08:46:45.510317 18707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 08:46:46.436310 master-0 kubenswrapper[18707]: I0320 08:46:46.436228 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-7vxxr_d88ba8e1-ee42-423f-9839-e71cb0265c6c/config-sync-controllers/0.log" Mar 20 08:46:46.437335 master-0 kubenswrapper[18707]: I0320 08:46:46.437143 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-7vxxr_d88ba8e1-ee42-423f-9839-e71cb0265c6c/cluster-cloud-controller-manager/0.log" Mar 20 08:46:46.437335 master-0 kubenswrapper[18707]: I0320 08:46:46.437257 18707 generic.go:334] "Generic (PLEG): container finished" podID="d88ba8e1-ee42-423f-9839-e71cb0265c6c" containerID="40d3c58441549fd94b4fd06f62f9b9e1bdfe941a93f1f046de6cd048124dc220" exitCode=1 Mar 20 08:46:46.437519 master-0 kubenswrapper[18707]: I0320 08:46:46.437312 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" event={"ID":"d88ba8e1-ee42-423f-9839-e71cb0265c6c","Type":"ContainerDied","Data":"40d3c58441549fd94b4fd06f62f9b9e1bdfe941a93f1f046de6cd048124dc220"} Mar 20 08:46:46.438444 master-0 kubenswrapper[18707]: I0320 08:46:46.438399 18707 scope.go:117] "RemoveContainer" containerID="40d3c58441549fd94b4fd06f62f9b9e1bdfe941a93f1f046de6cd048124dc220" Mar 20 08:46:46.453858 master-0 kubenswrapper[18707]: I0320 08:46:46.453703 18707 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" Mar 20 08:46:47.454324 master-0 kubenswrapper[18707]: I0320 08:46:47.454225 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-7vxxr_d88ba8e1-ee42-423f-9839-e71cb0265c6c/config-sync-controllers/0.log" Mar 20 08:46:47.455553 master-0 kubenswrapper[18707]: I0320 08:46:47.455417 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-7vxxr_d88ba8e1-ee42-423f-9839-e71cb0265c6c/cluster-cloud-controller-manager/0.log" Mar 20 08:46:47.455774 master-0 kubenswrapper[18707]: I0320 08:46:47.455707 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-7vxxr" event={"ID":"d88ba8e1-ee42-423f-9839-e71cb0265c6c","Type":"ContainerStarted","Data":"bca14d76146ab0630888ab81e91868cdccfe03f7d8c3901993cbbb10c79ee119"} Mar 20 08:46:48.414381 master-0 kubenswrapper[18707]: I0320 08:46:48.414132 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" Mar 20 08:46:52.605737 master-0 kubenswrapper[18707]: E0320 08:46:52.605645 18707 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 08:46:52.605737 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(7e454bbf635ef1bf6f2b0e78e85135dc4c266557d742057776c7e953ad66d59e): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network "multus-cni-network": plugin type="multus-shim" 
name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7e454bbf635ef1bf6f2b0e78e85135dc4c266557d742057776c7e953ad66d59e" Netns:"/var/run/netns/b764c1f7-e862-40a1-b6b9-db5be997e963" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=7e454bbf635ef1bf6f2b0e78e85135dc4c266557d742057776c7e953ad66d59e;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:46:52.605737 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:46:52.605737 master-0 kubenswrapper[18707]: > Mar 20 08:46:52.606831 master-0 kubenswrapper[18707]: E0320 08:46:52.605757 18707 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 08:46:52.606831 master-0 kubenswrapper[18707]: rpc error: code = Unknown 
desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(7e454bbf635ef1bf6f2b0e78e85135dc4c266557d742057776c7e953ad66d59e): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7e454bbf635ef1bf6f2b0e78e85135dc4c266557d742057776c7e953ad66d59e" Netns:"/var/run/netns/b764c1f7-e862-40a1-b6b9-db5be997e963" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=7e454bbf635ef1bf6f2b0e78e85135dc4c266557d742057776c7e953ad66d59e;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:46:52.606831 master-0 kubenswrapper[18707]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:46:52.606831 master-0 kubenswrapper[18707]: > pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:46:52.606831 master-0 kubenswrapper[18707]: E0320 08:46:52.605796 18707 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 08:46:52.606831 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(7e454bbf635ef1bf6f2b0e78e85135dc4c266557d742057776c7e953ad66d59e): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7e454bbf635ef1bf6f2b0e78e85135dc4c266557d742057776c7e953ad66d59e" Netns:"/var/run/netns/b764c1f7-e862-40a1-b6b9-db5be997e963" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=7e454bbf635ef1bf6f2b0e78e85135dc4c266557d742057776c7e953ad66d59e;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of 
cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:46:52.606831 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:46:52.606831 master-0 kubenswrapper[18707]: > pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:46:52.606831 master-0 kubenswrapper[18707]: E0320 08:46:52.605888 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console(044870dd-540a-402e-84cb-fa1bf3d6a318)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console(044870dd-540a-402e-84cb-fa1bf3d6a318)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(7e454bbf635ef1bf6f2b0e78e85135dc4c266557d742057776c7e953ad66d59e): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:\\\"7e454bbf635ef1bf6f2b0e78e85135dc4c266557d742057776c7e953ad66d59e\\\" Netns:\\\"/var/run/netns/b764c1f7-e862-40a1-b6b9-db5be997e963\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=7e454bbf635ef1bf6f2b0e78e85135dc4c266557d742057776c7e953ad66d59e;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" podUID="044870dd-540a-402e-84cb-fa1bf3d6a318" Mar 20 08:46:53.518897 master-0 kubenswrapper[18707]: I0320 08:46:53.514814 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:46:53.518897 master-0 kubenswrapper[18707]: I0320 08:46:53.515720 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:46:54.527569 master-0 kubenswrapper[18707]: I0320 08:46:54.527484 18707 generic.go:334] "Generic (PLEG): container finished" podID="a69e8d3a-a0b1-4688-8631-d9f265aa4c69" containerID="3fbc2ee15fb564cff30cd6bea616239c5ddcc5c59f79a881b3626f244e98915b" exitCode=0 Mar 20 08:46:54.528393 master-0 kubenswrapper[18707]: I0320 08:46:54.527574 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" event={"ID":"a69e8d3a-a0b1-4688-8631-d9f265aa4c69","Type":"ContainerDied","Data":"3fbc2ee15fb564cff30cd6bea616239c5ddcc5c59f79a881b3626f244e98915b"} Mar 20 08:46:54.983546 master-0 kubenswrapper[18707]: E0320 08:46:54.983052 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 20 08:46:55.387082 master-0 kubenswrapper[18707]: I0320 08:46:55.387020 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:46:55.415018 master-0 kubenswrapper[18707]: I0320 08:46:55.414897 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6p8v\" (UniqueName: \"kubernetes.io/projected/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-kube-api-access-c6p8v\") pod \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " Mar 20 08:46:55.415344 master-0 kubenswrapper[18707]: I0320 08:46:55.415238 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-client-ca-bundle\") pod \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " Mar 20 08:46:55.415428 master-0 kubenswrapper[18707]: I0320 08:46:55.415344 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-audit-log\") pod \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " Mar 20 08:46:55.415521 master-0 kubenswrapper[18707]: I0320 08:46:55.415402 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-metrics-server-audit-profiles\") pod \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " Mar 20 08:46:55.416138 master-0 kubenswrapper[18707]: I0320 08:46:55.416090 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-audit-log" (OuterVolumeSpecName: "audit-log") pod "a69e8d3a-a0b1-4688-8631-d9f265aa4c69" (UID: "a69e8d3a-a0b1-4688-8631-d9f265aa4c69"). InnerVolumeSpecName "audit-log". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:55.416494 master-0 kubenswrapper[18707]: I0320 08:46:55.416430 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-configmap-kubelet-serving-ca-bundle\") pod \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " Mar 20 08:46:55.416699 master-0 kubenswrapper[18707]: I0320 08:46:55.416647 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-client-certs\") pod \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " Mar 20 08:46:55.416790 master-0 kubenswrapper[18707]: I0320 08:46:55.416753 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-server-tls\") pod \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\" (UID: \"a69e8d3a-a0b1-4688-8631-d9f265aa4c69\") " Mar 20 08:46:55.416931 master-0 kubenswrapper[18707]: I0320 08:46:55.416886 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-metrics-server-audit-profiles" (OuterVolumeSpecName: "metrics-server-audit-profiles") pod "a69e8d3a-a0b1-4688-8631-d9f265aa4c69" (UID: "a69e8d3a-a0b1-4688-8631-d9f265aa4c69"). InnerVolumeSpecName "metrics-server-audit-profiles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:46:55.417670 master-0 kubenswrapper[18707]: I0320 08:46:55.417598 18707 reconciler_common.go:293] "Volume detached for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-audit-log\") on node \"master-0\" DevicePath \"\"" Mar 20 08:46:55.417769 master-0 kubenswrapper[18707]: I0320 08:46:55.417697 18707 reconciler_common.go:293] "Volume detached for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-metrics-server-audit-profiles\") on node \"master-0\" DevicePath \"\"" Mar 20 08:46:55.417769 master-0 kubenswrapper[18707]: I0320 08:46:55.417592 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "a69e8d3a-a0b1-4688-8631-d9f265aa4c69" (UID: "a69e8d3a-a0b1-4688-8631-d9f265aa4c69"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:46:55.419282 master-0 kubenswrapper[18707]: I0320 08:46:55.419242 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-kube-api-access-c6p8v" (OuterVolumeSpecName: "kube-api-access-c6p8v") pod "a69e8d3a-a0b1-4688-8631-d9f265aa4c69" (UID: "a69e8d3a-a0b1-4688-8631-d9f265aa4c69"). InnerVolumeSpecName "kube-api-access-c6p8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:55.420423 master-0 kubenswrapper[18707]: I0320 08:46:55.420342 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "a69e8d3a-a0b1-4688-8631-d9f265aa4c69" (UID: "a69e8d3a-a0b1-4688-8631-d9f265aa4c69"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:55.421533 master-0 kubenswrapper[18707]: I0320 08:46:55.421493 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-client-ca-bundle" (OuterVolumeSpecName: "client-ca-bundle") pod "a69e8d3a-a0b1-4688-8631-d9f265aa4c69" (UID: "a69e8d3a-a0b1-4688-8631-d9f265aa4c69"). InnerVolumeSpecName "client-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:55.421627 master-0 kubenswrapper[18707]: I0320 08:46:55.421577 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-server-tls" (OuterVolumeSpecName: "secret-metrics-server-tls") pod "a69e8d3a-a0b1-4688-8631-d9f265aa4c69" (UID: "a69e8d3a-a0b1-4688-8631-d9f265aa4c69"). InnerVolumeSpecName "secret-metrics-server-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:55.462933 master-0 kubenswrapper[18707]: I0320 08:46:55.462856 18707 scope.go:117] "RemoveContainer" containerID="3fbc2ee15fb564cff30cd6bea616239c5ddcc5c59f79a881b3626f244e98915b" Mar 20 08:46:55.519247 master-0 kubenswrapper[18707]: I0320 08:46:55.519169 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6p8v\" (UniqueName: \"kubernetes.io/projected/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-kube-api-access-c6p8v\") on node \"master-0\" DevicePath \"\"" Mar 20 08:46:55.519247 master-0 kubenswrapper[18707]: I0320 08:46:55.519235 18707 reconciler_common.go:293] "Volume detached for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-client-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 08:46:55.519247 master-0 kubenswrapper[18707]: I0320 08:46:55.519247 18707 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 08:46:55.519441 master-0 kubenswrapper[18707]: I0320 08:46:55.519261 18707 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 08:46:55.519441 master-0 kubenswrapper[18707]: I0320 08:46:55.519281 18707 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a69e8d3a-a0b1-4688-8631-d9f265aa4c69-secret-metrics-server-tls\") on node \"master-0\" DevicePath \"\"" Mar 20 08:46:55.544832 master-0 kubenswrapper[18707]: I0320 08:46:55.544744 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" 
event={"ID":"a69e8d3a-a0b1-4688-8631-d9f265aa4c69","Type":"ContainerDied","Data":"9786d6c6139bd8ce0f2d0b4a8dd76068f202fd2531fb86bd7b58ce95ed53f64f"} Mar 20 08:46:55.545600 master-0 kubenswrapper[18707]: I0320 08:46:55.544777 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-64c67d44c4-s7vfs" Mar 20 08:46:55.937602 master-0 kubenswrapper[18707]: E0320 08:46:55.937383 18707 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e8037af960d6b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Killing,Message:Stopping container etcd-rev,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:44:39.887957355 +0000 UTC m=+225.044137791,LastTimestamp:2026-03-20 08:44:39.887957355 +0000 UTC m=+225.044137791,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:46:59.182443 master-0 kubenswrapper[18707]: E0320 08:46:59.182360 18707 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 20 08:46:59.579016 master-0 kubenswrapper[18707]: I0320 08:46:59.578961 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"dc12be87dbd9c4dd20f9d765c06d5e43610309116152e4e35d0d6ab27126c17f"} Mar 20 08:46:59.579391 master-0 kubenswrapper[18707]: I0320 08:46:59.579351 18707 kubelet.go:1909] "Trying to delete 
pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:46:59.579458 master-0 kubenswrapper[18707]: I0320 08:46:59.579400 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:47:00.597453 master-0 kubenswrapper[18707]: I0320 08:47:00.597360 18707 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="dc12be87dbd9c4dd20f9d765c06d5e43610309116152e4e35d0d6ab27126c17f" exitCode=0 Mar 20 08:47:00.597453 master-0 kubenswrapper[18707]: I0320 08:47:00.597442 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"dc12be87dbd9c4dd20f9d765c06d5e43610309116152e4e35d0d6ab27126c17f"} Mar 20 08:47:04.640860 master-0 kubenswrapper[18707]: I0320 08:47:04.640734 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/1.log" Mar 20 08:47:04.641932 master-0 kubenswrapper[18707]: I0320 08:47:04.641899 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/0.log" Mar 20 08:47:04.642022 master-0 kubenswrapper[18707]: I0320 08:47:04.641981 18707 generic.go:334] "Generic (PLEG): container finished" podID="96de6024-e20f-4b52-9294-b330d65e4153" containerID="cdbe05a6f4402cb697000fd0767650f0ef06aefaed19227820f9e4a92d912881" exitCode=1 Mar 20 08:47:04.642090 master-0 kubenswrapper[18707]: I0320 08:47:04.642051 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" 
event={"ID":"96de6024-e20f-4b52-9294-b330d65e4153","Type":"ContainerDied","Data":"cdbe05a6f4402cb697000fd0767650f0ef06aefaed19227820f9e4a92d912881"} Mar 20 08:47:04.642158 master-0 kubenswrapper[18707]: I0320 08:47:04.642127 18707 scope.go:117] "RemoveContainer" containerID="08ddad0132f1249a0926348017f97a2ace0dd09f6ef4f6ff1405a91cd7359f8f" Mar 20 08:47:04.643258 master-0 kubenswrapper[18707]: I0320 08:47:04.643178 18707 scope.go:117] "RemoveContainer" containerID="cdbe05a6f4402cb697000fd0767650f0ef06aefaed19227820f9e4a92d912881" Mar 20 08:47:04.643752 master-0 kubenswrapper[18707]: E0320 08:47:04.643661 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-f44gr_openshift-cluster-storage-operator(96de6024-e20f-4b52-9294-b330d65e4153)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" podUID="96de6024-e20f-4b52-9294-b330d65e4153" Mar 20 08:47:05.654466 master-0 kubenswrapper[18707]: I0320 08:47:05.654355 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/1.log" Mar 20 08:47:05.900567 master-0 kubenswrapper[18707]: E0320 08:47:05.900168 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:46:55Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:46:55Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:46:55Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:46:55Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:6bec6e4ce9b3ff60658829df2f5980cf947458d49b97476cee1ff01ec638d309\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:b259d760a14ca994ed34d4cfc901758f180cc8d1de7f3c427baa68030e06b7c7\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252654287},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketpl
ace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:542b8eae269892bc6f0f9d5ab808afe26119a72e430870d81f59faf93c3f2f18\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d18953a151bb71fe6585017054b939c4062435242069a2601fe668009e6e5087\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223740630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f3038df8df25746bb5095296d4e5740f2356f85c1ed8d43f1b3d281e294826e5\\\"],\\\"sizeBytes\\\":605698193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc1
41ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a746a87b784ea1caa278fd0e012554f9df520b6fff665ea0bc4c83f487fed113\\\"],\\\"sizeBytes\\\":484450894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\
"],\\\"sizeBytes\\\":484187929}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:47:07.676910 master-0 kubenswrapper[18707]: I0320 08:47:07.676680 18707 generic.go:334] "Generic (PLEG): container finished" podID="5e3ddf9e-eeb5-4266-b675-092fd4e27623" containerID="c4a834368b75816e5bf327a50499cbf160883d81fc9ea89519da8bf5870c95aa" exitCode=0 Mar 20 08:47:07.676910 master-0 kubenswrapper[18707]: I0320 08:47:07.676823 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" event={"ID":"5e3ddf9e-eeb5-4266-b675-092fd4e27623","Type":"ContainerDied","Data":"c4a834368b75816e5bf327a50499cbf160883d81fc9ea89519da8bf5870c95aa"} Mar 20 08:47:07.678866 master-0 kubenswrapper[18707]: I0320 08:47:07.678785 18707 scope.go:117] "RemoveContainer" containerID="c4a834368b75816e5bf327a50499cbf160883d81fc9ea89519da8bf5870c95aa" Mar 20 08:47:07.681234 master-0 kubenswrapper[18707]: I0320 08:47:07.681124 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-dv6cd_f53bc282-5937-49ac-ac98-2ee37ccb268d/cluster-baremetal-operator/0.log" Mar 20 08:47:07.681389 master-0 kubenswrapper[18707]: I0320 08:47:07.681264 18707 generic.go:334] "Generic (PLEG): container finished" podID="f53bc282-5937-49ac-ac98-2ee37ccb268d" containerID="84b569249d8110b2d1ae9f2ad68f8d09c1249f9b0523e1c444309cb25f725be7" exitCode=1 Mar 20 08:47:07.681389 master-0 kubenswrapper[18707]: I0320 08:47:07.681311 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" event={"ID":"f53bc282-5937-49ac-ac98-2ee37ccb268d","Type":"ContainerDied","Data":"84b569249d8110b2d1ae9f2ad68f8d09c1249f9b0523e1c444309cb25f725be7"} Mar 20 08:47:07.681984 master-0 kubenswrapper[18707]: 
I0320 08:47:07.681861 18707 scope.go:117] "RemoveContainer" containerID="84b569249d8110b2d1ae9f2ad68f8d09c1249f9b0523e1c444309cb25f725be7" Mar 20 08:47:08.694726 master-0 kubenswrapper[18707]: I0320 08:47:08.694397 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-z2zpj" event={"ID":"5e3ddf9e-eeb5-4266-b675-092fd4e27623","Type":"ContainerStarted","Data":"eb32f37df19836db9c9b2e7e28149ccbb2173aa339df4afe1492f2baa3a15aa8"} Mar 20 08:47:08.698766 master-0 kubenswrapper[18707]: I0320 08:47:08.698678 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-dv6cd_f53bc282-5937-49ac-ac98-2ee37ccb268d/cluster-baremetal-operator/0.log" Mar 20 08:47:08.698950 master-0 kubenswrapper[18707]: I0320 08:47:08.698794 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" event={"ID":"f53bc282-5937-49ac-ac98-2ee37ccb268d","Type":"ContainerStarted","Data":"3280012a08ca163e191750f1532fb90a38637168dc635f7e9ff14ee2e4d0ad50"} Mar 20 08:47:09.709580 master-0 kubenswrapper[18707]: I0320 08:47:09.709411 18707 generic.go:334] "Generic (PLEG): container finished" podID="6d2e841b-2070-42a9-b9c1-74411ddebee4" containerID="9bde44a84ae0d5d3ed2a7201c7b97eb3264198bec510b101380ad7b4c98aa7b7" exitCode=0 Mar 20 08:47:09.709580 master-0 kubenswrapper[18707]: I0320 08:47:09.709488 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" event={"ID":"6d2e841b-2070-42a9-b9c1-74411ddebee4","Type":"ContainerDied","Data":"9bde44a84ae0d5d3ed2a7201c7b97eb3264198bec510b101380ad7b4c98aa7b7"} Mar 20 08:47:09.710355 master-0 kubenswrapper[18707]: I0320 08:47:09.710109 18707 scope.go:117] "RemoveContainer" containerID="9bde44a84ae0d5d3ed2a7201c7b97eb3264198bec510b101380ad7b4c98aa7b7" Mar 20 08:47:10.725499 master-0 kubenswrapper[18707]: 
I0320 08:47:10.725386 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" event={"ID":"6d2e841b-2070-42a9-b9c1-74411ddebee4","Type":"ContainerStarted","Data":"8f35773cae8c100cade7a54b2f4dc5eb84320a3ff7700d25bd4a21c072f15f3e"} Mar 20 08:47:10.726689 master-0 kubenswrapper[18707]: I0320 08:47:10.725888 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:47:10.731251 master-0 kubenswrapper[18707]: I0320 08:47:10.731141 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bfcf4df58-l6xz7" Mar 20 08:47:11.985180 master-0 kubenswrapper[18707]: E0320 08:47:11.985064 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 20 08:47:12.863256 master-0 kubenswrapper[18707]: I0320 08:47:12.861113 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:47:12.863256 master-0 kubenswrapper[18707]: E0320 08:47:12.862068 18707 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:47:12.863256 master-0 kubenswrapper[18707]: E0320 08:47:12.862117 18707 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object 
"openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:47:12.863256 master-0 kubenswrapper[18707]: E0320 08:47:12.862254 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access podName:d245e5b2-a30d-45c8-9b79-6e8096765c14 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:14.862168644 +0000 UTC m=+500.018349010 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access") pod "installer-3-master-0" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:47:15.772278 master-0 kubenswrapper[18707]: I0320 08:47:15.772210 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5c6485487f-qb94j_ae39c09b-7aef-4615-8ced-0dcad39f23a5/machine-approver-controller/0.log" Mar 20 08:47:15.773248 master-0 kubenswrapper[18707]: I0320 08:47:15.773173 18707 generic.go:334] "Generic (PLEG): container finished" podID="ae39c09b-7aef-4615-8ced-0dcad39f23a5" containerID="0e60f2693cbc96c33931a792326fb808ba028038939cac58b0b52b50bec85ee7" exitCode=255 Mar 20 08:47:15.773372 master-0 kubenswrapper[18707]: I0320 08:47:15.773254 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" event={"ID":"ae39c09b-7aef-4615-8ced-0dcad39f23a5","Type":"ContainerDied","Data":"0e60f2693cbc96c33931a792326fb808ba028038939cac58b0b52b50bec85ee7"} Mar 20 08:47:15.774338 master-0 kubenswrapper[18707]: I0320 08:47:15.774322 18707 scope.go:117] "RemoveContainer" containerID="0e60f2693cbc96c33931a792326fb808ba028038939cac58b0b52b50bec85ee7" Mar 20 08:47:15.902362 master-0 kubenswrapper[18707]: E0320 08:47:15.901944 18707 kubelet_node_status.go:585] "Error updating node status, will 
retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:47:16.790888 master-0 kubenswrapper[18707]: I0320 08:47:16.790793 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5c6485487f-qb94j_ae39c09b-7aef-4615-8ced-0dcad39f23a5/machine-approver-controller/0.log" Mar 20 08:47:16.792024 master-0 kubenswrapper[18707]: I0320 08:47:16.791436 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-qb94j" event={"ID":"ae39c09b-7aef-4615-8ced-0dcad39f23a5","Type":"ContainerStarted","Data":"8ddef4a9495b149331b247ecd9c42e1102d8a861af73ef90a38edd5f285b6fee"} Mar 20 08:47:17.806810 master-0 kubenswrapper[18707]: I0320 08:47:17.806708 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8dd3d3608fe9c86b0f65904ec2353df4/kube-scheduler/0.log" Mar 20 08:47:17.807735 master-0 kubenswrapper[18707]: I0320 08:47:17.807341 18707 generic.go:334] "Generic (PLEG): container finished" podID="8dd3d3608fe9c86b0f65904ec2353df4" containerID="5aef46f74824b7bd8319047c24bacbb9cbf6ac782ef810c2be78f3961d31d75e" exitCode=1 Mar 20 08:47:17.807735 master-0 kubenswrapper[18707]: I0320 08:47:17.807397 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerDied","Data":"5aef46f74824b7bd8319047c24bacbb9cbf6ac782ef810c2be78f3961d31d75e"} Mar 20 08:47:17.808536 master-0 kubenswrapper[18707]: I0320 08:47:17.808481 18707 scope.go:117] "RemoveContainer" containerID="5aef46f74824b7bd8319047c24bacbb9cbf6ac782ef810c2be78f3961d31d75e" Mar 20 08:47:18.821157 master-0 kubenswrapper[18707]: I0320 08:47:18.821072 18707 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8dd3d3608fe9c86b0f65904ec2353df4/kube-scheduler/0.log" Mar 20 08:47:18.822247 master-0 kubenswrapper[18707]: I0320 08:47:18.821843 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"90e686e94f5392e75756edfe996a2d225a8ede8bd72124fcf1876da1d3244223"} Mar 20 08:47:18.822494 master-0 kubenswrapper[18707]: I0320 08:47:18.822442 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:47:19.830360 master-0 kubenswrapper[18707]: I0320 08:47:19.830293 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8dd3d3608fe9c86b0f65904ec2353df4/kube-scheduler/0.log" Mar 20 08:47:19.831001 master-0 kubenswrapper[18707]: I0320 08:47:19.830630 18707 generic.go:334] "Generic (PLEG): container finished" podID="8dd3d3608fe9c86b0f65904ec2353df4" containerID="3c189090fb625d43e4a0aad0248864aa122ec54e7ab96d232c95dbcba79fcc95" exitCode=0 Mar 20 08:47:19.831001 master-0 kubenswrapper[18707]: I0320 08:47:19.830663 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerDied","Data":"3c189090fb625d43e4a0aad0248864aa122ec54e7ab96d232c95dbcba79fcc95"} Mar 20 08:47:19.831316 master-0 kubenswrapper[18707]: I0320 08:47:19.831282 18707 scope.go:117] "RemoveContainer" containerID="3c189090fb625d43e4a0aad0248864aa122ec54e7ab96d232c95dbcba79fcc95" Mar 20 08:47:20.094388 master-0 kubenswrapper[18707]: I0320 08:47:20.094222 18707 scope.go:117] "RemoveContainer" containerID="cdbe05a6f4402cb697000fd0767650f0ef06aefaed19227820f9e4a92d912881" Mar 20 08:47:20.845085 master-0 
kubenswrapper[18707]: I0320 08:47:20.844997 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log" Mar 20 08:47:20.847413 master-0 kubenswrapper[18707]: I0320 08:47:20.847333 18707 generic.go:334] "Generic (PLEG): container finished" podID="2028761b8522f874dcebf13c4683d033" containerID="df731fba572a5e906cf5649aa4cc14b16e25717d1ba7fd8e7dc02e7ea0aa85be" exitCode=0 Mar 20 08:47:20.847571 master-0 kubenswrapper[18707]: I0320 08:47:20.847466 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerDied","Data":"df731fba572a5e906cf5649aa4cc14b16e25717d1ba7fd8e7dc02e7ea0aa85be"} Mar 20 08:47:20.848819 master-0 kubenswrapper[18707]: I0320 08:47:20.848780 18707 scope.go:117] "RemoveContainer" containerID="df731fba572a5e906cf5649aa4cc14b16e25717d1ba7fd8e7dc02e7ea0aa85be" Mar 20 08:47:20.853321 master-0 kubenswrapper[18707]: I0320 08:47:20.853270 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8dd3d3608fe9c86b0f65904ec2353df4/kube-scheduler/0.log" Mar 20 08:47:20.854208 master-0 kubenswrapper[18707]: I0320 08:47:20.854146 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8dd3d3608fe9c86b0f65904ec2353df4","Type":"ContainerStarted","Data":"21af743d9ba623a159aff49736a2e159bcf30ef8644fe5c3ed418758349b2110"} Mar 20 08:47:20.858103 master-0 kubenswrapper[18707]: I0320 08:47:20.858035 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/1.log" Mar 20 08:47:20.858200 master-0 kubenswrapper[18707]: I0320 
08:47:20.858141 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" event={"ID":"96de6024-e20f-4b52-9294-b330d65e4153","Type":"ContainerStarted","Data":"b44a311658d31e89e841b17c8b5c7ac065996f840c2553c9ed5b0e7c103c6d53"} Mar 20 08:47:21.870537 master-0 kubenswrapper[18707]: I0320 08:47:21.870476 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log" Mar 20 08:47:21.874386 master-0 kubenswrapper[18707]: I0320 08:47:21.874290 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"4b5d2ccc50a346e3fb1af695a75046f7d3a06154bf3882375c6b72feec851179"} Mar 20 08:47:25.257492 master-0 kubenswrapper[18707]: I0320 08:47:25.257383 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:47:25.532697 master-0 kubenswrapper[18707]: I0320 08:47:25.532469 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:47:25.903017 master-0 kubenswrapper[18707]: E0320 08:47:25.902921 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:47:25.928900 master-0 kubenswrapper[18707]: I0320 08:47:25.928804 18707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-7t5qv_e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e/control-plane-machine-set-operator/0.log" Mar 20 08:47:25.928900 master-0 kubenswrapper[18707]: I0320 08:47:25.928892 18707 generic.go:334] "Generic (PLEG): container finished" podID="e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e" containerID="322d73e8b151dad5452501bac1f7dfab899c0c317c5ec70fec1dfe654509113e" exitCode=1 Mar 20 08:47:25.929414 master-0 kubenswrapper[18707]: I0320 08:47:25.928951 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" event={"ID":"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e","Type":"ContainerDied","Data":"322d73e8b151dad5452501bac1f7dfab899c0c317c5ec70fec1dfe654509113e"} Mar 20 08:47:25.930474 master-0 kubenswrapper[18707]: I0320 08:47:25.930420 18707 scope.go:117] "RemoveContainer" containerID="322d73e8b151dad5452501bac1f7dfab899c0c317c5ec70fec1dfe654509113e" Mar 20 08:47:26.944276 master-0 kubenswrapper[18707]: I0320 08:47:26.944155 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-7t5qv_e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e/control-plane-machine-set-operator/0.log" Mar 20 08:47:26.945147 master-0 kubenswrapper[18707]: I0320 08:47:26.944309 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-7t5qv" event={"ID":"e9e912c1-a2d4-4d78-98c1-ea3b232ddd7e","Type":"ContainerStarted","Data":"f37851a25ef42464de23ed4647241a17980786083b49fee70a6360e2172ba84e"} Mar 20 08:47:27.952428 master-0 kubenswrapper[18707]: E0320 08:47:27.952297 18707 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 08:47:27.952428 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(a7d9b28db0ac22a37d423a7ed15e1230abf1c849668396dd25061a938e1b3f30): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a7d9b28db0ac22a37d423a7ed15e1230abf1c849668396dd25061a938e1b3f30" Netns:"/var/run/netns/0076a471-cb8c-4a76-9e34-57408445194b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=a7d9b28db0ac22a37d423a7ed15e1230abf1c849668396dd25061a938e1b3f30;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552" Path:"" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:47:27.952428 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:47:27.952428 
master-0 kubenswrapper[18707]: > Mar 20 08:47:27.953234 master-0 kubenswrapper[18707]: E0320 08:47:27.952463 18707 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 08:47:27.953234 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(a7d9b28db0ac22a37d423a7ed15e1230abf1c849668396dd25061a938e1b3f30): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a7d9b28db0ac22a37d423a7ed15e1230abf1c849668396dd25061a938e1b3f30" Netns:"/var/run/netns/0076a471-cb8c-4a76-9e34-57408445194b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=a7d9b28db0ac22a37d423a7ed15e1230abf1c849668396dd25061a938e1b3f30;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552" Path:"" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:47:27.953234 master-0 kubenswrapper[18707]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:47:27.953234 master-0 kubenswrapper[18707]: > pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:47:27.953234 master-0 kubenswrapper[18707]: E0320 08:47:27.952503 18707 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 08:47:27.953234 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(a7d9b28db0ac22a37d423a7ed15e1230abf1c849668396dd25061a938e1b3f30): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a7d9b28db0ac22a37d423a7ed15e1230abf1c849668396dd25061a938e1b3f30" Netns:"/var/run/netns/0076a471-cb8c-4a76-9e34-57408445194b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=a7d9b28db0ac22a37d423a7ed15e1230abf1c849668396dd25061a938e1b3f30;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552" Path:"" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod 
console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:47:27.953234 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:47:27.953234 master-0 kubenswrapper[18707]: > pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:47:27.953234 master-0 kubenswrapper[18707]: E0320 08:47:27.952615 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"console-operator-76b6568d85-8b8gv_openshift-console-operator(348f3880-793f-43e4-9de1-8511626d2552)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"console-operator-76b6568d85-8b8gv_openshift-console-operator(348f3880-793f-43e4-9de1-8511626d2552)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(a7d9b28db0ac22a37d423a7ed15e1230abf1c849668396dd25061a938e1b3f30): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"a7d9b28db0ac22a37d423a7ed15e1230abf1c849668396dd25061a938e1b3f30\\\" Netns:\\\"/var/run/netns/0076a471-cb8c-4a76-9e34-57408445194b\\\" IfName:\\\"eth0\\\" 
Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=a7d9b28db0ac22a37d423a7ed15e1230abf1c849668396dd25061a938e1b3f30;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" podUID="348f3880-793f-43e4-9de1-8511626d2552" Mar 20 08:47:28.532761 master-0 kubenswrapper[18707]: I0320 08:47:28.532616 18707 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers)" start-of-body= Mar 20 08:47:28.532761 master-0 kubenswrapper[18707]: I0320 08:47:28.532732 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:47:28.960898 master-0 kubenswrapper[18707]: I0320 08:47:28.960793 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:47:28.961901 master-0 kubenswrapper[18707]: I0320 08:47:28.961693 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:47:28.986847 master-0 kubenswrapper[18707]: E0320 08:47:28.986773 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 20 08:47:29.941209 master-0 kubenswrapper[18707]: E0320 08:47:29.940995 18707 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{alertmanager-main-0.189e8037b60b5a59 openshift-monitoring 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-monitoring,Name:alertmanager-main-0,UID:14d29bfa-a0cf-43bd-a3b8-052c1a224fc9,APIVersion:v1,ResourceVersion:13929,FieldPath:spec.containers{alertmanager},},Reason:Pulling,Message:Pulling image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0be5d73579621976f063d98db555f3bceee2f5a91b14422481ce30561438712c\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:44:39.996308057 +0000 UTC m=+225.152488413,LastTimestamp:2026-03-20 08:44:39.996308057 +0000 UTC m=+225.152488413,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:47:33.582771 master-0 kubenswrapper[18707]: E0320 08:47:33.582644 18707 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 20 08:47:34.009014 master-0 kubenswrapper[18707]: I0320 08:47:34.008830 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:47:34.009014 master-0 kubenswrapper[18707]: I0320 08:47:34.008889 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:47:35.904565 master-0 kubenswrapper[18707]: E0320 08:47:35.904426 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:47:38.533600 master-0 kubenswrapper[18707]: I0320 08:47:38.533482 18707 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:47:38.534621 master-0 kubenswrapper[18707]: I0320 08:47:38.533613 
18707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:47:39.998907 master-0 kubenswrapper[18707]: I0320 08:47:39.998792 18707 status_manager.go:851] "Failed to get status for pod" podUID="4fea9b05-222e-4b58-95c8-735fc1cf3a8b" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-74mgr" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods catalogd-controller-manager-6864dc98f7-74mgr)" Mar 20 08:47:45.905476 master-0 kubenswrapper[18707]: E0320 08:47:45.905294 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:47:45.905476 master-0 kubenswrapper[18707]: E0320 08:47:45.905360 18707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 08:47:45.988441 master-0 kubenswrapper[18707]: E0320 08:47:45.988300 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 20 08:47:48.532517 master-0 kubenswrapper[18707]: I0320 08:47:48.532408 18707 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:47:48.533329 master-0 kubenswrapper[18707]: I0320 08:47:48.532518 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:47:48.533329 master-0 kubenswrapper[18707]: I0320 08:47:48.532621 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:47:48.533654 master-0 kubenswrapper[18707]: I0320 08:47:48.533593 18707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"4b5d2ccc50a346e3fb1af695a75046f7d3a06154bf3882375c6b72feec851179"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 08:47:48.533791 master-0 kubenswrapper[18707]: I0320 08:47:48.533745 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" containerID="cri-o://4b5d2ccc50a346e3fb1af695a75046f7d3a06154bf3882375c6b72feec851179" gracePeriod=30 Mar 20 08:47:48.665095 master-0 kubenswrapper[18707]: E0320 08:47:48.665006 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2028761b8522f874dcebf13c4683d033.slice/crio-conmon-4b5d2ccc50a346e3fb1af695a75046f7d3a06154bf3882375c6b72feec851179.scope\": RecentStats: unable to find data in memory cache]" Mar 20 08:47:49.167346 master-0 kubenswrapper[18707]: I0320 08:47:49.167106 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/cluster-policy-controller/1.log" Mar 20 08:47:49.169829 master-0 kubenswrapper[18707]: I0320 08:47:49.169748 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log" Mar 20 08:47:49.172014 master-0 kubenswrapper[18707]: I0320 08:47:49.171902 18707 generic.go:334] "Generic (PLEG): container finished" podID="2028761b8522f874dcebf13c4683d033" containerID="4b5d2ccc50a346e3fb1af695a75046f7d3a06154bf3882375c6b72feec851179" exitCode=255 Mar 20 08:47:49.172014 master-0 kubenswrapper[18707]: I0320 08:47:49.171987 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerDied","Data":"4b5d2ccc50a346e3fb1af695a75046f7d3a06154bf3882375c6b72feec851179"} Mar 20 08:47:49.172362 master-0 kubenswrapper[18707]: I0320 08:47:49.172034 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"afeb635d749a5fd21753eee3cf6c962d519e76c6be8c83d442ea22a59f925a3a"} Mar 20 08:47:49.172362 master-0 kubenswrapper[18707]: I0320 08:47:49.172069 18707 scope.go:117] "RemoveContainer" containerID="df731fba572a5e906cf5649aa4cc14b16e25717d1ba7fd8e7dc02e7ea0aa85be" Mar 20 08:47:50.187732 master-0 
kubenswrapper[18707]: I0320 08:47:50.187639 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/cluster-policy-controller/1.log" Mar 20 08:47:50.189913 master-0 kubenswrapper[18707]: I0320 08:47:50.189848 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log" Mar 20 08:47:51.206344 master-0 kubenswrapper[18707]: I0320 08:47:51.206242 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/2.log" Mar 20 08:47:51.207479 master-0 kubenswrapper[18707]: I0320 08:47:51.207293 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/1.log" Mar 20 08:47:51.207479 master-0 kubenswrapper[18707]: I0320 08:47:51.207363 18707 generic.go:334] "Generic (PLEG): container finished" podID="96de6024-e20f-4b52-9294-b330d65e4153" containerID="b44a311658d31e89e841b17c8b5c7ac065996f840c2553c9ed5b0e7c103c6d53" exitCode=1 Mar 20 08:47:51.207625 master-0 kubenswrapper[18707]: I0320 08:47:51.207429 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" event={"ID":"96de6024-e20f-4b52-9294-b330d65e4153","Type":"ContainerDied","Data":"b44a311658d31e89e841b17c8b5c7ac065996f840c2553c9ed5b0e7c103c6d53"} Mar 20 08:47:51.207625 master-0 kubenswrapper[18707]: I0320 08:47:51.207591 18707 scope.go:117] "RemoveContainer" containerID="cdbe05a6f4402cb697000fd0767650f0ef06aefaed19227820f9e4a92d912881" Mar 20 08:47:51.208630 master-0 kubenswrapper[18707]: I0320 08:47:51.208556 18707 scope.go:117] 
"RemoveContainer" containerID="b44a311658d31e89e841b17c8b5c7ac065996f840c2553c9ed5b0e7c103c6d53" Mar 20 08:47:51.209136 master-0 kubenswrapper[18707]: E0320 08:47:51.209065 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-f44gr_openshift-cluster-storage-operator(96de6024-e20f-4b52-9294-b330d65e4153)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" podUID="96de6024-e20f-4b52-9294-b330d65e4153" Mar 20 08:47:52.221892 master-0 kubenswrapper[18707]: I0320 08:47:52.221796 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/2.log" Mar 20 08:47:54.259914 master-0 kubenswrapper[18707]: E0320 08:47:54.259818 18707 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 08:47:54.259914 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(4380c2e62b9a68692ac36325cb36833c81d190dc424789b028acbfd6a05362c5): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4380c2e62b9a68692ac36325cb36833c81d190dc424789b028acbfd6a05362c5" Netns:"/var/run/netns/65006278-e165-47c3-b835-e4c438dca515" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=4380c2e62b9a68692ac36325cb36833c81d190dc424789b028acbfd6a05362c5;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:47:54.259914 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:47:54.259914 master-0 kubenswrapper[18707]: > Mar 20 08:47:54.260675 master-0 kubenswrapper[18707]: E0320 08:47:54.259966 18707 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 08:47:54.260675 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(4380c2e62b9a68692ac36325cb36833c81d190dc424789b028acbfd6a05362c5): error adding pod 
openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4380c2e62b9a68692ac36325cb36833c81d190dc424789b028acbfd6a05362c5" Netns:"/var/run/netns/65006278-e165-47c3-b835-e4c438dca515" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=4380c2e62b9a68692ac36325cb36833c81d190dc424789b028acbfd6a05362c5;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:47:54.260675 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:47:54.260675 master-0 kubenswrapper[18707]: > pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:47:54.260675 
master-0 kubenswrapper[18707]: E0320 08:47:54.260006 18707 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 08:47:54.260675 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(4380c2e62b9a68692ac36325cb36833c81d190dc424789b028acbfd6a05362c5): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4380c2e62b9a68692ac36325cb36833c81d190dc424789b028acbfd6a05362c5" Netns:"/var/run/netns/65006278-e165-47c3-b835-e4c438dca515" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=4380c2e62b9a68692ac36325cb36833c81d190dc424789b028acbfd6a05362c5;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:47:54.260675 master-0 kubenswrapper[18707]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:47:54.260675 master-0 kubenswrapper[18707]: > pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:47:54.260675 master-0 kubenswrapper[18707]: E0320 08:47:54.260134 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console(044870dd-540a-402e-84cb-fa1bf3d6a318)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console(044870dd-540a-402e-84cb-fa1bf3d6a318)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(4380c2e62b9a68692ac36325cb36833c81d190dc424789b028acbfd6a05362c5): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"4380c2e62b9a68692ac36325cb36833c81d190dc424789b028acbfd6a05362c5\\\" Netns:\\\"/var/run/netns/65006278-e165-47c3-b835-e4c438dca515\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=4380c2e62b9a68692ac36325cb36833c81d190dc424789b028acbfd6a05362c5;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318\\\" Path:\\\"\\\" ERRORED: error configuring pod 
[openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" podUID="044870dd-540a-402e-84cb-fa1bf3d6a318" Mar 20 08:47:55.249434 master-0 kubenswrapper[18707]: I0320 08:47:55.249336 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:47:55.249927 master-0 kubenswrapper[18707]: I0320 08:47:55.249871 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:47:55.257587 master-0 kubenswrapper[18707]: I0320 08:47:55.257516 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:47:55.532434 master-0 kubenswrapper[18707]: I0320 08:47:55.532230 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:47:58.532714 master-0 kubenswrapper[18707]: I0320 08:47:58.532623 18707 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:47:58.533557 master-0 kubenswrapper[18707]: I0320 08:47:58.532728 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:02.990566 master-0 kubenswrapper[18707]: E0320 08:48:02.990403 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 20 08:48:03.944109 master-0 kubenswrapper[18707]: E0320 08:48:03.943903 18707 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" 
event="&Event{ObjectMeta:{prometheus-k8s-0.189e8037b63a3adb openshift-monitoring 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-monitoring,Name:prometheus-k8s-0,UID:77ddbb16-b96e-4717-9786-2feae0d0cc3f,APIVersion:v1,ResourceVersion:13978,FieldPath:spec.containers{prometheus},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f3038df8df25746bb5095296d4e5740f2356f85c1ed8d43f1b3d281e294826e5\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:44:39.999380187 +0000 UTC m=+225.155560543,LastTimestamp:2026-03-20 08:44:39.999380187 +0000 UTC m=+225.155560543,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:48:04.093986 master-0 kubenswrapper[18707]: I0320 08:48:04.093921 18707 scope.go:117] "RemoveContainer" containerID="b44a311658d31e89e841b17c8b5c7ac065996f840c2553c9ed5b0e7c103c6d53" Mar 20 08:48:04.094598 master-0 kubenswrapper[18707]: E0320 08:48:04.094356 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-f44gr_openshift-cluster-storage-operator(96de6024-e20f-4b52-9294-b330d65e4153)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" podUID="96de6024-e20f-4b52-9294-b330d65e4153" Mar 20 08:48:05.913923 master-0 kubenswrapper[18707]: I0320 08:48:05.913819 18707 patch_prober.go:28] interesting pod/openshift-kube-scheduler-master-0 container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 
08:48:05.914650 master-0 kubenswrapper[18707]: I0320 08:48:05.913939 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:06.007430 master-0 kubenswrapper[18707]: E0320 08:48:06.007276 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:47:56Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:47:56Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:47:56Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:47:56Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:6bec6e4ce9b3ff60658829df2f5980cf947458d49b97476cee1ff01ec638d309\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:b259d760a14ca994ed34d4cfc901758f180cc8
d1de7f3c427baa68030e06b7c7\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252654287},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:542b8eae269892bc6f0f9d5ab808afe26119a72e430870d81f59faf93c3f2f18\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d18953a151bb71fe6585017054b939c4062435242069a2601fe668009e6e5087\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223740630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c006
6e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f3038df8df25746bb5095296d4e5740f2356f85c1ed8d43f1b3d281e294826e5\\\"],\\\"sizeBytes\\\":605698193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a4351659
1987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a746a87b784ea1caa278fd0e012554f9df520b6fff665ea0bc4c83f487fed113\\\"],\\\"sizeBytes\\\":484450894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\"],\\\"sizeBytes\\\":484187929}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:08.012169 master-0 kubenswrapper[18707]: E0320 08:48:08.012028 18707 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 20 08:48:08.376582 master-0 kubenswrapper[18707]: I0320 08:48:08.376498 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"8137ebf2f4c5b34c4a57daa58d2fa54d58d03ff5ae4549800f3702ecf576f346"} Mar 20 08:48:08.378890 master-0 kubenswrapper[18707]: I0320 08:48:08.378821 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-dv6cd_f53bc282-5937-49ac-ac98-2ee37ccb268d/cluster-baremetal-operator/1.log" Mar 20 08:48:08.379852 master-0 kubenswrapper[18707]: I0320 08:48:08.379797 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-dv6cd_f53bc282-5937-49ac-ac98-2ee37ccb268d/cluster-baremetal-operator/0.log" Mar 20 
08:48:08.379993 master-0 kubenswrapper[18707]: I0320 08:48:08.379850 18707 generic.go:334] "Generic (PLEG): container finished" podID="f53bc282-5937-49ac-ac98-2ee37ccb268d" containerID="3280012a08ca163e191750f1532fb90a38637168dc635f7e9ff14ee2e4d0ad50" exitCode=1 Mar 20 08:48:08.379993 master-0 kubenswrapper[18707]: I0320 08:48:08.379893 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" event={"ID":"f53bc282-5937-49ac-ac98-2ee37ccb268d","Type":"ContainerDied","Data":"3280012a08ca163e191750f1532fb90a38637168dc635f7e9ff14ee2e4d0ad50"} Mar 20 08:48:08.379993 master-0 kubenswrapper[18707]: I0320 08:48:08.379942 18707 scope.go:117] "RemoveContainer" containerID="84b569249d8110b2d1ae9f2ad68f8d09c1249f9b0523e1c444309cb25f725be7" Mar 20 08:48:08.380713 master-0 kubenswrapper[18707]: I0320 08:48:08.380649 18707 scope.go:117] "RemoveContainer" containerID="3280012a08ca163e191750f1532fb90a38637168dc635f7e9ff14ee2e4d0ad50" Mar 20 08:48:08.381000 master-0 kubenswrapper[18707]: E0320 08:48:08.380936 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-6f69995874-dv6cd_openshift-machine-api(f53bc282-5937-49ac-ac98-2ee37ccb268d)\"" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" podUID="f53bc282-5937-49ac-ac98-2ee37ccb268d" Mar 20 08:48:08.533125 master-0 kubenswrapper[18707]: I0320 08:48:08.532921 18707 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:48:08.533125 master-0 
kubenswrapper[18707]: I0320 08:48:08.533044 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:09.395761 master-0 kubenswrapper[18707]: I0320 08:48:09.395707 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"9fced383a1d773417df97ccb8d1b7804bc9866abf256393dd43cbf7100f3860c"} Mar 20 08:48:09.396387 master-0 kubenswrapper[18707]: I0320 08:48:09.396369 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"e691bfeb2bda4df299e442db938ea5344b5ecc57104d3cfa32390f0cbe48ab4c"} Mar 20 08:48:09.398546 master-0 kubenswrapper[18707]: I0320 08:48:09.398528 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-dv6cd_f53bc282-5937-49ac-ac98-2ee37ccb268d/cluster-baremetal-operator/1.log" Mar 20 08:48:09.852554 master-0 kubenswrapper[18707]: I0320 08:48:09.852431 18707 patch_prober.go:28] interesting pod/openshift-kube-scheduler-master-0 container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:48:09.852973 master-0 kubenswrapper[18707]: I0320 08:48:09.852552 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:10.418031 master-0 kubenswrapper[18707]: I0320 08:48:10.417934 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"5e1a69b9a28ef3e7dcf061b5c9fc06e493d8e7ab8e570d9935c3452a58dd1598"} Mar 20 08:48:10.418031 master-0 kubenswrapper[18707]: I0320 08:48:10.418004 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"6b926ccb585fdf265c65079d967421e077ba9577a80370c93037cf232dd6ac6c"} Mar 20 08:48:10.419237 master-0 kubenswrapper[18707]: I0320 08:48:10.418525 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:48:10.419237 master-0 kubenswrapper[18707]: I0320 08:48:10.418568 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:48:15.125780 master-0 kubenswrapper[18707]: I0320 08:48:15.125684 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 20 08:48:15.916727 master-0 kubenswrapper[18707]: I0320 08:48:15.916594 18707 patch_prober.go:28] interesting pod/openshift-kube-scheduler-master-0 container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:48:15.917081 master-0 kubenswrapper[18707]: I0320 08:48:15.916745 18707 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8dd3d3608fe9c86b0f65904ec2353df4" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:16.008299 master-0 kubenswrapper[18707]: E0320 08:48:16.008102 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:17.096064 master-0 kubenswrapper[18707]: I0320 08:48:17.095410 18707 scope.go:117] "RemoveContainer" containerID="b44a311658d31e89e841b17c8b5c7ac065996f840c2553c9ed5b0e7c103c6d53" Mar 20 08:48:17.487524 master-0 kubenswrapper[18707]: I0320 08:48:17.487333 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/2.log" Mar 20 08:48:17.487524 master-0 kubenswrapper[18707]: I0320 08:48:17.487439 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" event={"ID":"96de6024-e20f-4b52-9294-b330d65e4153","Type":"ContainerStarted","Data":"dc430f466db39347eddd50d62cf7d30007ddcbada88ac7de4c13d2e1a985e1ed"} Mar 20 08:48:18.533058 master-0 kubenswrapper[18707]: I0320 08:48:18.532912 18707 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:48:18.533922 master-0 
kubenswrapper[18707]: I0320 08:48:18.533878 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:18.534077 master-0 kubenswrapper[18707]: I0320 08:48:18.534060 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:48:18.535056 master-0 kubenswrapper[18707]: I0320 08:48:18.535030 18707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"afeb635d749a5fd21753eee3cf6c962d519e76c6be8c83d442ea22a59f925a3a"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 08:48:18.535303 master-0 kubenswrapper[18707]: I0320 08:48:18.535276 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" containerID="cri-o://afeb635d749a5fd21753eee3cf6c962d519e76c6be8c83d442ea22a59f925a3a" gracePeriod=30 Mar 20 08:48:18.859749 master-0 kubenswrapper[18707]: I0320 08:48:18.859695 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 20 08:48:19.516916 master-0 kubenswrapper[18707]: I0320 08:48:19.516858 18707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/cluster-policy-controller/2.log" Mar 20 08:48:19.517698 master-0 kubenswrapper[18707]: I0320 08:48:19.517652 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/cluster-policy-controller/1.log" Mar 20 08:48:19.518986 master-0 kubenswrapper[18707]: I0320 08:48:19.518941 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log" Mar 20 08:48:19.519683 master-0 kubenswrapper[18707]: I0320 08:48:19.519632 18707 generic.go:334] "Generic (PLEG): container finished" podID="2028761b8522f874dcebf13c4683d033" containerID="afeb635d749a5fd21753eee3cf6c962d519e76c6be8c83d442ea22a59f925a3a" exitCode=255 Mar 20 08:48:19.519790 master-0 kubenswrapper[18707]: I0320 08:48:19.519683 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerDied","Data":"afeb635d749a5fd21753eee3cf6c962d519e76c6be8c83d442ea22a59f925a3a"} Mar 20 08:48:19.519790 master-0 kubenswrapper[18707]: I0320 08:48:19.519725 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"1a645c4dd6894d87bc5827ce1df71595038461daa2bda7238648a01805dbab0e"} Mar 20 08:48:19.519790 master-0 kubenswrapper[18707]: I0320 08:48:19.519762 18707 scope.go:117] "RemoveContainer" containerID="4b5d2ccc50a346e3fb1af695a75046f7d3a06154bf3882375c6b72feec851179" Mar 20 08:48:19.991534 master-0 kubenswrapper[18707]: E0320 08:48:19.991451 18707 controller.go:145] "Failed to ensure lease exists, 
will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 20 08:48:20.126158 master-0 kubenswrapper[18707]: I0320 08:48:20.126054 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 20 08:48:20.168281 master-0 kubenswrapper[18707]: I0320 08:48:20.168148 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 20 08:48:20.534791 master-0 kubenswrapper[18707]: I0320 08:48:20.534719 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/cluster-policy-controller/2.log" Mar 20 08:48:20.536392 master-0 kubenswrapper[18707]: I0320 08:48:20.536351 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log" Mar 20 08:48:23.094636 master-0 kubenswrapper[18707]: I0320 08:48:23.094573 18707 scope.go:117] "RemoveContainer" containerID="3280012a08ca163e191750f1532fb90a38637168dc635f7e9ff14ee2e4d0ad50" Mar 20 08:48:23.578312 master-0 kubenswrapper[18707]: I0320 08:48:23.578158 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-dv6cd_f53bc282-5937-49ac-ac98-2ee37ccb268d/cluster-baremetal-operator/1.log" Mar 20 08:48:23.578994 master-0 kubenswrapper[18707]: I0320 08:48:23.578912 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" event={"ID":"f53bc282-5937-49ac-ac98-2ee37ccb268d","Type":"ContainerStarted","Data":"5b8f6cb814b9e840ab89858d438999874215c853379b386a3140792c0b17b9bf"} Mar 20 
08:48:25.147443 master-0 kubenswrapper[18707]: I0320 08:48:25.145860 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 20 08:48:25.257710 master-0 kubenswrapper[18707]: I0320 08:48:25.257565 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:48:25.532736 master-0 kubenswrapper[18707]: I0320 08:48:25.532530 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:48:26.009286 master-0 kubenswrapper[18707]: E0320 08:48:26.009140 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:28.533101 master-0 kubenswrapper[18707]: I0320 08:48:28.532981 18707 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:48:28.533101 master-0 kubenswrapper[18707]: I0320 08:48:28.533087 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:29.799477 master-0 kubenswrapper[18707]: E0320 08:48:29.799401 18707 log.go:32] "RunPodSandbox from runtime service 
failed" err=< Mar 20 08:48:29.799477 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(af5af9faf4f8cee86539ef74da22e713d76207507caacf3e4c0d09dc217dfa8f): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"af5af9faf4f8cee86539ef74da22e713d76207507caacf3e4c0d09dc217dfa8f" Netns:"/var/run/netns/a8e99ea1-5371-4811-b7a7-4c4fc5dc79ce" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=af5af9faf4f8cee86539ef74da22e713d76207507caacf3e4c0d09dc217dfa8f;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552" Path:"" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:48:29.799477 master-0 kubenswrapper[18707]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:48:29.799477 master-0 kubenswrapper[18707]: > Mar 20 08:48:29.800083 master-0 kubenswrapper[18707]: E0320 08:48:29.799513 18707 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 08:48:29.800083 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(af5af9faf4f8cee86539ef74da22e713d76207507caacf3e4c0d09dc217dfa8f): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"af5af9faf4f8cee86539ef74da22e713d76207507caacf3e4c0d09dc217dfa8f" Netns:"/var/run/netns/a8e99ea1-5371-4811-b7a7-4c4fc5dc79ce" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=af5af9faf4f8cee86539ef74da22e713d76207507caacf3e4c0d09dc217dfa8f;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552" Path:"" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update 
failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:48:29.800083 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:48:29.800083 master-0 kubenswrapper[18707]: > pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:48:29.800083 master-0 kubenswrapper[18707]: E0320 08:48:29.799540 18707 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 08:48:29.800083 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(af5af9faf4f8cee86539ef74da22e713d76207507caacf3e4c0d09dc217dfa8f): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"af5af9faf4f8cee86539ef74da22e713d76207507caacf3e4c0d09dc217dfa8f" Netns:"/var/run/netns/a8e99ea1-5371-4811-b7a7-4c4fc5dc79ce" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=af5af9faf4f8cee86539ef74da22e713d76207507caacf3e4c0d09dc217dfa8f;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552" Path:"" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: 
Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:48:29.800083 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:48:29.800083 master-0 kubenswrapper[18707]: > pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:48:29.800083 master-0 kubenswrapper[18707]: E0320 08:48:29.799612 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"console-operator-76b6568d85-8b8gv_openshift-console-operator(348f3880-793f-43e4-9de1-8511626d2552)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"console-operator-76b6568d85-8b8gv_openshift-console-operator(348f3880-793f-43e4-9de1-8511626d2552)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(af5af9faf4f8cee86539ef74da22e713d76207507caacf3e4c0d09dc217dfa8f): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network \\\"multus-cni-network\\\": plugin 
type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"af5af9faf4f8cee86539ef74da22e713d76207507caacf3e4c0d09dc217dfa8f\\\" Netns:\\\"/var/run/netns/a8e99ea1-5371-4811-b7a7-4c4fc5dc79ce\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=af5af9faf4f8cee86539ef74da22e713d76207507caacf3e4c0d09dc217dfa8f;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" podUID="348f3880-793f-43e4-9de1-8511626d2552" Mar 20 08:48:30.647171 master-0 kubenswrapper[18707]: I0320 08:48:30.647068 18707 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:48:30.647900 master-0 kubenswrapper[18707]: I0320 08:48:30.647847 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:48:36.010605 master-0 kubenswrapper[18707]: E0320 08:48:36.010499 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:36.993088 master-0 kubenswrapper[18707]: E0320 08:48:36.992969 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 20 08:48:37.947230 master-0 kubenswrapper[18707]: E0320 08:48:37.946979 18707 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{alertmanager-main-0.189e80388b091f13 openshift-monitoring 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-monitoring,Name:alertmanager-main-0,UID:14d29bfa-a0cf-43bd-a3b8-052c1a224fc9,APIVersion:v1,ResourceVersion:13929,FieldPath:spec.containers{alertmanager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0be5d73579621976f063d98db555f3bceee2f5a91b14422481ce30561438712c\" in 3.573s (3.573s including waiting). 
Image size: 467542663 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:44:43.569708819 +0000 UTC m=+228.725889175,LastTimestamp:2026-03-20 08:44:43.569708819 +0000 UTC m=+228.725889175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:48:38.533404 master-0 kubenswrapper[18707]: I0320 08:48:38.533314 18707 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:48:38.533736 master-0 kubenswrapper[18707]: I0320 08:48:38.533408 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:40.000558 master-0 kubenswrapper[18707]: I0320 08:48:40.000435 18707 status_manager.go:851] "Failed to get status for pod" podUID="3eda9567-712b-4541-9344-a333e7734fed" pod="openshift-etcd/installer-2-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-2-master-0)" Mar 20 08:48:44.422436 master-0 kubenswrapper[18707]: E0320 08:48:44.422323 18707 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 20 08:48:44.772094 master-0 kubenswrapper[18707]: I0320 08:48:44.771916 18707 kubelet.go:1909] 
"Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:48:44.772094 master-0 kubenswrapper[18707]: I0320 08:48:44.771967 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:48:46.012605 master-0 kubenswrapper[18707]: E0320 08:48:46.012525 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:46.013854 master-0 kubenswrapper[18707]: E0320 08:48:46.013351 18707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 08:48:47.806311 master-0 kubenswrapper[18707]: I0320 08:48:47.806243 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/3.log" Mar 20 08:48:47.806934 master-0 kubenswrapper[18707]: I0320 08:48:47.806792 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/2.log" Mar 20 08:48:47.806934 master-0 kubenswrapper[18707]: I0320 08:48:47.806856 18707 generic.go:334] "Generic (PLEG): container finished" podID="96de6024-e20f-4b52-9294-b330d65e4153" containerID="dc430f466db39347eddd50d62cf7d30007ddcbada88ac7de4c13d2e1a985e1ed" exitCode=1 Mar 20 08:48:47.806934 master-0 kubenswrapper[18707]: I0320 08:48:47.806894 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" 
event={"ID":"96de6024-e20f-4b52-9294-b330d65e4153","Type":"ContainerDied","Data":"dc430f466db39347eddd50d62cf7d30007ddcbada88ac7de4c13d2e1a985e1ed"} Mar 20 08:48:47.806934 master-0 kubenswrapper[18707]: I0320 08:48:47.806938 18707 scope.go:117] "RemoveContainer" containerID="b44a311658d31e89e841b17c8b5c7ac065996f840c2553c9ed5b0e7c103c6d53" Mar 20 08:48:47.807798 master-0 kubenswrapper[18707]: I0320 08:48:47.807766 18707 scope.go:117] "RemoveContainer" containerID="dc430f466db39347eddd50d62cf7d30007ddcbada88ac7de4c13d2e1a985e1ed" Mar 20 08:48:47.808111 master-0 kubenswrapper[18707]: E0320 08:48:47.808070 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-f44gr_openshift-cluster-storage-operator(96de6024-e20f-4b52-9294-b330d65e4153)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" podUID="96de6024-e20f-4b52-9294-b330d65e4153" Mar 20 08:48:48.533344 master-0 kubenswrapper[18707]: I0320 08:48:48.533256 18707 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:48:48.533687 master-0 kubenswrapper[18707]: I0320 08:48:48.533364 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:48.533687 master-0 
kubenswrapper[18707]: I0320 08:48:48.533458 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:48:48.534633 master-0 kubenswrapper[18707]: I0320 08:48:48.534580 18707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"1a645c4dd6894d87bc5827ce1df71595038461daa2bda7238648a01805dbab0e"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 08:48:48.534791 master-0 kubenswrapper[18707]: I0320 08:48:48.534745 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" containerID="cri-o://1a645c4dd6894d87bc5827ce1df71595038461daa2bda7238648a01805dbab0e" gracePeriod=30 Mar 20 08:48:48.666599 master-0 kubenswrapper[18707]: E0320 08:48:48.666499 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2028761b8522f874dcebf13c4683d033)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" Mar 20 08:48:48.820100 master-0 kubenswrapper[18707]: I0320 08:48:48.819993 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/cluster-policy-controller/3.log" Mar 20 08:48:48.821407 master-0 kubenswrapper[18707]: I0320 08:48:48.820851 18707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/cluster-policy-controller/2.log" Mar 20 08:48:48.822372 master-0 kubenswrapper[18707]: I0320 08:48:48.822314 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log" Mar 20 08:48:48.823442 master-0 kubenswrapper[18707]: I0320 08:48:48.823369 18707 generic.go:334] "Generic (PLEG): container finished" podID="2028761b8522f874dcebf13c4683d033" containerID="1a645c4dd6894d87bc5827ce1df71595038461daa2bda7238648a01805dbab0e" exitCode=255 Mar 20 08:48:48.823571 master-0 kubenswrapper[18707]: I0320 08:48:48.823441 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerDied","Data":"1a645c4dd6894d87bc5827ce1df71595038461daa2bda7238648a01805dbab0e"} Mar 20 08:48:48.823571 master-0 kubenswrapper[18707]: I0320 08:48:48.823532 18707 scope.go:117] "RemoveContainer" containerID="afeb635d749a5fd21753eee3cf6c962d519e76c6be8c83d442ea22a59f925a3a" Mar 20 08:48:48.824633 master-0 kubenswrapper[18707]: I0320 08:48:48.824584 18707 scope.go:117] "RemoveContainer" containerID="1a645c4dd6894d87bc5827ce1df71595038461daa2bda7238648a01805dbab0e" Mar 20 08:48:48.825524 master-0 kubenswrapper[18707]: E0320 08:48:48.825285 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2028761b8522f874dcebf13c4683d033)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" Mar 20 08:48:48.828952 master-0 kubenswrapper[18707]: 
I0320 08:48:48.828881 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/3.log" Mar 20 08:48:49.841112 master-0 kubenswrapper[18707]: I0320 08:48:49.841032 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/cluster-policy-controller/3.log" Mar 20 08:48:49.842673 master-0 kubenswrapper[18707]: I0320 08:48:49.842612 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log" Mar 20 08:48:50.335162 master-0 kubenswrapper[18707]: I0320 08:48:50.335069 18707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:48:50.336560 master-0 kubenswrapper[18707]: I0320 08:48:50.336507 18707 scope.go:117] "RemoveContainer" containerID="1a645c4dd6894d87bc5827ce1df71595038461daa2bda7238648a01805dbab0e" Mar 20 08:48:50.337029 master-0 kubenswrapper[18707]: E0320 08:48:50.336979 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2028761b8522f874dcebf13c4683d033)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" Mar 20 08:48:53.994044 master-0 kubenswrapper[18707]: E0320 08:48:53.993929 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: 
request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 20 08:48:56.078744 master-0 kubenswrapper[18707]: E0320 08:48:56.078647 18707 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 08:48:56.078744 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(c2d60fffe56ed279059ff614c498eff53b4dd1933e7096df70ed8391764bdfa7): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c2d60fffe56ed279059ff614c498eff53b4dd1933e7096df70ed8391764bdfa7" Netns:"/var/run/netns/a037ebf2-fffc-4748-9316-3a236cef8f45" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=c2d60fffe56ed279059ff614c498eff53b4dd1933e7096df70ed8391764bdfa7;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:48:56.078744 master-0 
kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:48:56.078744 master-0 kubenswrapper[18707]: > Mar 20 08:48:56.080156 master-0 kubenswrapper[18707]: E0320 08:48:56.078784 18707 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 08:48:56.080156 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(c2d60fffe56ed279059ff614c498eff53b4dd1933e7096df70ed8391764bdfa7): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c2d60fffe56ed279059ff614c498eff53b4dd1933e7096df70ed8391764bdfa7" Netns:"/var/run/netns/a037ebf2-fffc-4748-9316-3a236cef8f45" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=c2d60fffe56ed279059ff614c498eff53b4dd1933e7096df70ed8391764bdfa7;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to 
update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:48:56.080156 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:48:56.080156 master-0 kubenswrapper[18707]: > pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:48:56.080156 master-0 kubenswrapper[18707]: E0320 08:48:56.078823 18707 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 08:48:56.080156 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(c2d60fffe56ed279059ff614c498eff53b4dd1933e7096df70ed8391764bdfa7): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c2d60fffe56ed279059ff614c498eff53b4dd1933e7096df70ed8391764bdfa7" Netns:"/var/run/netns/a037ebf2-fffc-4748-9316-3a236cef8f45" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=c2d60fffe56ed279059ff614c498eff53b4dd1933e7096df70ed8391764bdfa7;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318" Path:"" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:48:56.080156 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:48:56.080156 master-0 kubenswrapper[18707]: > pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:48:56.080156 master-0 kubenswrapper[18707]: E0320 08:48:56.078940 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console(044870dd-540a-402e-84cb-fa1bf3d6a318)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console(044870dd-540a-402e-84cb-fa1bf3d6a318)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_networking-console-plugin-7c6b76c555-zbpk4_openshift-network-console_044870dd-540a-402e-84cb-fa1bf3d6a318_0(c2d60fffe56ed279059ff614c498eff53b4dd1933e7096df70ed8391764bdfa7): error adding pod openshift-network-console_networking-console-plugin-7c6b76c555-zbpk4 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"c2d60fffe56ed279059ff614c498eff53b4dd1933e7096df70ed8391764bdfa7\\\" Netns:\\\"/var/run/netns/a037ebf2-fffc-4748-9316-3a236cef8f45\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-console;K8S_POD_NAME=networking-console-plugin-7c6b76c555-zbpk4;K8S_POD_INFRA_CONTAINER_ID=c2d60fffe56ed279059ff614c498eff53b4dd1933e7096df70ed8391764bdfa7;K8S_POD_UID=044870dd-540a-402e-84cb-fa1bf3d6a318\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4] networking: Multus: [openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4/044870dd-540a-402e-84cb-fa1bf3d6a318]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: SetNetworkStatus: failed to update the pod networking-console-plugin-7c6b76c555-zbpk4 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-console/pods/networking-console-plugin-7c6b76c555-zbpk4?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" podUID="044870dd-540a-402e-84cb-fa1bf3d6a318" Mar 20 08:48:56.917503 master-0 kubenswrapper[18707]: I0320 08:48:56.917417 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:48:56.918750 master-0 kubenswrapper[18707]: I0320 08:48:56.918707 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" Mar 20 08:49:01.094227 master-0 kubenswrapper[18707]: I0320 08:49:01.094102 18707 scope.go:117] "RemoveContainer" containerID="dc430f466db39347eddd50d62cf7d30007ddcbada88ac7de4c13d2e1a985e1ed" Mar 20 08:49:01.095053 master-0 kubenswrapper[18707]: E0320 08:49:01.094539 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-f44gr_openshift-cluster-storage-operator(96de6024-e20f-4b52-9294-b330d65e4153)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" podUID="96de6024-e20f-4b52-9294-b330d65e4153" Mar 20 08:49:04.094803 master-0 kubenswrapper[18707]: I0320 08:49:04.094732 18707 scope.go:117] "RemoveContainer" containerID="1a645c4dd6894d87bc5827ce1df71595038461daa2bda7238648a01805dbab0e" Mar 20 08:49:04.096351 
master-0 kubenswrapper[18707]: E0320 08:49:04.096306 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2028761b8522f874dcebf13c4683d033)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" Mar 20 08:49:06.191393 master-0 kubenswrapper[18707]: E0320 08:49:06.190981 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:48:56Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:48:56Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:48:56Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:48:56Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:6bec6e4ce9b3ff60658829df2f5980cf947458d49b97476cee1ff01ec638d309\\\",\\\"registry.redhat.io/redhat/certified-operator-ind
ex@sha256:b259d760a14ca994ed34d4cfc901758f180cc8d1de7f3c427baa68030e06b7c7\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252654287},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:542b8eae269892bc6f0f9d5ab808afe26119a72e430870d81f59faf93c3f2f18\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d18953a151bb71fe6585017054b939c4062435242069a2601fe668009e6e5087\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223740630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f2c59d19eb73ad5c0f93b0a63003c1885f5297959c9c45b401d1a74aea6e76\\\"],\\\"sizeBytes\\\":880382887},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de91abd5ad76fb491881a75a0feb4b8ca5600ceb5e15a4b0b687ada01ea0a44c\\\"],\\\"sizeBytes\\\":862205633},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f3038df8df25746bb5095296d4e5740f2356f85c1ed8d43f1b3d281e294826e5\\\"],\\\"sizeBytes\\\":605698193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:112a03f2411f871cdaca5f20daef71024dac710113d5f30897117a5a02f6b
6f5\\\"],\\\"sizeBytes\\\":557428271},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:30a2f97d7785ce8b0ea5115e67c4554b64adefbc7856bcf6f4fe6cc7e938a310\\\"],\\\"sizeBytes\\\":513582374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1973d56a1097a48ea0ebf2c4dbae1ed86fa67bb0116f4962f7720d48aa337d27\\\"],\\\"sizeBytes\\\":504662731},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf72297fee61ec9950f6868881ad3e84be8692ca08f084b3d155d93a766c0823\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\"
:495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:002dfb86e17ad8f5cc232a7d2dce183b23335c8ecb7e7d31dcf3e4446b390777\\\"],\\\"sizeBytes\\\":487159945},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a746a87b784ea1caa278fd0e012554f9df520b6fff665ea0bc4c83f487fed113\\\"],\\\"sizeBytes\\\":484450894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\"],\\\"sizeBytes\\\":484187929}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:49:10.995524 master-0 kubenswrapper[18707]: E0320 08:49:10.995381 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="7s" Mar 20 08:49:14.911149 master-0 kubenswrapper[18707]: I0320 08:49:14.911036 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 20 08:49:14.912849 master-0 kubenswrapper[18707]: E0320 08:49:14.911401 18707 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:49:14.912849 master-0 kubenswrapper[18707]: E0320 08:49:14.911475 18707 projected.go:194] Error preparing data for projected 
volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:49:14.912849 master-0 kubenswrapper[18707]: E0320 08:49:14.911607 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access podName:d245e5b2-a30d-45c8-9b79-6e8096765c14 nodeName:}" failed. No retries permitted until 2026-03-20 08:51:16.911563971 +0000 UTC m=+622.067744367 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access") pod "installer-3-master-0" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 20 08:49:16.096766 master-0 kubenswrapper[18707]: I0320 08:49:16.096627 18707 scope.go:117] "RemoveContainer" containerID="dc430f466db39347eddd50d62cf7d30007ddcbada88ac7de4c13d2e1a985e1ed" Mar 20 08:49:16.097642 master-0 kubenswrapper[18707]: E0320 08:49:16.097321 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-f44gr_openshift-cluster-storage-operator(96de6024-e20f-4b52-9294-b330d65e4153)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" podUID="96de6024-e20f-4b52-9294-b330d65e4153" Mar 20 08:49:16.193233 master-0 kubenswrapper[18707]: E0320 08:49:16.193112 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:49:17.112047 master-0 kubenswrapper[18707]: I0320 08:49:17.111927 18707 
generic.go:334] "Generic (PLEG): container finished" podID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerID="c27171e28b2189b6b8d129f565f467a424f94c7f7677d037a89da262ce118c31" exitCode=0 Mar 20 08:49:17.113839 master-0 kubenswrapper[18707]: I0320 08:49:17.113748 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" event={"ID":"ab175f7e-a5e8-4fda-98c9-6d052a221a83","Type":"ContainerDied","Data":"c27171e28b2189b6b8d129f565f467a424f94c7f7677d037a89da262ce118c31"} Mar 20 08:49:17.113977 master-0 kubenswrapper[18707]: I0320 08:49:17.113948 18707 scope.go:117] "RemoveContainer" containerID="32a6a1f6b15d34dd1d0099bb33c1df2bcbe7798374f91764aaa7bb5b94a96471" Mar 20 08:49:17.114872 master-0 kubenswrapper[18707]: I0320 08:49:17.114808 18707 scope.go:117] "RemoveContainer" containerID="c27171e28b2189b6b8d129f565f467a424f94c7f7677d037a89da262ce118c31" Mar 20 08:49:17.116809 master-0 kubenswrapper[18707]: I0320 08:49:17.116778 18707 generic.go:334] "Generic (PLEG): container finished" podID="5e3b82e6-25e8-49f6-bbe7-1365425c4b7f" containerID="3229c4e0fdc2d8c60bad75326ff1f2872340ec6b674781d5e8f4649fb7a07f12" exitCode=0 Mar 20 08:49:17.116950 master-0 kubenswrapper[18707]: I0320 08:49:17.116888 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" event={"ID":"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f","Type":"ContainerDied","Data":"3229c4e0fdc2d8c60bad75326ff1f2872340ec6b674781d5e8f4649fb7a07f12"} Mar 20 08:49:17.117680 master-0 kubenswrapper[18707]: I0320 08:49:17.117598 18707 scope.go:117] "RemoveContainer" containerID="3229c4e0fdc2d8c60bad75326ff1f2872340ec6b674781d5e8f4649fb7a07f12" Mar 20 08:49:17.121626 master-0 kubenswrapper[18707]: I0320 08:49:17.121569 18707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-operator_network-operator-7bd846bfc4-mt454_ad692349-5089-4afc-85b2-9b6e7997567c/network-operator/0.log" Mar 20 08:49:17.121811 master-0 kubenswrapper[18707]: I0320 08:49:17.121638 18707 generic.go:334] "Generic (PLEG): container finished" podID="ad692349-5089-4afc-85b2-9b6e7997567c" containerID="b10e547edcdc3314e5e478ee6b910f608083e53cf3a4277550ec5bfade59f20f" exitCode=0 Mar 20 08:49:17.121811 master-0 kubenswrapper[18707]: I0320 08:49:17.121678 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" event={"ID":"ad692349-5089-4afc-85b2-9b6e7997567c","Type":"ContainerDied","Data":"b10e547edcdc3314e5e478ee6b910f608083e53cf3a4277550ec5bfade59f20f"} Mar 20 08:49:17.122573 master-0 kubenswrapper[18707]: I0320 08:49:17.122477 18707 scope.go:117] "RemoveContainer" containerID="b10e547edcdc3314e5e478ee6b910f608083e53cf3a4277550ec5bfade59f20f" Mar 20 08:49:17.124707 master-0 kubenswrapper[18707]: I0320 08:49:17.124637 18707 generic.go:334] "Generic (PLEG): container finished" podID="aa16c3bf-2350-46d1-afa0-9477b3ec8877" containerID="222440b4a2f7299de95ce041a034d3160fcac83fac650064e342b5c86cfa35c1" exitCode=0 Mar 20 08:49:17.124951 master-0 kubenswrapper[18707]: I0320 08:49:17.124747 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" event={"ID":"aa16c3bf-2350-46d1-afa0-9477b3ec8877","Type":"ContainerDied","Data":"222440b4a2f7299de95ce041a034d3160fcac83fac650064e342b5c86cfa35c1"} Mar 20 08:49:17.125929 master-0 kubenswrapper[18707]: I0320 08:49:17.125859 18707 scope.go:117] "RemoveContainer" containerID="222440b4a2f7299de95ce041a034d3160fcac83fac650064e342b5c86cfa35c1" Mar 20 08:49:17.128251 master-0 kubenswrapper[18707]: I0320 08:49:17.128163 18707 generic.go:334] "Generic (PLEG): container finished" podID="b4291bfd-53d9-4c78-b7cb-d7eb46560528" 
containerID="7d9ef09c05c17f91e19a7e2b31b502d477af56141dfbd1c2fd48a2cadd1f3194" exitCode=0 Mar 20 08:49:17.128399 master-0 kubenswrapper[18707]: I0320 08:49:17.128250 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" event={"ID":"b4291bfd-53d9-4c78-b7cb-d7eb46560528","Type":"ContainerDied","Data":"7d9ef09c05c17f91e19a7e2b31b502d477af56141dfbd1c2fd48a2cadd1f3194"} Mar 20 08:49:17.129306 master-0 kubenswrapper[18707]: I0320 08:49:17.129257 18707 scope.go:117] "RemoveContainer" containerID="7d9ef09c05c17f91e19a7e2b31b502d477af56141dfbd1c2fd48a2cadd1f3194" Mar 20 08:49:17.134215 master-0 kubenswrapper[18707]: I0320 08:49:17.134126 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-598fbc5f8f-vxzvg_ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/cluster-node-tuning-operator/0.log" Mar 20 08:49:17.134648 master-0 kubenswrapper[18707]: I0320 08:49:17.134227 18707 generic.go:334] "Generic (PLEG): container finished" podID="ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7" containerID="68df10b3a72fc3b0c353b5fc70a166a2be68d78636e2ecc68d4b89aecbe60781" exitCode=1 Mar 20 08:49:17.134648 master-0 kubenswrapper[18707]: I0320 08:49:17.134424 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" event={"ID":"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7","Type":"ContainerDied","Data":"68df10b3a72fc3b0c353b5fc70a166a2be68d78636e2ecc68d4b89aecbe60781"} Mar 20 08:49:17.135295 master-0 kubenswrapper[18707]: I0320 08:49:17.135233 18707 scope.go:117] "RemoveContainer" containerID="68df10b3a72fc3b0c353b5fc70a166a2be68d78636e2ecc68d4b89aecbe60781" Mar 20 08:49:17.137875 master-0 kubenswrapper[18707]: I0320 08:49:17.137822 18707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-p7pt6_68252533-bd64-4fc5-838a-cc350cbe77f0/openshift-controller-manager-operator/0.log" Mar 20 08:49:17.138016 master-0 kubenswrapper[18707]: I0320 08:49:17.137888 18707 generic.go:334] "Generic (PLEG): container finished" podID="68252533-bd64-4fc5-838a-cc350cbe77f0" containerID="e90b46b2a24eed1acbde07d446b8c7de8acf8cbdfe00eeb63977c91e3cae9f34" exitCode=0 Mar 20 08:49:17.138016 master-0 kubenswrapper[18707]: I0320 08:49:17.137985 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" event={"ID":"68252533-bd64-4fc5-838a-cc350cbe77f0","Type":"ContainerDied","Data":"e90b46b2a24eed1acbde07d446b8c7de8acf8cbdfe00eeb63977c91e3cae9f34"} Mar 20 08:49:17.138784 master-0 kubenswrapper[18707]: I0320 08:49:17.138738 18707 scope.go:117] "RemoveContainer" containerID="e90b46b2a24eed1acbde07d446b8c7de8acf8cbdfe00eeb63977c91e3cae9f34" Mar 20 08:49:17.140668 master-0 kubenswrapper[18707]: I0320 08:49:17.140619 18707 generic.go:334] "Generic (PLEG): container finished" podID="c1854ea4-c8e2-4289-84b6-1f18b2ac684f" containerID="eedbb1dfd13f24b92d1505673b2418928be1e1bfdd5eb59005a694a899688fee" exitCode=0 Mar 20 08:49:17.140827 master-0 kubenswrapper[18707]: I0320 08:49:17.140654 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" event={"ID":"c1854ea4-c8e2-4289-84b6-1f18b2ac684f","Type":"ContainerDied","Data":"eedbb1dfd13f24b92d1505673b2418928be1e1bfdd5eb59005a694a899688fee"} Mar 20 08:49:17.141309 master-0 kubenswrapper[18707]: I0320 08:49:17.141250 18707 scope.go:117] "RemoveContainer" containerID="eedbb1dfd13f24b92d1505673b2418928be1e1bfdd5eb59005a694a899688fee" Mar 20 08:49:17.143580 master-0 kubenswrapper[18707]: I0320 08:49:17.143506 18707 generic.go:334] "Generic (PLEG): container finished" 
podID="75e3e2cc-aa56-41f3-8859-1c086f419d05" containerID="9526eea2cea58cb9e28474105457b96211d2f64f5d2c17947ddff373db76ab0b" exitCode=0 Mar 20 08:49:17.143580 master-0 kubenswrapper[18707]: I0320 08:49:17.143553 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" event={"ID":"75e3e2cc-aa56-41f3-8859-1c086f419d05","Type":"ContainerDied","Data":"9526eea2cea58cb9e28474105457b96211d2f64f5d2c17947ddff373db76ab0b"} Mar 20 08:49:17.144916 master-0 kubenswrapper[18707]: I0320 08:49:17.144501 18707 scope.go:117] "RemoveContainer" containerID="9526eea2cea58cb9e28474105457b96211d2f64f5d2c17947ddff373db76ab0b" Mar 20 08:49:17.147266 master-0 kubenswrapper[18707]: I0320 08:49:17.146715 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-2pg77_bbc0b783-28d5-4554-b49d-c66082546f44/package-server-manager/0.log" Mar 20 08:49:17.148086 master-0 kubenswrapper[18707]: I0320 08:49:17.147585 18707 generic.go:334] "Generic (PLEG): container finished" podID="bbc0b783-28d5-4554-b49d-c66082546f44" containerID="7ee99faecdaa8ce9ade5aaa3b49dd8416a312e96db798b1de9fced997f6fd077" exitCode=1 Mar 20 08:49:17.148086 master-0 kubenswrapper[18707]: I0320 08:49:17.147657 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" event={"ID":"bbc0b783-28d5-4554-b49d-c66082546f44","Type":"ContainerDied","Data":"7ee99faecdaa8ce9ade5aaa3b49dd8416a312e96db798b1de9fced997f6fd077"} Mar 20 08:49:17.148297 master-0 kubenswrapper[18707]: I0320 08:49:17.148103 18707 scope.go:117] "RemoveContainer" containerID="7ee99faecdaa8ce9ade5aaa3b49dd8416a312e96db798b1de9fced997f6fd077" Mar 20 08:49:17.150363 master-0 kubenswrapper[18707]: I0320 08:49:17.150269 18707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-62zrx_29b5b089-fb1d-46a1-bd67-2e0ba03c76a6/authentication-operator/1.log" Mar 20 08:49:17.150363 master-0 kubenswrapper[18707]: I0320 08:49:17.150353 18707 generic.go:334] "Generic (PLEG): container finished" podID="29b5b089-fb1d-46a1-bd67-2e0ba03c76a6" containerID="df7ec56bc0dc6a5103a746a24bbb9fc1482c902df08dcd67e4b6e70f5d055d5f" exitCode=0 Mar 20 08:49:17.150596 master-0 kubenswrapper[18707]: I0320 08:49:17.150398 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" event={"ID":"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6","Type":"ContainerDied","Data":"df7ec56bc0dc6a5103a746a24bbb9fc1482c902df08dcd67e4b6e70f5d055d5f"} Mar 20 08:49:17.151113 master-0 kubenswrapper[18707]: I0320 08:49:17.151066 18707 scope.go:117] "RemoveContainer" containerID="df7ec56bc0dc6a5103a746a24bbb9fc1482c902df08dcd67e4b6e70f5d055d5f" Mar 20 08:49:17.152866 master-0 kubenswrapper[18707]: I0320 08:49:17.152819 18707 generic.go:334] "Generic (PLEG): container finished" podID="42df77ec-94aa-48ba-bb35-7b1f1e8b8e97" containerID="becf6a9468ee5d2197c4916442372c9501293c27732b04e68b431411779a05c6" exitCode=0 Mar 20 08:49:17.152989 master-0 kubenswrapper[18707]: I0320 08:49:17.152883 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" event={"ID":"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97","Type":"ContainerDied","Data":"becf6a9468ee5d2197c4916442372c9501293c27732b04e68b431411779a05c6"} Mar 20 08:49:17.153647 master-0 kubenswrapper[18707]: I0320 08:49:17.153574 18707 scope.go:117] "RemoveContainer" containerID="becf6a9468ee5d2197c4916442372c9501293c27732b04e68b431411779a05c6" Mar 20 08:49:17.155600 master-0 kubenswrapper[18707]: I0320 08:49:17.154472 18707 generic.go:334] "Generic (PLEG): container finished" podID="2f844652-225b-4713-a9ad-cf9bcc348f47" 
containerID="cdc09fd6c3bb18aaf3523f814928e0e85e0c65581ea0a2f8e18d09f87a8cff20" exitCode=0 Mar 20 08:49:17.155600 master-0 kubenswrapper[18707]: I0320 08:49:17.154642 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" event={"ID":"2f844652-225b-4713-a9ad-cf9bcc348f47","Type":"ContainerDied","Data":"cdc09fd6c3bb18aaf3523f814928e0e85e0c65581ea0a2f8e18d09f87a8cff20"} Mar 20 08:49:17.155600 master-0 kubenswrapper[18707]: I0320 08:49:17.155348 18707 scope.go:117] "RemoveContainer" containerID="cdc09fd6c3bb18aaf3523f814928e0e85e0c65581ea0a2f8e18d09f87a8cff20" Mar 20 08:49:17.162523 master-0 kubenswrapper[18707]: I0320 08:49:17.162389 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-67dcd4998-c5742_86cb5d23-df7f-4f67-8086-1789d8e68544/cluster-olm-operator/0.log" Mar 20 08:49:17.163595 master-0 kubenswrapper[18707]: I0320 08:49:17.163470 18707 generic.go:334] "Generic (PLEG): container finished" podID="86cb5d23-df7f-4f67-8086-1789d8e68544" containerID="517434b092860d80f200ad453a8ab960ca389e8d7a3ffc04820cc51b48ee30fe" exitCode=0 Mar 20 08:49:17.163741 master-0 kubenswrapper[18707]: I0320 08:49:17.163580 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" event={"ID":"86cb5d23-df7f-4f67-8086-1789d8e68544","Type":"ContainerDied","Data":"517434b092860d80f200ad453a8ab960ca389e8d7a3ffc04820cc51b48ee30fe"} Mar 20 08:49:17.164916 master-0 kubenswrapper[18707]: I0320 08:49:17.164852 18707 scope.go:117] "RemoveContainer" containerID="517434b092860d80f200ad453a8ab960ca389e8d7a3ffc04820cc51b48ee30fe" Mar 20 08:49:17.169036 master-0 kubenswrapper[18707]: I0320 08:49:17.168975 18707 generic.go:334] "Generic (PLEG): container finished" podID="f046860d-2d54-4746-8ba2-f8e90fa55e38" containerID="c3742feb1f4aa394282e45f9e7e1ad5a78209b23e0c120a4f3b31f9fa95097bc" exitCode=0 Mar 20 
08:49:17.169128 master-0 kubenswrapper[18707]: I0320 08:49:17.169088 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" event={"ID":"f046860d-2d54-4746-8ba2-f8e90fa55e38","Type":"ContainerDied","Data":"c3742feb1f4aa394282e45f9e7e1ad5a78209b23e0c120a4f3b31f9fa95097bc"} Mar 20 08:49:17.169640 master-0 kubenswrapper[18707]: I0320 08:49:17.169589 18707 scope.go:117] "RemoveContainer" containerID="c3742feb1f4aa394282e45f9e7e1ad5a78209b23e0c120a4f3b31f9fa95097bc" Mar 20 08:49:17.173299 master-0 kubenswrapper[18707]: I0320 08:49:17.173096 18707 generic.go:334] "Generic (PLEG): container finished" podID="7c4e7e57-43be-4d31-b523-f7e4d316dce3" containerID="68bec1ef3f4454b1453d2de2db069e48c08d8a5c1a267f409f8da798126b9d46" exitCode=0 Mar 20 08:49:17.173760 master-0 kubenswrapper[18707]: I0320 08:49:17.173436 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh" event={"ID":"7c4e7e57-43be-4d31-b523-f7e4d316dce3","Type":"ContainerDied","Data":"68bec1ef3f4454b1453d2de2db069e48c08d8a5c1a267f409f8da798126b9d46"} Mar 20 08:49:17.175499 master-0 kubenswrapper[18707]: I0320 08:49:17.174776 18707 scope.go:117] "RemoveContainer" containerID="68bec1ef3f4454b1453d2de2db069e48c08d8a5c1a267f409f8da798126b9d46" Mar 20 08:49:17.178721 master-0 kubenswrapper[18707]: I0320 08:49:17.176474 18707 generic.go:334] "Generic (PLEG): container finished" podID="1375da42-ecaf-4d86-b554-25fd1c3d00bd" containerID="ef48bc7a298f21dc7e1c4f0e8ec7b05b2de65f0d7e2d6a14897ed741dcf440bd" exitCode=0 Mar 20 08:49:17.178721 master-0 kubenswrapper[18707]: I0320 08:49:17.176598 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" event={"ID":"1375da42-ecaf-4d86-b554-25fd1c3d00bd","Type":"ContainerDied","Data":"ef48bc7a298f21dc7e1c4f0e8ec7b05b2de65f0d7e2d6a14897ed741dcf440bd"} Mar 20 
08:49:17.178721 master-0 kubenswrapper[18707]: I0320 08:49:17.177609 18707 scope.go:117] "RemoveContainer" containerID="ef48bc7a298f21dc7e1c4f0e8ec7b05b2de65f0d7e2d6a14897ed741dcf440bd" Mar 20 08:49:17.183090 master-0 kubenswrapper[18707]: I0320 08:49:17.182751 18707 generic.go:334] "Generic (PLEG): container finished" podID="de6078d7-2aad-46fe-b17a-b6b38e4eaa41" containerID="d47bba92b6fb8946edb6fa2f6a021436ea604b27f3b2a8581b9108a215eab3e8" exitCode=0 Mar 20 08:49:17.183090 master-0 kubenswrapper[18707]: I0320 08:49:17.182897 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" event={"ID":"de6078d7-2aad-46fe-b17a-b6b38e4eaa41","Type":"ContainerDied","Data":"d47bba92b6fb8946edb6fa2f6a021436ea604b27f3b2a8581b9108a215eab3e8"} Mar 20 08:49:17.185929 master-0 kubenswrapper[18707]: I0320 08:49:17.185866 18707 scope.go:117] "RemoveContainer" containerID="d47bba92b6fb8946edb6fa2f6a021436ea604b27f3b2a8581b9108a215eab3e8" Mar 20 08:49:17.191325 master-0 kubenswrapper[18707]: I0320 08:49:17.191255 18707 generic.go:334] "Generic (PLEG): container finished" podID="c2a23d24-9e09-431e-8c3b-8456ff51a8d0" containerID="bf7423bac144bcaaf3719ed8e76389e5f2ec9717aa4868ad4761ed7cc6782d76" exitCode=0 Mar 20 08:49:17.192954 master-0 kubenswrapper[18707]: I0320 08:49:17.191583 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" event={"ID":"c2a23d24-9e09-431e-8c3b-8456ff51a8d0","Type":"ContainerDied","Data":"bf7423bac144bcaaf3719ed8e76389e5f2ec9717aa4868ad4761ed7cc6782d76"} Mar 20 08:49:17.195998 master-0 kubenswrapper[18707]: I0320 08:49:17.194139 18707 scope.go:117] "RemoveContainer" containerID="bf7423bac144bcaaf3719ed8e76389e5f2ec9717aa4868ad4761ed7cc6782d76" Mar 20 08:49:17.197031 master-0 kubenswrapper[18707]: I0320 08:49:17.196984 18707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-n8tnn_c7f5e6cd-e093-409a-8758-d3db7a7eb32c/machine-api-operator/0.log" Mar 20 08:49:17.199576 master-0 kubenswrapper[18707]: I0320 08:49:17.198072 18707 generic.go:334] "Generic (PLEG): container finished" podID="c7f5e6cd-e093-409a-8758-d3db7a7eb32c" containerID="43664e36cb7b60519ab710dfbfcb9bd2c63951d962e394659ce8bb21e98ebbb9" exitCode=255 Mar 20 08:49:17.199576 master-0 kubenswrapper[18707]: I0320 08:49:17.198153 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" event={"ID":"c7f5e6cd-e093-409a-8758-d3db7a7eb32c","Type":"ContainerDied","Data":"43664e36cb7b60519ab710dfbfcb9bd2c63951d962e394659ce8bb21e98ebbb9"} Mar 20 08:49:17.199576 master-0 kubenswrapper[18707]: I0320 08:49:17.199108 18707 scope.go:117] "RemoveContainer" containerID="43664e36cb7b60519ab710dfbfcb9bd2c63951d962e394659ce8bb21e98ebbb9" Mar 20 08:49:17.201618 master-0 kubenswrapper[18707]: I0320 08:49:17.201005 18707 generic.go:334] "Generic (PLEG): container finished" podID="a57854ac-809a-4745-aaa1-774f0a08a560" containerID="4d0d97d44af51af5156c718231836b8527e98e8ee5a7d3079503faf5682e5428" exitCode=0 Mar 20 08:49:17.201618 master-0 kubenswrapper[18707]: I0320 08:49:17.201082 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" event={"ID":"a57854ac-809a-4745-aaa1-774f0a08a560","Type":"ContainerDied","Data":"4d0d97d44af51af5156c718231836b8527e98e8ee5a7d3079503faf5682e5428"} Mar 20 08:49:17.201618 master-0 kubenswrapper[18707]: I0320 08:49:17.201374 18707 scope.go:117] "RemoveContainer" containerID="4d0d97d44af51af5156c718231836b8527e98e8ee5a7d3079503faf5682e5428" Mar 20 08:49:17.205798 master-0 kubenswrapper[18707]: I0320 08:49:17.205592 18707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-xwxg7_21bebade-17fa-444e-92a9-eea53d6cd673/cluster-autoscaler-operator/0.log" Mar 20 08:49:17.206936 master-0 kubenswrapper[18707]: I0320 08:49:17.206180 18707 generic.go:334] "Generic (PLEG): container finished" podID="21bebade-17fa-444e-92a9-eea53d6cd673" containerID="8b198f10122a271a46d1da5f2f799d55468d2123b4b2ad74d6f0cb05641e6136" exitCode=255 Mar 20 08:49:17.206936 master-0 kubenswrapper[18707]: I0320 08:49:17.206336 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" event={"ID":"21bebade-17fa-444e-92a9-eea53d6cd673","Type":"ContainerDied","Data":"8b198f10122a271a46d1da5f2f799d55468d2123b4b2ad74d6f0cb05641e6136"} Mar 20 08:49:17.207542 master-0 kubenswrapper[18707]: I0320 08:49:17.207212 18707 scope.go:117] "RemoveContainer" containerID="8b198f10122a271a46d1da5f2f799d55468d2123b4b2ad74d6f0cb05641e6136" Mar 20 08:49:17.218813 master-0 kubenswrapper[18707]: I0320 08:49:17.217469 18707 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="980a74af7dbc007260e7377ea4cb1edcafe4c9568ad57a168d88500b7bd91f2e" exitCode=0 Mar 20 08:49:17.218813 master-0 kubenswrapper[18707]: I0320 08:49:17.217538 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerDied","Data":"980a74af7dbc007260e7377ea4cb1edcafe4c9568ad57a168d88500b7bd91f2e"} Mar 20 08:49:17.218813 master-0 kubenswrapper[18707]: I0320 08:49:17.218383 18707 scope.go:117] "RemoveContainer" containerID="980a74af7dbc007260e7377ea4cb1edcafe4c9568ad57a168d88500b7bd91f2e" Mar 20 08:49:17.534477 master-0 kubenswrapper[18707]: I0320 08:49:17.534426 18707 scope.go:117] "RemoveContainer" containerID="871b2216305e9883e9345351545dfa140768d0b1a9975612361ac0d841fda34c" Mar 20 08:49:17.586335 master-0 
kubenswrapper[18707]: I0320 08:49:17.586282 18707 scope.go:117] "RemoveContainer" containerID="e3f8332f8ebf3d1a4f060a0c3657a0fbf7dd52dab57065dbe98fe23495c66678" Mar 20 08:49:17.640919 master-0 kubenswrapper[18707]: I0320 08:49:17.640884 18707 scope.go:117] "RemoveContainer" containerID="f44e15f6f5dbc67eb3bd400bcba69ac1adc2aa6d2546e0cbc2778889121d3bcf" Mar 20 08:49:17.681037 master-0 kubenswrapper[18707]: I0320 08:49:17.680995 18707 scope.go:117] "RemoveContainer" containerID="adad78c0e9b98fa52036392a08a46d2f94455f9ec32c4bacb460a8042eb8ee5e" Mar 20 08:49:17.704776 master-0 kubenswrapper[18707]: I0320 08:49:17.704731 18707 scope.go:117] "RemoveContainer" containerID="8009103455f6e5b4ff5637d33a8290182816ff88ad007a0262e9e5800739bab4" Mar 20 08:49:17.731770 master-0 kubenswrapper[18707]: I0320 08:49:17.731646 18707 scope.go:117] "RemoveContainer" containerID="aba047bf62fdafe49170c75e4151358bb7421fff8f12793b7c505a061479cedc" Mar 20 08:49:17.776498 master-0 kubenswrapper[18707]: I0320 08:49:17.776438 18707 scope.go:117] "RemoveContainer" containerID="0bd5b879aadefbf9a44c6c6dcf866736c73041c10aa61588b2570b45b21eb4d2" Mar 20 08:49:17.900205 master-0 kubenswrapper[18707]: I0320 08:49:17.900132 18707 scope.go:117] "RemoveContainer" containerID="4f3597589c3cef89a127e7d3fec06145ebc7ceb92ec5a686f505064f491c35a3" Mar 20 08:49:17.929487 master-0 kubenswrapper[18707]: I0320 08:49:17.929449 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:49:17.929649 master-0 kubenswrapper[18707]: I0320 08:49:17.929637 18707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:49:18.244824 master-0 kubenswrapper[18707]: I0320 08:49:18.243835 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-pj7rj" 
event={"ID":"de6078d7-2aad-46fe-b17a-b6b38e4eaa41","Type":"ContainerStarted","Data":"614362fcad46e54a73760e2c9d9c7c6b75a5cdaf32206bdfbf5028b86b6b4075"} Mar 20 08:49:18.247857 master-0 kubenswrapper[18707]: I0320 08:49:18.247738 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-4dqrh" event={"ID":"7c4e7e57-43be-4d31-b523-f7e4d316dce3","Type":"ContainerStarted","Data":"83050ce47a44eea923b92781c53b7f56bbb4f832577c9b239f07ba4414845b6d"} Mar 20 08:49:18.258892 master-0 kubenswrapper[18707]: I0320 08:49:18.258381 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-c5742" event={"ID":"86cb5d23-df7f-4f67-8086-1789d8e68544","Type":"ContainerStarted","Data":"941a8467afc11f78750b0bfd260f7aa7a7abc503368c9d7be20d66485787bb33"} Mar 20 08:49:18.264128 master-0 kubenswrapper[18707]: I0320 08:49:18.263285 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" event={"ID":"ab175f7e-a5e8-4fda-98c9-6d052a221a83","Type":"ContainerStarted","Data":"bef91bdc9369cda692f66d07d6ee3239d77a72194de50fd579f49f34001b0e18"} Mar 20 08:49:18.264128 master-0 kubenswrapper[18707]: I0320 08:49:18.264091 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:49:18.311930 master-0 kubenswrapper[18707]: I0320 08:49:18.310497 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" event={"ID":"5e3b82e6-25e8-49f6-bbe7-1365425c4b7f","Type":"ContainerStarted","Data":"b7d21fcea14026598a88b7c826f65165ae7eaf3f392cc748cd2a67db87e89ac4"} Mar 20 08:49:18.311930 master-0 kubenswrapper[18707]: I0320 08:49:18.311419 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:49:18.337760 master-0 kubenswrapper[18707]: I0320 08:49:18.335074 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-2pg77_bbc0b783-28d5-4554-b49d-c66082546f44/package-server-manager/0.log" Mar 20 08:49:18.337760 master-0 kubenswrapper[18707]: I0320 08:49:18.336640 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" event={"ID":"bbc0b783-28d5-4554-b49d-c66082546f44","Type":"ContainerStarted","Data":"9a9ec037628cdd5fc550a16efef6ff6ef9b51a194d4ab20d80900ba89261c906"} Mar 20 08:49:18.337760 master-0 kubenswrapper[18707]: I0320 08:49:18.337549 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:49:18.342174 master-0 kubenswrapper[18707]: I0320 08:49:18.341688 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-mt454" event={"ID":"ad692349-5089-4afc-85b2-9b6e7997567c","Type":"ContainerStarted","Data":"e2f2601b12dd958365584e6e09e6c3a994f5aa0e0dfd674b73bd5e8c5a39096f"} Mar 20 08:49:18.350699 master-0 kubenswrapper[18707]: I0320 08:49:18.350661 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-qmm8h" event={"ID":"1375da42-ecaf-4d86-b554-25fd1c3d00bd","Type":"ContainerStarted","Data":"c68cb8abfe2eb59c8f65df8bb49cdb1bbb9bbb7b2c0a9ec1c9ca1a988186a2b1"} Mar 20 08:49:18.355831 master-0 kubenswrapper[18707]: I0320 08:49:18.355794 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-p7pt6" 
event={"ID":"68252533-bd64-4fc5-838a-cc350cbe77f0","Type":"ContainerStarted","Data":"3bbda6124412f654935acdeb4c83cda86b4eb87afc528d280f735f87a062437d"} Mar 20 08:49:18.362925 master-0 kubenswrapper[18707]: I0320 08:49:18.362874 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-c6vkz" event={"ID":"c1854ea4-c8e2-4289-84b6-1f18b2ac684f","Type":"ContainerStarted","Data":"0bd3b3cf249889ae648c582b4f9481fb96bbb3530d412e3d9f9211c3d6901fa6"} Mar 20 08:49:18.382123 master-0 kubenswrapper[18707]: I0320 08:49:18.377489 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-62zrx" event={"ID":"29b5b089-fb1d-46a1-bd67-2e0ba03c76a6","Type":"ContainerStarted","Data":"bc6d33c63a7d9090f916a1743c53afe806d832eb4e040a3432346aaa839c14d7"} Mar 20 08:49:18.382123 master-0 kubenswrapper[18707]: I0320 08:49:18.379652 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-gm4qr" event={"ID":"42df77ec-94aa-48ba-bb35-7b1f1e8b8e97","Type":"ContainerStarted","Data":"9b4285fd558dddc512238a4d805f8613a9846804d57f3e9de8504197dbfc187c"} Mar 20 08:49:18.387444 master-0 kubenswrapper[18707]: I0320 08:49:18.387298 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-vlq7h" event={"ID":"aa16c3bf-2350-46d1-afa0-9477b3ec8877","Type":"ContainerStarted","Data":"2174526f3816005c5f8bae05eca906a13b1c25818c337ccd99f63cf9c45757ba"} Mar 20 08:49:18.699086 master-0 kubenswrapper[18707]: I0320 08:49:18.699004 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-ffdd4b479-rhmfx" Mar 20 08:49:18.777582 master-0 kubenswrapper[18707]: E0320 08:49:18.777509 18707 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: 
request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 20 08:49:19.096240 master-0 kubenswrapper[18707]: I0320 08:49:19.095389 18707 scope.go:117] "RemoveContainer" containerID="1a645c4dd6894d87bc5827ce1df71595038461daa2bda7238648a01805dbab0e" Mar 20 08:49:19.096240 master-0 kubenswrapper[18707]: E0320 08:49:19.096052 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2028761b8522f874dcebf13c4683d033)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" Mar 20 08:49:19.403640 master-0 kubenswrapper[18707]: I0320 08:49:19.403417 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-qzb2h" event={"ID":"75e3e2cc-aa56-41f3-8859-1c086f419d05","Type":"ContainerStarted","Data":"a6a8301b6af2735caa4195a8111a74f47653b5daf79c158fae1cd3437323e231"} Mar 20 08:49:19.408299 master-0 kubenswrapper[18707]: I0320 08:49:19.408249 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-6fbb6cf6f9-n8tnn_c7f5e6cd-e093-409a-8758-d3db7a7eb32c/machine-api-operator/0.log" Mar 20 08:49:19.408992 master-0 kubenswrapper[18707]: I0320 08:49:19.408902 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-n8tnn" event={"ID":"c7f5e6cd-e093-409a-8758-d3db7a7eb32c","Type":"ContainerStarted","Data":"d7939ef7c290e843cafc75df83a35de9f660e51bdc0cf643e425304013e11a00"} Mar 20 08:49:19.413640 master-0 kubenswrapper[18707]: I0320 08:49:19.413573 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-vx5d7" event={"ID":"a57854ac-809a-4745-aaa1-774f0a08a560","Type":"ContainerStarted","Data":"f491518b31fab913f5f8a91bbdce401a81639ad5e931c73a3f6782ef923680f3"} Mar 20 08:49:19.416892 master-0 kubenswrapper[18707]: I0320 08:49:19.416836 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" event={"ID":"f046860d-2d54-4746-8ba2-f8e90fa55e38","Type":"ContainerStarted","Data":"393d2845d5fba776a9ebaed5feb8bc3aa361388a76b3e735432c39981bb930da"} Mar 20 08:49:19.420380 master-0 kubenswrapper[18707]: I0320 08:49:19.420328 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-fmhbq" event={"ID":"b4291bfd-53d9-4c78-b7cb-d7eb46560528","Type":"ContainerStarted","Data":"dfd107e83ec6e3ec0c99e64cfc3b97e45377830c10a1d6dc5827804afbe7ae27"} Mar 20 08:49:19.424639 master-0 kubenswrapper[18707]: I0320 08:49:19.424603 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-866dc4744-xwxg7_21bebade-17fa-444e-92a9-eea53d6cd673/cluster-autoscaler-operator/0.log" Mar 20 08:49:19.425604 master-0 kubenswrapper[18707]: I0320 08:49:19.425533 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-xwxg7" event={"ID":"21bebade-17fa-444e-92a9-eea53d6cd673","Type":"ContainerStarted","Data":"bfed4f8a80c1542278e0af8b603bd065ec2e7f0456df8fbb20a61b42ed29fe9d"} Mar 20 08:49:19.429171 master-0 kubenswrapper[18707]: I0320 08:49:19.429103 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-64dg5" event={"ID":"c2a23d24-9e09-431e-8c3b-8456ff51a8d0","Type":"ContainerStarted","Data":"07f2f273d6267481ef73506ded32028aaad78787e1c1c9ed53a95527528b0632"} Mar 20 
08:49:19.432727 master-0 kubenswrapper[18707]: I0320 08:49:19.432658 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-598fbc5f8f-vxzvg_ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7/cluster-node-tuning-operator/0.log" Mar 20 08:49:19.432916 master-0 kubenswrapper[18707]: I0320 08:49:19.432788 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vxzvg" event={"ID":"ebab0b0b-6cc3-490a-944b-0f8b4e2d5ae7","Type":"ContainerStarted","Data":"33df84746cc643b0ef612dc7007f03a3598f4b907becd41ed8ea8591ae0cd0e8"} Mar 20 08:49:19.435995 master-0 kubenswrapper[18707]: I0320 08:49:19.435912 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-72j8t" event={"ID":"2f844652-225b-4713-a9ad-cf9bcc348f47","Type":"ContainerStarted","Data":"96df2299e0f1ef49f01ec299445dab4667f8fb6fc16c52af9ae14e26a3a598b1"} Mar 20 08:49:19.441966 master-0 kubenswrapper[18707]: I0320 08:49:19.441879 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"963bd0525df3985d6f43b1fe1bd02659a99df0c473266a6b7445de535552a1f3"} Mar 20 08:49:19.775344 master-0 kubenswrapper[18707]: I0320 08:49:19.775113 18707 patch_prober.go:28] interesting pod/etcd-operator-8544cbcf9c-brhw4 container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.28:8443/healthz\": dial tcp 10.128.0.28:8443: connect: connection refused" start-of-body= Mar 20 08:49:19.775344 master-0 kubenswrapper[18707]: I0320 08:49:19.775216 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-brhw4" podUID="f046860d-2d54-4746-8ba2-f8e90fa55e38" containerName="etcd-operator" probeResult="failure" 
output="Get \"https://10.128.0.28:8443/healthz\": dial tcp 10.128.0.28:8443: connect: connection refused" Mar 20 08:49:22.475962 master-0 kubenswrapper[18707]: I0320 08:49:22.475779 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/cluster-policy-controller/3.log" Mar 20 08:49:22.479881 master-0 kubenswrapper[18707]: I0320 08:49:22.479771 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log" Mar 20 08:49:22.481427 master-0 kubenswrapper[18707]: I0320 08:49:22.481350 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager-cert-syncer/0.log" Mar 20 08:49:22.481427 master-0 kubenswrapper[18707]: I0320 08:49:22.481424 18707 generic.go:334] "Generic (PLEG): container finished" podID="2028761b8522f874dcebf13c4683d033" containerID="cfd1a328fad8a1fbf228c2c6250b1d586490a1a703c23535a3eb7dcf1bbe5867" exitCode=1 Mar 20 08:49:22.481766 master-0 kubenswrapper[18707]: I0320 08:49:22.481475 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerDied","Data":"cfd1a328fad8a1fbf228c2c6250b1d586490a1a703c23535a3eb7dcf1bbe5867"} Mar 20 08:49:22.482389 master-0 kubenswrapper[18707]: I0320 08:49:22.482334 18707 scope.go:117] "RemoveContainer" containerID="1a645c4dd6894d87bc5827ce1df71595038461daa2bda7238648a01805dbab0e" Mar 20 08:49:22.482389 master-0 kubenswrapper[18707]: I0320 08:49:22.482365 18707 scope.go:117] "RemoveContainer" containerID="cfd1a328fad8a1fbf228c2c6250b1d586490a1a703c23535a3eb7dcf1bbe5867" Mar 20 08:49:22.780549 master-0 kubenswrapper[18707]: E0320 
08:49:22.780469 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2028761b8522f874dcebf13c4683d033)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" Mar 20 08:49:23.493284 master-0 kubenswrapper[18707]: I0320 08:49:23.493152 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/cluster-policy-controller/3.log" Mar 20 08:49:23.494991 master-0 kubenswrapper[18707]: I0320 08:49:23.494952 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log" Mar 20 08:49:23.496956 master-0 kubenswrapper[18707]: I0320 08:49:23.496892 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager-cert-syncer/0.log" Mar 20 08:49:23.497787 master-0 kubenswrapper[18707]: I0320 08:49:23.497095 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"61bf9561fb6a2bfd80add2b9b814a82fa5954086996b7e4feb2d7aa26a526193"} Mar 20 08:49:23.500340 master-0 kubenswrapper[18707]: I0320 08:49:23.500291 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-dv6cd_f53bc282-5937-49ac-ac98-2ee37ccb268d/cluster-baremetal-operator/2.log" Mar 20 08:49:23.500817 master-0 kubenswrapper[18707]: I0320 08:49:23.500770 18707 
scope.go:117] "RemoveContainer" containerID="1a645c4dd6894d87bc5827ce1df71595038461daa2bda7238648a01805dbab0e" Mar 20 08:49:23.501666 master-0 kubenswrapper[18707]: I0320 08:49:23.501608 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-dv6cd_f53bc282-5937-49ac-ac98-2ee37ccb268d/cluster-baremetal-operator/1.log" Mar 20 08:49:23.502447 master-0 kubenswrapper[18707]: E0320 08:49:23.502369 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(2028761b8522f874dcebf13c4683d033)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" Mar 20 08:49:23.502447 master-0 kubenswrapper[18707]: I0320 08:49:23.502413 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" event={"ID":"f53bc282-5937-49ac-ac98-2ee37ccb268d","Type":"ContainerDied","Data":"5b8f6cb814b9e840ab89858d438999874215c853379b386a3140792c0b17b9bf"} Mar 20 08:49:23.502677 master-0 kubenswrapper[18707]: I0320 08:49:23.502492 18707 scope.go:117] "RemoveContainer" containerID="3280012a08ca163e191750f1532fb90a38637168dc635f7e9ff14ee2e4d0ad50" Mar 20 08:49:23.502677 master-0 kubenswrapper[18707]: I0320 08:49:23.502371 18707 generic.go:334] "Generic (PLEG): container finished" podID="f53bc282-5937-49ac-ac98-2ee37ccb268d" containerID="5b8f6cb814b9e840ab89858d438999874215c853379b386a3140792c0b17b9bf" exitCode=1 Mar 20 08:49:23.507245 master-0 kubenswrapper[18707]: I0320 08:49:23.507143 18707 scope.go:117] "RemoveContainer" containerID="5b8f6cb814b9e840ab89858d438999874215c853379b386a3140792c0b17b9bf" Mar 20 08:49:23.508055 master-0 kubenswrapper[18707]: E0320 08:49:23.507974 18707 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-6f69995874-dv6cd_openshift-machine-api(f53bc282-5937-49ac-ac98-2ee37ccb268d)\"" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" podUID="f53bc282-5937-49ac-ac98-2ee37ccb268d" Mar 20 08:49:23.930032 master-0 kubenswrapper[18707]: I0320 08:49:23.929924 18707 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-25cml container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 20 08:49:23.930032 master-0 kubenswrapper[18707]: I0320 08:49:23.930026 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" podUID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 20 08:49:23.930587 master-0 kubenswrapper[18707]: I0320 08:49:23.930058 18707 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-25cml container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 20 08:49:23.930587 master-0 kubenswrapper[18707]: I0320 08:49:23.930213 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" podUID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": 
dial tcp 10.128.0.17:8443: connect: connection refused" Mar 20 08:49:24.520115 master-0 kubenswrapper[18707]: I0320 08:49:24.520004 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-dv6cd_f53bc282-5937-49ac-ac98-2ee37ccb268d/cluster-baremetal-operator/2.log" Mar 20 08:49:26.194552 master-0 kubenswrapper[18707]: E0320 08:49:26.194438 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:49:26.929821 master-0 kubenswrapper[18707]: I0320 08:49:26.929704 18707 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-25cml container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 20 08:49:26.930235 master-0 kubenswrapper[18707]: I0320 08:49:26.929876 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" podUID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 20 08:49:26.930235 master-0 kubenswrapper[18707]: I0320 08:49:26.930008 18707 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-25cml container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 20 08:49:26.930475 master-0 kubenswrapper[18707]: I0320 08:49:26.930282 18707 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" podUID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 20 08:49:29.930038 master-0 kubenswrapper[18707]: I0320 08:49:29.929915 18707 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-25cml container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 20 08:49:29.930038 master-0 kubenswrapper[18707]: I0320 08:49:29.930013 18707 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-25cml container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 20 08:49:29.930038 master-0 kubenswrapper[18707]: I0320 08:49:29.930041 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" podUID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 20 08:49:29.931266 master-0 kubenswrapper[18707]: I0320 08:49:29.930105 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" podUID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 20 08:49:29.931266 master-0 kubenswrapper[18707]: I0320 08:49:29.930179 
18707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:49:29.931266 master-0 kubenswrapper[18707]: I0320 08:49:29.931228 18707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"bef91bdc9369cda692f66d07d6ee3239d77a72194de50fd579f49f34001b0e18"} pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Mar 20 08:49:29.931513 master-0 kubenswrapper[18707]: I0320 08:49:29.931276 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" podUID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerName="openshift-config-operator" containerID="cri-o://bef91bdc9369cda692f66d07d6ee3239d77a72194de50fd579f49f34001b0e18" gracePeriod=30 Mar 20 08:49:29.931513 master-0 kubenswrapper[18707]: I0320 08:49:29.931225 18707 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-25cml container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 20 08:49:29.931513 master-0 kubenswrapper[18707]: I0320 08:49:29.931368 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" podUID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 20 08:49:30.095764 master-0 kubenswrapper[18707]: I0320 08:49:30.095672 18707 scope.go:117] "RemoveContainer" 
containerID="dc430f466db39347eddd50d62cf7d30007ddcbada88ac7de4c13d2e1a985e1ed" Mar 20 08:49:30.585033 master-0 kubenswrapper[18707]: I0320 08:49:30.584958 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-f44gr_96de6024-e20f-4b52-9294-b330d65e4153/snapshot-controller/3.log" Mar 20 08:49:30.585381 master-0 kubenswrapper[18707]: I0320 08:49:30.585266 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-f44gr" event={"ID":"96de6024-e20f-4b52-9294-b330d65e4153","Type":"ContainerStarted","Data":"73af83aa15472ade278dd73b18d1094f64b05fbb308a2798e92f85b2f74637cf"} Mar 20 08:49:30.588680 master-0 kubenswrapper[18707]: I0320 08:49:30.588627 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-25cml_ab175f7e-a5e8-4fda-98c9-6d052a221a83/openshift-config-operator/2.log" Mar 20 08:49:30.589951 master-0 kubenswrapper[18707]: I0320 08:49:30.589884 18707 generic.go:334] "Generic (PLEG): container finished" podID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerID="bef91bdc9369cda692f66d07d6ee3239d77a72194de50fd579f49f34001b0e18" exitCode=255 Mar 20 08:49:30.590054 master-0 kubenswrapper[18707]: I0320 08:49:30.589960 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" event={"ID":"ab175f7e-a5e8-4fda-98c9-6d052a221a83","Type":"ContainerDied","Data":"bef91bdc9369cda692f66d07d6ee3239d77a72194de50fd579f49f34001b0e18"} Mar 20 08:49:30.590054 master-0 kubenswrapper[18707]: I0320 08:49:30.590010 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" event={"ID":"ab175f7e-a5e8-4fda-98c9-6d052a221a83","Type":"ContainerStarted","Data":"3ee5af5173d42ae14fb9507aeb092c68da54fa0f255ec169d7651686bb6ce330"} Mar 20 
08:49:30.590054 master-0 kubenswrapper[18707]: I0320 08:49:30.590040 18707 scope.go:117] "RemoveContainer" containerID="c27171e28b2189b6b8d129f565f467a424f94c7f7677d037a89da262ce118c31" Mar 20 08:49:30.590636 master-0 kubenswrapper[18707]: I0320 08:49:30.590568 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:49:31.467914 master-0 kubenswrapper[18707]: E0320 08:49:31.467830 18707 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 08:49:31.467914 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(1d982a2f86882c36c89f1b9b85c981a4dbc3b452811eecb54160515a52a967d4): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1d982a2f86882c36c89f1b9b85c981a4dbc3b452811eecb54160515a52a967d4" Netns:"/var/run/netns/655152b9-33da-4e04-b0ec-a902d2d6b64b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=1d982a2f86882c36c89f1b9b85c981a4dbc3b452811eecb54160515a52a967d4;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552" Path:"" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:49:31.467914 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:49:31.467914 master-0 kubenswrapper[18707]: > Mar 20 08:49:31.469101 master-0 kubenswrapper[18707]: E0320 08:49:31.467926 18707 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 08:49:31.469101 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(1d982a2f86882c36c89f1b9b85c981a4dbc3b452811eecb54160515a52a967d4): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1d982a2f86882c36c89f1b9b85c981a4dbc3b452811eecb54160515a52a967d4" Netns:"/var/run/netns/655152b9-33da-4e04-b0ec-a902d2d6b64b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=1d982a2f86882c36c89f1b9b85c981a4dbc3b452811eecb54160515a52a967d4;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552" Path:"" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: 
[openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:49:31.469101 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:49:31.469101 master-0 kubenswrapper[18707]: > pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:49:31.469101 master-0 kubenswrapper[18707]: E0320 08:49:31.467950 18707 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 08:49:31.469101 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(1d982a2f86882c36c89f1b9b85c981a4dbc3b452811eecb54160515a52a967d4): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1d982a2f86882c36c89f1b9b85c981a4dbc3b452811eecb54160515a52a967d4" Netns:"/var/run/netns/655152b9-33da-4e04-b0ec-a902d2d6b64b" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=1d982a2f86882c36c89f1b9b85c981a4dbc3b452811eecb54160515a52a967d4;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552" Path:"" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 20 08:49:31.469101 master-0 kubenswrapper[18707]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:49:31.469101 master-0 kubenswrapper[18707]: > pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:49:31.469101 master-0 kubenswrapper[18707]: E0320 08:49:31.468068 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"console-operator-76b6568d85-8b8gv_openshift-console-operator(348f3880-793f-43e4-9de1-8511626d2552)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"console-operator-76b6568d85-8b8gv_openshift-console-operator(348f3880-793f-43e4-9de1-8511626d2552)\\\": rpc error: 
code = Unknown desc = failed to create pod network sandbox k8s_console-operator-76b6568d85-8b8gv_openshift-console-operator_348f3880-793f-43e4-9de1-8511626d2552_0(1d982a2f86882c36c89f1b9b85c981a4dbc3b452811eecb54160515a52a967d4): error adding pod openshift-console-operator_console-operator-76b6568d85-8b8gv to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"1d982a2f86882c36c89f1b9b85c981a4dbc3b452811eecb54160515a52a967d4\\\" Netns:\\\"/var/run/netns/655152b9-33da-4e04-b0ec-a902d2d6b64b\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console-operator;K8S_POD_NAME=console-operator-76b6568d85-8b8gv;K8S_POD_INFRA_CONTAINER_ID=1d982a2f86882c36c89f1b9b85c981a4dbc3b452811eecb54160515a52a967d4;K8S_POD_UID=348f3880-793f-43e4-9de1-8511626d2552\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-console-operator/console-operator-76b6568d85-8b8gv] networking: Multus: [openshift-console-operator/console-operator-76b6568d85-8b8gv/348f3880-793f-43e4-9de1-8511626d2552]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: SetNetworkStatus: failed to update the pod console-operator-76b6568d85-8b8gv in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-76b6568d85-8b8gv?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" podUID="348f3880-793f-43e4-9de1-8511626d2552" Mar 20 08:49:31.600513 master-0 kubenswrapper[18707]: I0320 08:49:31.600423 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-95bf4f4d-25cml_ab175f7e-a5e8-4fda-98c9-6d052a221a83/openshift-config-operator/2.log" Mar 20 08:49:31.601005 master-0 kubenswrapper[18707]: I0320 08:49:31.600967 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:49:31.601694 master-0 kubenswrapper[18707]: I0320 08:49:31.601636 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:49:35.930288 master-0 kubenswrapper[18707]: I0320 08:49:35.930163 18707 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-25cml container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 20 08:49:35.930861 master-0 kubenswrapper[18707]: I0320 08:49:35.930304 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" podUID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 20 08:49:35.930861 master-0 kubenswrapper[18707]: I0320 08:49:35.930163 18707 patch_prober.go:28] interesting pod/openshift-config-operator-95bf4f4d-25cml container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" start-of-body= Mar 20 08:49:35.930861 master-0 kubenswrapper[18707]: I0320 08:49:35.930445 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" podUID="ab175f7e-a5e8-4fda-98c9-6d052a221a83" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.17:8443/healthz\": dial tcp 10.128.0.17:8443: connect: connection refused" Mar 20 08:49:36.195545 master-0 kubenswrapper[18707]: E0320 08:49:36.195394 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request 
canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:49:36.990581 master-0 kubenswrapper[18707]: I0320 08:49:36.990522 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4"] Mar 20 08:49:36.993928 master-0 kubenswrapper[18707]: I0320 08:49:36.993897 18707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:49:37.006628 master-0 kubenswrapper[18707]: I0320 08:49:37.004691 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-8b8gv"] Mar 20 08:49:37.006628 master-0 kubenswrapper[18707]: I0320 08:49:37.004917 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=296.42427027 podStartE2EDuration="5m0.004884091s" podCreationTimestamp="2026-03-20 08:44:37 +0000 UTC" firstStartedPulling="2026-03-20 08:44:39.999371647 +0000 UTC m=+225.155552003" lastFinishedPulling="2026-03-20 08:44:43.579985468 +0000 UTC m=+228.736165824" observedRunningTime="2026-03-20 08:49:36.998774337 +0000 UTC m=+522.154954703" watchObservedRunningTime="2026-03-20 08:49:37.004884091 +0000 UTC m=+522.161064447" Mar 20 08:49:37.096873 master-0 kubenswrapper[18707]: I0320 08:49:37.096814 18707 scope.go:117] "RemoveContainer" containerID="5b8f6cb814b9e840ab89858d438999874215c853379b386a3140792c0b17b9bf" Mar 20 08:49:37.097208 master-0 kubenswrapper[18707]: E0320 08:49:37.097165 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-6f69995874-dv6cd_openshift-machine-api(f53bc282-5937-49ac-ac98-2ee37ccb268d)\"" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" podUID="f53bc282-5937-49ac-ac98-2ee37ccb268d" 
Mar 20 08:49:37.295664 master-0 kubenswrapper[18707]: I0320 08:49:37.295569 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=303.722147144 podStartE2EDuration="5m7.295537535s" podCreationTimestamp="2026-03-20 08:44:30 +0000 UTC" firstStartedPulling="2026-03-20 08:44:39.996302467 +0000 UTC m=+225.152482823" lastFinishedPulling="2026-03-20 08:44:43.569692858 +0000 UTC m=+228.725873214" observedRunningTime="2026-03-20 08:49:37.257493926 +0000 UTC m=+522.413674282" watchObservedRunningTime="2026-03-20 08:49:37.295537535 +0000 UTC m=+522.451717891" Mar 20 08:49:37.692069 master-0 kubenswrapper[18707]: I0320 08:49:37.683413 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" event={"ID":"044870dd-540a-402e-84cb-fa1bf3d6a318","Type":"ContainerStarted","Data":"a77af235cb34d1ac638f724071083fb4fb5f664fdbbd39da5040782b54c192c1"} Mar 20 08:49:37.705177 master-0 kubenswrapper[18707]: I0320 08:49:37.698406 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" event={"ID":"348f3880-793f-43e4-9de1-8511626d2552","Type":"ContainerStarted","Data":"b55791e87c825dc4c8670b0c9e62b616228d403001bf6bcfd5609d5f82b3e692"} Mar 20 08:49:37.886499 master-0 kubenswrapper[18707]: I0320 08:49:37.886441 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-64c67d44c4-s7vfs"] Mar 20 08:49:37.900346 master-0 kubenswrapper[18707]: I0320 08:49:37.900280 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/metrics-server-64c67d44c4-s7vfs"] Mar 20 08:49:38.102252 master-0 kubenswrapper[18707]: I0320 08:49:38.093877 18707 scope.go:117] "RemoveContainer" containerID="1a645c4dd6894d87bc5827ce1df71595038461daa2bda7238648a01805dbab0e" Mar 20 08:49:38.721414 master-0 kubenswrapper[18707]: I0320 
08:49:38.720914 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/cluster-policy-controller/3.log" Mar 20 08:49:38.734968 master-0 kubenswrapper[18707]: I0320 08:49:38.734907 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log" Mar 20 08:49:38.746170 master-0 kubenswrapper[18707]: I0320 08:49:38.741789 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager-cert-syncer/0.log" Mar 20 08:49:38.746170 master-0 kubenswrapper[18707]: I0320 08:49:38.741875 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2028761b8522f874dcebf13c4683d033","Type":"ContainerStarted","Data":"26d1f23f09ec46d0564e314771aedd57d50a0394449491bf05654764fec7468d"} Mar 20 08:49:38.946345 master-0 kubenswrapper[18707]: I0320 08:49:38.945760 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-25cml" Mar 20 08:49:39.114426 master-0 kubenswrapper[18707]: I0320 08:49:39.113921 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a69e8d3a-a0b1-4688-8631-d9f265aa4c69" path="/var/lib/kubelet/pods/a69e8d3a-a0b1-4688-8631-d9f265aa4c69/volumes" Mar 20 08:49:40.758892 master-0 kubenswrapper[18707]: I0320 08:49:40.758824 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" event={"ID":"044870dd-540a-402e-84cb-fa1bf3d6a318","Type":"ContainerStarted","Data":"6f33cc0b5bf3cae4af5bc36ef0a344c17e4b86f690c860c79561b0c3bc3a15e9"} Mar 20 08:49:40.782230 
master-0 kubenswrapper[18707]: I0320 08:49:40.782128 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-7c6b76c555-zbpk4" podStartSLOduration=319.872599368 podStartE2EDuration="5m22.782106606s" podCreationTimestamp="2026-03-20 08:44:18 +0000 UTC" firstStartedPulling="2026-03-20 08:49:36.993841935 +0000 UTC m=+522.150022291" lastFinishedPulling="2026-03-20 08:49:39.903349173 +0000 UTC m=+525.059529529" observedRunningTime="2026-03-20 08:49:40.777120193 +0000 UTC m=+525.933300559" watchObservedRunningTime="2026-03-20 08:49:40.782106606 +0000 UTC m=+525.938286962" Mar 20 08:49:41.771908 master-0 kubenswrapper[18707]: I0320 08:49:41.771789 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" event={"ID":"348f3880-793f-43e4-9de1-8511626d2552","Type":"ContainerStarted","Data":"1025e454a24b30f456bc0f663fe1f3ab0a3b381ce3e75e062d5bf19f49e5252b"} Mar 20 08:49:41.772847 master-0 kubenswrapper[18707]: I0320 08:49:41.772773 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:49:41.781669 master-0 kubenswrapper[18707]: I0320 08:49:41.781605 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" Mar 20 08:49:41.828712 master-0 kubenswrapper[18707]: I0320 08:49:41.827714 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-76b6568d85-8b8gv" podStartSLOduration=502.774939383 podStartE2EDuration="8m26.827679647s" podCreationTimestamp="2026-03-20 08:41:15 +0000 UTC" firstStartedPulling="2026-03-20 08:49:37.033390064 +0000 UTC m=+522.189570420" lastFinishedPulling="2026-03-20 08:49:41.086130328 +0000 UTC m=+526.242310684" observedRunningTime="2026-03-20 08:49:41.803373876 +0000 UTC 
m=+526.959554262" watchObservedRunningTime="2026-03-20 08:49:41.827679647 +0000 UTC m=+526.983860033" Mar 20 08:49:45.257377 master-0 kubenswrapper[18707]: I0320 08:49:45.257261 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:49:45.532559 master-0 kubenswrapper[18707]: I0320 08:49:45.532385 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:49:45.537360 master-0 kubenswrapper[18707]: I0320 08:49:45.537297 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:49:48.094921 master-0 kubenswrapper[18707]: I0320 08:49:48.094870 18707 scope.go:117] "RemoveContainer" containerID="5b8f6cb814b9e840ab89858d438999874215c853379b386a3140792c0b17b9bf" Mar 20 08:49:48.828668 master-0 kubenswrapper[18707]: I0320 08:49:48.828595 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-dv6cd_f53bc282-5937-49ac-ac98-2ee37ccb268d/cluster-baremetal-operator/2.log" Mar 20 08:49:48.829244 master-0 kubenswrapper[18707]: I0320 08:49:48.829172 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-dv6cd" event={"ID":"f53bc282-5937-49ac-ac98-2ee37ccb268d","Type":"ContainerStarted","Data":"35c40c03b42708a0ad40c10b9e1386a3c24c5ef15edd9ec4b83c03e6e00b46ec"} Mar 20 08:49:51.983423 master-0 kubenswrapper[18707]: I0320 08:49:51.983360 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-2pg77" Mar 20 08:49:55.262414 master-0 kubenswrapper[18707]: I0320 08:49:55.262336 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:50:24.667222 master-0 kubenswrapper[18707]: I0320 08:50:24.667108 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 20 08:50:24.668093 master-0 kubenswrapper[18707]: I0320 08:50:24.667680 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="kube-rbac-proxy-metric" containerID="cri-o://ef595b86616a9d59758e3bb52cb2c40900df30c7a5c7962dd70066b8351abfcc" gracePeriod=120 Mar 20 08:50:24.668093 master-0 kubenswrapper[18707]: I0320 08:50:24.667849 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="prom-label-proxy" containerID="cri-o://9109a59c4f2c870fa24f5897d11d6ce0d82c7037a7c29d3008301ef58dbfb30e" gracePeriod=120 Mar 20 08:50:24.668093 master-0 kubenswrapper[18707]: I0320 08:50:24.667841 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="kube-rbac-proxy" containerID="cri-o://a1bf05287a33269822d24c09f055433696b780b4f7f403c232dda246f6cacc28" gracePeriod=120 Mar 20 08:50:24.668093 master-0 kubenswrapper[18707]: I0320 08:50:24.667865 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="alertmanager" containerID="cri-o://cbc84cc16fdf36c19a2ec8ddb8d8c567dd3a2aefe284ff96c39ef8fc1be8eb8d" gracePeriod=120 Mar 20 08:50:24.668093 master-0 kubenswrapper[18707]: I0320 08:50:24.667842 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="config-reloader" containerID="cri-o://1e47e53c73caf9294d7cc552d435de2bba3335889632bb5e47c9437fae1ad38a" gracePeriod=120 Mar 20 08:50:24.669951 master-0 kubenswrapper[18707]: I0320 08:50:24.669377 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="kube-rbac-proxy-web" containerID="cri-o://bf080bd2aa0228ff67560c7f40f2f04126f362d98ad14ce5d016639c7975d354" gracePeriod=120 Mar 20 08:50:25.192408 master-0 kubenswrapper[18707]: I0320 08:50:25.192276 18707 generic.go:334] "Generic (PLEG): container finished" podID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerID="9109a59c4f2c870fa24f5897d11d6ce0d82c7037a7c29d3008301ef58dbfb30e" exitCode=0 Mar 20 08:50:25.192408 master-0 kubenswrapper[18707]: I0320 08:50:25.192326 18707 generic.go:334] "Generic (PLEG): container finished" podID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerID="ef595b86616a9d59758e3bb52cb2c40900df30c7a5c7962dd70066b8351abfcc" exitCode=0 Mar 20 08:50:25.192408 master-0 kubenswrapper[18707]: I0320 08:50:25.192343 18707 generic.go:334] "Generic (PLEG): container finished" podID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerID="a1bf05287a33269822d24c09f055433696b780b4f7f403c232dda246f6cacc28" exitCode=0 Mar 20 08:50:25.192408 master-0 kubenswrapper[18707]: I0320 08:50:25.192330 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerDied","Data":"9109a59c4f2c870fa24f5897d11d6ce0d82c7037a7c29d3008301ef58dbfb30e"} Mar 20 08:50:25.192408 master-0 kubenswrapper[18707]: I0320 08:50:25.192376 18707 generic.go:334] "Generic (PLEG): container finished" podID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerID="bf080bd2aa0228ff67560c7f40f2f04126f362d98ad14ce5d016639c7975d354" exitCode=0 Mar 20 08:50:25.192408 
master-0 kubenswrapper[18707]: I0320 08:50:25.192385 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerDied","Data":"ef595b86616a9d59758e3bb52cb2c40900df30c7a5c7962dd70066b8351abfcc"} Mar 20 08:50:25.192408 master-0 kubenswrapper[18707]: I0320 08:50:25.192404 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerDied","Data":"a1bf05287a33269822d24c09f055433696b780b4f7f403c232dda246f6cacc28"} Mar 20 08:50:25.192771 master-0 kubenswrapper[18707]: I0320 08:50:25.192421 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerDied","Data":"bf080bd2aa0228ff67560c7f40f2f04126f362d98ad14ce5d016639c7975d354"} Mar 20 08:50:25.192771 master-0 kubenswrapper[18707]: I0320 08:50:25.192439 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerDied","Data":"1e47e53c73caf9294d7cc552d435de2bba3335889632bb5e47c9437fae1ad38a"} Mar 20 08:50:25.192771 master-0 kubenswrapper[18707]: I0320 08:50:25.192389 18707 generic.go:334] "Generic (PLEG): container finished" podID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerID="1e47e53c73caf9294d7cc552d435de2bba3335889632bb5e47c9437fae1ad38a" exitCode=0 Mar 20 08:50:25.192771 master-0 kubenswrapper[18707]: I0320 08:50:25.192463 18707 generic.go:334] "Generic (PLEG): container finished" podID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerID="cbc84cc16fdf36c19a2ec8ddb8d8c567dd3a2aefe284ff96c39ef8fc1be8eb8d" exitCode=0 Mar 20 08:50:25.192771 master-0 kubenswrapper[18707]: I0320 08:50:25.192485 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerDied","Data":"cbc84cc16fdf36c19a2ec8ddb8d8c567dd3a2aefe284ff96c39ef8fc1be8eb8d"} Mar 20 08:50:25.192771 master-0 kubenswrapper[18707]: I0320 08:50:25.192499 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9","Type":"ContainerDied","Data":"a41dc620a626e76b0ea35ff7c7456036843137618f3c1bc68202d11850160449"} Mar 20 08:50:25.192771 master-0 kubenswrapper[18707]: I0320 08:50:25.192512 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a41dc620a626e76b0ea35ff7c7456036843137618f3c1bc68202d11850160449" Mar 20 08:50:25.196747 master-0 kubenswrapper[18707]: I0320 08:50:25.196729 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:50:25.335401 master-0 kubenswrapper[18707]: I0320 08:50:25.335327 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-metrics-client-ca\") pod \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " Mar 20 08:50:25.335617 master-0 kubenswrapper[18707]: I0320 08:50:25.335452 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-tls-assets\") pod \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " Mar 20 08:50:25.335617 master-0 kubenswrapper[18707]: I0320 08:50:25.335505 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvshj\" (UniqueName: \"kubernetes.io/projected/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-kube-api-access-gvshj\") pod 
\"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " Mar 20 08:50:25.335617 master-0 kubenswrapper[18707]: I0320 08:50:25.335547 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy\") pod \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " Mar 20 08:50:25.335617 master-0 kubenswrapper[18707]: I0320 08:50:25.335604 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls\") pod \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " Mar 20 08:50:25.335814 master-0 kubenswrapper[18707]: I0320 08:50:25.335669 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-config-out\") pod \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " Mar 20 08:50:25.335814 master-0 kubenswrapper[18707]: I0320 08:50:25.335704 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-alertmanager-main-db\") pod \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " Mar 20 08:50:25.335814 master-0 kubenswrapper[18707]: I0320 08:50:25.335746 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-alertmanager-trusted-ca-bundle\") pod \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\" (UID: 
\"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " Mar 20 08:50:25.335814 master-0 kubenswrapper[18707]: I0320 08:50:25.335787 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-web-config\") pod \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " Mar 20 08:50:25.335970 master-0 kubenswrapper[18707]: I0320 08:50:25.335814 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy-web\") pod \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " Mar 20 08:50:25.335970 master-0 kubenswrapper[18707]: I0320 08:50:25.335840 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " Mar 20 08:50:25.335970 master-0 kubenswrapper[18707]: I0320 08:50:25.335867 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-config-volume\") pod \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\" (UID: \"14d29bfa-a0cf-43bd-a3b8-052c1a224fc9\") " Mar 20 08:50:25.336313 master-0 kubenswrapper[18707]: I0320 08:50:25.335863 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9"). 
InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:50:25.336573 master-0 kubenswrapper[18707]: I0320 08:50:25.336545 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:50:25.337083 master-0 kubenswrapper[18707]: I0320 08:50:25.336870 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:50:25.338862 master-0 kubenswrapper[18707]: I0320 08:50:25.338822 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-config-volume" (OuterVolumeSpecName: "config-volume") pod "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:25.338976 master-0 kubenswrapper[18707]: I0320 08:50:25.338952 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:25.339466 master-0 kubenswrapper[18707]: I0320 08:50:25.339421 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-config-out" (OuterVolumeSpecName: "config-out") pod "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:50:25.339466 master-0 kubenswrapper[18707]: I0320 08:50:25.339416 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:25.339890 master-0 kubenswrapper[18707]: I0320 08:50:25.339851 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:25.340441 master-0 kubenswrapper[18707]: I0320 08:50:25.340406 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:25.340879 master-0 kubenswrapper[18707]: I0320 08:50:25.340823 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:50:25.342457 master-0 kubenswrapper[18707]: I0320 08:50:25.342009 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-kube-api-access-gvshj" (OuterVolumeSpecName: "kube-api-access-gvshj") pod "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9"). InnerVolumeSpecName "kube-api-access-gvshj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:50:25.391716 master-0 kubenswrapper[18707]: I0320 08:50:25.391629 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-web-config" (OuterVolumeSpecName: "web-config") pod "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" (UID: "14d29bfa-a0cf-43bd-a3b8-052c1a224fc9"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:25.438394 master-0 kubenswrapper[18707]: I0320 08:50:25.438344 18707 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:25.438394 master-0 kubenswrapper[18707]: I0320 08:50:25.438392 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvshj\" (UniqueName: \"kubernetes.io/projected/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-kube-api-access-gvshj\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:25.438573 master-0 kubenswrapper[18707]: I0320 08:50:25.438411 18707 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:25.438573 master-0 kubenswrapper[18707]: I0320 08:50:25.438423 18707 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-main-tls\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:25.438573 master-0 kubenswrapper[18707]: I0320 08:50:25.438434 18707 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-config-out\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:25.438573 master-0 kubenswrapper[18707]: I0320 08:50:25.438444 18707 reconciler_common.go:293] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-alertmanager-main-db\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:25.438573 master-0 kubenswrapper[18707]: I0320 08:50:25.438454 18707 reconciler_common.go:293] "Volume detached for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-alertmanager-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:25.438573 master-0 kubenswrapper[18707]: I0320 08:50:25.438464 18707 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-web-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:25.438573 master-0 kubenswrapper[18707]: I0320 08:50:25.438477 18707 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:25.438573 master-0 kubenswrapper[18707]: I0320 08:50:25.438492 18707 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-secret-alertmanager-kube-rbac-proxy-metric\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:25.438573 master-0 kubenswrapper[18707]: I0320 08:50:25.438502 18707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-config-volume\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:25.438573 master-0 kubenswrapper[18707]: I0320 08:50:25.438512 18707 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:26.201427 master-0 kubenswrapper[18707]: I0320 08:50:26.201289 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:50:26.264882 master-0 kubenswrapper[18707]: I0320 08:50:26.264787 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 20 08:50:26.275743 master-0 kubenswrapper[18707]: I0320 08:50:26.275641 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 20 08:50:27.110093 master-0 kubenswrapper[18707]: I0320 08:50:27.110014 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" path="/var/lib/kubelet/pods/14d29bfa-a0cf-43bd-a3b8-052c1a224fc9/volumes" Mar 20 08:50:29.081517 master-0 kubenswrapper[18707]: I0320 08:50:29.081412 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 20 08:50:29.082232 master-0 kubenswrapper[18707]: I0320 08:50:29.081866 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="prometheus" containerID="cri-o://c19e6fb31c3a39d696985e46f9a3295932d6dbe5fa5c9082521702bd7df5a351" gracePeriod=600 Mar 20 08:50:29.083999 master-0 kubenswrapper[18707]: I0320 08:50:29.082348 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="kube-rbac-proxy" containerID="cri-o://7c9e4dd3063d39c7cb9b18d7a354e3435c2d2d5ec1986c53b7b85c908a8d1153" gracePeriod=600 Mar 20 08:50:29.083999 master-0 kubenswrapper[18707]: I0320 08:50:29.082471 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="kube-rbac-proxy-thanos" containerID="cri-o://27cbcc417567b245b2900410177a926ad00e08c6422a5beeefb377e30ef77b61" 
gracePeriod=600 Mar 20 08:50:29.083999 master-0 kubenswrapper[18707]: I0320 08:50:29.082447 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="kube-rbac-proxy-web" containerID="cri-o://44284a8b77568b923b8f2d7a551bb6b8408c54a850ac6d39be44e107e4b4a043" gracePeriod=600 Mar 20 08:50:29.083999 master-0 kubenswrapper[18707]: I0320 08:50:29.082455 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="thanos-sidecar" containerID="cri-o://87d8402979c175f019cec0c6457191a23d5e33345827dd54236af9fd1ae9b380" gracePeriod=600 Mar 20 08:50:29.083999 master-0 kubenswrapper[18707]: I0320 08:50:29.082712 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="config-reloader" containerID="cri-o://f242c0b5af48bb6305b45e0ed0a80c9b6a7707b2746522996f3c16ec946731bb" gracePeriod=600 Mar 20 08:50:29.235794 master-0 kubenswrapper[18707]: I0320 08:50:29.235738 18707 generic.go:334] "Generic (PLEG): container finished" podID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerID="27cbcc417567b245b2900410177a926ad00e08c6422a5beeefb377e30ef77b61" exitCode=0 Mar 20 08:50:29.235794 master-0 kubenswrapper[18707]: I0320 08:50:29.235781 18707 generic.go:334] "Generic (PLEG): container finished" podID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerID="7c9e4dd3063d39c7cb9b18d7a354e3435c2d2d5ec1986c53b7b85c908a8d1153" exitCode=0 Mar 20 08:50:29.235794 master-0 kubenswrapper[18707]: I0320 08:50:29.235792 18707 generic.go:334] "Generic (PLEG): container finished" podID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerID="44284a8b77568b923b8f2d7a551bb6b8408c54a850ac6d39be44e107e4b4a043" exitCode=0 Mar 20 08:50:29.235976 master-0 
kubenswrapper[18707]: I0320 08:50:29.235802 18707 generic.go:334] "Generic (PLEG): container finished" podID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerID="87d8402979c175f019cec0c6457191a23d5e33345827dd54236af9fd1ae9b380" exitCode=0 Mar 20 08:50:29.235976 master-0 kubenswrapper[18707]: I0320 08:50:29.235812 18707 generic.go:334] "Generic (PLEG): container finished" podID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerID="f242c0b5af48bb6305b45e0ed0a80c9b6a7707b2746522996f3c16ec946731bb" exitCode=0 Mar 20 08:50:29.235976 master-0 kubenswrapper[18707]: I0320 08:50:29.235822 18707 generic.go:334] "Generic (PLEG): container finished" podID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerID="c19e6fb31c3a39d696985e46f9a3295932d6dbe5fa5c9082521702bd7df5a351" exitCode=0 Mar 20 08:50:29.235976 master-0 kubenswrapper[18707]: I0320 08:50:29.235818 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerDied","Data":"27cbcc417567b245b2900410177a926ad00e08c6422a5beeefb377e30ef77b61"} Mar 20 08:50:29.235976 master-0 kubenswrapper[18707]: I0320 08:50:29.235882 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerDied","Data":"7c9e4dd3063d39c7cb9b18d7a354e3435c2d2d5ec1986c53b7b85c908a8d1153"} Mar 20 08:50:29.235976 master-0 kubenswrapper[18707]: I0320 08:50:29.235896 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerDied","Data":"44284a8b77568b923b8f2d7a551bb6b8408c54a850ac6d39be44e107e4b4a043"} Mar 20 08:50:29.235976 master-0 kubenswrapper[18707]: I0320 08:50:29.235908 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerDied","Data":"87d8402979c175f019cec0c6457191a23d5e33345827dd54236af9fd1ae9b380"} Mar 20 08:50:29.235976 master-0 kubenswrapper[18707]: I0320 08:50:29.235920 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerDied","Data":"f242c0b5af48bb6305b45e0ed0a80c9b6a7707b2746522996f3c16ec946731bb"} Mar 20 08:50:29.235976 master-0 kubenswrapper[18707]: I0320 08:50:29.235931 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerDied","Data":"c19e6fb31c3a39d696985e46f9a3295932d6dbe5fa5c9082521702bd7df5a351"} Mar 20 08:50:29.592519 master-0 kubenswrapper[18707]: I0320 08:50:29.592467 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:50:29.722709 master-0 kubenswrapper[18707]: I0320 08:50:29.721970 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.722709 master-0 kubenswrapper[18707]: I0320 08:50:29.722719 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-metrics-client-certs\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723154 master-0 kubenswrapper[18707]: I0320 08:50:29.722802 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsh7c\" 
(UniqueName: \"kubernetes.io/projected/77ddbb16-b96e-4717-9786-2feae0d0cc3f-kube-api-access-bsh7c\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723154 master-0 kubenswrapper[18707]: I0320 08:50:29.722887 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723154 master-0 kubenswrapper[18707]: I0320 08:50:29.722987 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-config\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723154 master-0 kubenswrapper[18707]: I0320 08:50:29.723040 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77ddbb16-b96e-4717-9786-2feae0d0cc3f-tls-assets\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723154 master-0 kubenswrapper[18707]: I0320 08:50:29.723127 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-trusted-ca-bundle\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723398 master-0 kubenswrapper[18707]: I0320 08:50:29.723179 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-k8s-db\") 
pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723398 master-0 kubenswrapper[18707]: I0320 08:50:29.723238 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77ddbb16-b96e-4717-9786-2feae0d0cc3f-config-out\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723398 master-0 kubenswrapper[18707]: I0320 08:50:29.723296 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-serving-certs-ca-bundle\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723398 master-0 kubenswrapper[18707]: I0320 08:50:29.723340 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-k8s-rulefiles-0\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723569 master-0 kubenswrapper[18707]: I0320 08:50:29.723453 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-kube-rbac-proxy\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723569 master-0 kubenswrapper[18707]: I0320 08:50:29.723514 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-metrics-client-ca\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: 
\"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723655 master-0 kubenswrapper[18707]: I0320 08:50:29.723577 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-thanos-prometheus-http-client-file\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723655 master-0 kubenswrapper[18707]: I0320 08:50:29.723617 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-kubelet-serving-ca-bundle\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723743 master-0 kubenswrapper[18707]: I0320 08:50:29.723650 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-tls\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723743 master-0 kubenswrapper[18707]: I0320 08:50:29.723722 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-web-config\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.723838 master-0 kubenswrapper[18707]: I0320 08:50:29.723786 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-grpc-tls\") pod \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\" (UID: \"77ddbb16-b96e-4717-9786-2feae0d0cc3f\") " Mar 20 08:50:29.726909 
master-0 kubenswrapper[18707]: I0320 08:50:29.726834 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:50:29.727425 master-0 kubenswrapper[18707]: I0320 08:50:29.727374 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:50:29.728561 master-0 kubenswrapper[18707]: I0320 08:50:29.728493 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:50:29.728809 master-0 kubenswrapper[18707]: I0320 08:50:29.728779 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77ddbb16-b96e-4717-9786-2feae0d0cc3f-config-out" (OuterVolumeSpecName: "config-out") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:50:29.730125 master-0 kubenswrapper[18707]: I0320 08:50:29.729944 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:50:29.731645 master-0 kubenswrapper[18707]: I0320 08:50:29.731606 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:29.731729 master-0 kubenswrapper[18707]: I0320 08:50:29.731649 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-config" (OuterVolumeSpecName: "config") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:29.732166 master-0 kubenswrapper[18707]: I0320 08:50:29.732116 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:50:29.732927 master-0 kubenswrapper[18707]: I0320 08:50:29.732879 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:29.733143 master-0 kubenswrapper[18707]: I0320 08:50:29.733101 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:29.734032 master-0 kubenswrapper[18707]: I0320 08:50:29.733984 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ddbb16-b96e-4717-9786-2feae0d0cc3f-kube-api-access-bsh7c" (OuterVolumeSpecName: "kube-api-access-bsh7c") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "kube-api-access-bsh7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:50:29.734584 master-0 kubenswrapper[18707]: I0320 08:50:29.734563 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:50:29.734783 master-0 kubenswrapper[18707]: I0320 08:50:29.734711 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ddbb16-b96e-4717-9786-2feae0d0cc3f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:50:29.735118 master-0 kubenswrapper[18707]: I0320 08:50:29.735069 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:29.735212 master-0 kubenswrapper[18707]: I0320 08:50:29.735152 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:29.735269 master-0 kubenswrapper[18707]: I0320 08:50:29.735226 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:29.744108 master-0 kubenswrapper[18707]: I0320 08:50:29.742948 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:29.785329 master-0 kubenswrapper[18707]: I0320 08:50:29.784400 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-web-config" (OuterVolumeSpecName: "web-config") pod "77ddbb16-b96e-4717-9786-2feae0d0cc3f" (UID: "77ddbb16-b96e-4717-9786-2feae0d0cc3f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:29.825941 master-0 kubenswrapper[18707]: I0320 08:50:29.825862 18707 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77ddbb16-b96e-4717-9786-2feae0d0cc3f-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.825941 master-0 kubenswrapper[18707]: I0320 08:50:29.825914 18707 reconciler_common.go:293] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.825941 master-0 kubenswrapper[18707]: I0320 08:50:29.825929 18707 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-k8s-db\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.825941 master-0 kubenswrapper[18707]: I0320 08:50:29.825941 18707 reconciler_common.go:293] "Volume detached for 
volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77ddbb16-b96e-4717-9786-2feae0d0cc3f-config-out\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.825941 master-0 kubenswrapper[18707]: I0320 08:50:29.825951 18707 reconciler_common.go:293] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-serving-certs-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.825941 master-0 kubenswrapper[18707]: I0320 08:50:29.825962 18707 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-prometheus-k8s-rulefiles-0\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.825941 master-0 kubenswrapper[18707]: I0320 08:50:29.825974 18707 reconciler_common.go:293] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.826415 master-0 kubenswrapper[18707]: I0320 08:50:29.825985 18707 reconciler_common.go:293] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.826415 master-0 kubenswrapper[18707]: I0320 08:50:29.825999 18707 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-thanos-prometheus-http-client-file\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.826415 master-0 kubenswrapper[18707]: I0320 08:50:29.826008 18707 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/77ddbb16-b96e-4717-9786-2feae0d0cc3f-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.826415 master-0 kubenswrapper[18707]: I0320 08:50:29.826019 18707 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-tls\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.826415 master-0 kubenswrapper[18707]: I0320 08:50:29.826028 18707 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-web-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.826415 master-0 kubenswrapper[18707]: I0320 08:50:29.826037 18707 reconciler_common.go:293] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-grpc-tls\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.826415 master-0 kubenswrapper[18707]: I0320 08:50:29.826046 18707 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.826415 master-0 kubenswrapper[18707]: I0320 08:50:29.826059 18707 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.826415 master-0 kubenswrapper[18707]: I0320 08:50:29.826071 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsh7c\" (UniqueName: \"kubernetes.io/projected/77ddbb16-b96e-4717-9786-2feae0d0cc3f-kube-api-access-bsh7c\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.826415 master-0 
kubenswrapper[18707]: I0320 08:50:29.826083 18707 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:29.826415 master-0 kubenswrapper[18707]: I0320 08:50:29.826094 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/77ddbb16-b96e-4717-9786-2feae0d0cc3f-config\") on node \"master-0\" DevicePath \"\"" Mar 20 08:50:30.095653 master-0 kubenswrapper[18707]: I0320 08:50:30.095446 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:50:30.095653 master-0 kubenswrapper[18707]: I0320 08:50:30.095524 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:50:30.117339 master-0 kubenswrapper[18707]: I0320 08:50:30.117250 18707 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0" Mar 20 08:50:30.120618 master-0 kubenswrapper[18707]: I0320 08:50:30.120550 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 20 08:50:30.126713 master-0 kubenswrapper[18707]: I0320 08:50:30.126646 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 20 08:50:30.179267 master-0 kubenswrapper[18707]: I0320 08:50:30.170679 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 20 08:50:30.250151 master-0 kubenswrapper[18707]: I0320 08:50:30.250075 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:50:30.250151 master-0 kubenswrapper[18707]: I0320 08:50:30.250118 18707 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-etcd/etcd-master-0" podUID="8af5c0f7-cbc5-49cc-b254-1944d0a3b833" Mar 20 08:50:30.250513 master-0 kubenswrapper[18707]: I0320 08:50:30.250370 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77ddbb16-b96e-4717-9786-2feae0d0cc3f","Type":"ContainerDied","Data":"1f3f6e5cd3c4ec0aac0308f00b6155dfd226f4c87924acc6f28fc32640f5a3c1"} Mar 20 08:50:30.250513 master-0 kubenswrapper[18707]: I0320 08:50:30.250421 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:50:30.250513 master-0 kubenswrapper[18707]: I0320 08:50:30.250472 18707 scope.go:117] "RemoveContainer" containerID="27cbcc417567b245b2900410177a926ad00e08c6422a5beeefb377e30ef77b61" Mar 20 08:50:30.269109 master-0 kubenswrapper[18707]: I0320 08:50:30.269066 18707 scope.go:117] "RemoveContainer" containerID="7c9e4dd3063d39c7cb9b18d7a354e3435c2d2d5ec1986c53b7b85c908a8d1153" Mar 20 08:50:30.290222 master-0 kubenswrapper[18707]: I0320 08:50:30.288073 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=0.28804562 podStartE2EDuration="288.04562ms" podCreationTimestamp="2026-03-20 08:50:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:50:30.282923033 +0000 UTC m=+575.439103389" watchObservedRunningTime="2026-03-20 08:50:30.28804562 +0000 UTC m=+575.444225986" Mar 20 08:50:30.315806 master-0 kubenswrapper[18707]: I0320 08:50:30.315742 18707 scope.go:117] "RemoveContainer" containerID="44284a8b77568b923b8f2d7a551bb6b8408c54a850ac6d39be44e107e4b4a043" Mar 20 08:50:30.325172 master-0 kubenswrapper[18707]: I0320 08:50:30.325127 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 20 08:50:30.334664 master-0 kubenswrapper[18707]: I0320 
08:50:30.334597 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 20 08:50:30.341562 master-0 kubenswrapper[18707]: I0320 08:50:30.341513 18707 scope.go:117] "RemoveContainer" containerID="87d8402979c175f019cec0c6457191a23d5e33345827dd54236af9fd1ae9b380" Mar 20 08:50:30.363423 master-0 kubenswrapper[18707]: I0320 08:50:30.363212 18707 scope.go:117] "RemoveContainer" containerID="f242c0b5af48bb6305b45e0ed0a80c9b6a7707b2746522996f3c16ec946731bb" Mar 20 08:50:30.386617 master-0 kubenswrapper[18707]: I0320 08:50:30.386223 18707 scope.go:117] "RemoveContainer" containerID="c19e6fb31c3a39d696985e46f9a3295932d6dbe5fa5c9082521702bd7df5a351" Mar 20 08:50:30.406370 master-0 kubenswrapper[18707]: I0320 08:50:30.406297 18707 scope.go:117] "RemoveContainer" containerID="784d0ad28f7b67339145bc04f7762b9f24ab7f8ba996e7b718878c32534ac025" Mar 20 08:50:31.106275 master-0 kubenswrapper[18707]: I0320 08:50:31.106200 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" path="/var/lib/kubelet/pods/77ddbb16-b96e-4717-9786-2feae0d0cc3f/volumes" Mar 20 08:50:33.563284 master-0 kubenswrapper[18707]: I0320 08:50:33.563154 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.563645 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="thanos-sidecar" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.563665 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="thanos-sidecar" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.563684 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="kube-rbac-proxy" Mar 20 
08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.563694 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="kube-rbac-proxy" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.563717 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="init-config-reloader" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.563728 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="init-config-reloader" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.563743 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="kube-rbac-proxy-web" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.563752 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="kube-rbac-proxy-web" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.563767 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="config-reloader" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.563775 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="config-reloader" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.563794 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="kube-rbac-proxy-thanos" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.563803 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="kube-rbac-proxy-thanos" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 
08:50:33.563839 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eda9567-712b-4541-9344-a333e7734fed" containerName="installer" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.563852 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eda9567-712b-4541-9344-a333e7734fed" containerName="installer" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.563866 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="prometheus" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.563874 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="prometheus" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.563888 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="config-reloader" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.563894 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="config-reloader" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.563904 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="prom-label-proxy" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.563913 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="prom-label-proxy" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.563926 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="kube-rbac-proxy-web" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.563934 18707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="kube-rbac-proxy-web" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.563952 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="kube-rbac-proxy-metric" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.563960 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="kube-rbac-proxy-metric" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.563978 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="alertmanager" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.563986 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="alertmanager" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.563997 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="init-config-reloader" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.564005 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="init-config-reloader" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.564015 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="kube-rbac-proxy" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.564023 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="kube-rbac-proxy" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: E0320 08:50:33.564035 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69e8d3a-a0b1-4688-8631-d9f265aa4c69" 
containerName="metrics-server" Mar 20 08:50:33.564070 master-0 kubenswrapper[18707]: I0320 08:50:33.564043 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69e8d3a-a0b1-4688-8631-d9f265aa4c69" containerName="metrics-server" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.564219 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="kube-rbac-proxy" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.564247 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="config-reloader" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.564256 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="prom-label-proxy" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.564269 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="prometheus" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.564285 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="kube-rbac-proxy-web" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.564297 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="alertmanager" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.564308 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="kube-rbac-proxy" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.564318 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="kube-rbac-proxy-thanos" Mar 20 08:50:33.565958 
master-0 kubenswrapper[18707]: I0320 08:50:33.564335 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69e8d3a-a0b1-4688-8631-d9f265aa4c69" containerName="metrics-server" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.564347 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="kube-rbac-proxy-web" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.564365 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d29bfa-a0cf-43bd-a3b8-052c1a224fc9" containerName="kube-rbac-proxy-metric" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.564378 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eda9567-712b-4541-9344-a333e7734fed" containerName="installer" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.564390 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="config-reloader" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.564403 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ddbb16-b96e-4717-9786-2feae0d0cc3f" containerName="thanos-sidecar" Mar 20 08:50:33.565958 master-0 kubenswrapper[18707]: I0320 08:50:33.565059 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 20 08:50:33.568389 master-0 kubenswrapper[18707]: I0320 08:50:33.568331 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 08:50:33.579705 master-0 kubenswrapper[18707]: I0320 08:50:33.579613 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 20 08:50:33.700865 master-0 kubenswrapper[18707]: I0320 08:50:33.700774 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-var-lock\") pod \"installer-3-master-0\" (UID: \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 20 08:50:33.701140 master-0 kubenswrapper[18707]: I0320 08:50:33.700898 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 20 08:50:33.701140 master-0 kubenswrapper[18707]: I0320 08:50:33.700940 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 20 08:50:33.802023 master-0 kubenswrapper[18707]: I0320 08:50:33.801932 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-var-lock\") pod \"installer-3-master-0\" (UID: \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 20 08:50:33.802379 master-0 kubenswrapper[18707]: I0320 08:50:33.802103 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-var-lock\") pod \"installer-3-master-0\" (UID: \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 20 08:50:33.802379 master-0 kubenswrapper[18707]: I0320 08:50:33.802133 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 20 08:50:33.802379 master-0 kubenswrapper[18707]: I0320 08:50:33.802298 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 20 08:50:33.802519 master-0 kubenswrapper[18707]: I0320 08:50:33.802420 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 20 08:50:33.819323 master-0 kubenswrapper[18707]: I0320 08:50:33.818857 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 20 08:50:33.893280 master-0 kubenswrapper[18707]: I0320 08:50:33.893179 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 20 08:50:34.336534 master-0 kubenswrapper[18707]: I0320 08:50:34.336454 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 20 08:50:35.302532 master-0 kubenswrapper[18707]: I0320 08:50:35.302452 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4","Type":"ContainerStarted","Data":"e5b832789574933e49e27a75531482aec0f71ec35968658bf7bdedc31d2d16cf"} Mar 20 08:50:35.303176 master-0 kubenswrapper[18707]: I0320 08:50:35.302562 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4","Type":"ContainerStarted","Data":"8fe8dfc5e8cfa16e3d49c6c2f6fac118fe2c31451336dba895802d42aa6be26b"} Mar 20 08:50:35.340450 master-0 kubenswrapper[18707]: I0320 08:50:35.340319 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=2.340289087 podStartE2EDuration="2.340289087s" podCreationTimestamp="2026-03-20 08:50:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:50:35.334910953 +0000 UTC m=+580.491091309" watchObservedRunningTime="2026-03-20 08:50:35.340289087 +0000 UTC m=+580.496469483" Mar 20 08:50:46.107061 master-0 kubenswrapper[18707]: I0320 
08:50:46.106959 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 20 08:50:46.108828 master-0 kubenswrapper[18707]: I0320 08:50:46.108121 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 20 08:50:46.111372 master-0 kubenswrapper[18707]: I0320 08:50:46.111318 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 08:50:46.111372 master-0 kubenswrapper[18707]: I0320 08:50:46.111328 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-k4ktd" Mar 20 08:50:46.119009 master-0 kubenswrapper[18707]: I0320 08:50:46.118949 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 20 08:50:46.243018 master-0 kubenswrapper[18707]: I0320 08:50:46.242907 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/84b73fa6-b86b-4b65-826c-8f139d45c3d4-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 20 08:50:46.243398 master-0 kubenswrapper[18707]: I0320 08:50:46.243102 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/84b73fa6-b86b-4b65-826c-8f139d45c3d4-var-lock\") pod \"installer-5-master-0\" (UID: \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 20 08:50:46.243398 master-0 kubenswrapper[18707]: I0320 08:50:46.243141 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/84b73fa6-b86b-4b65-826c-8f139d45c3d4-kube-api-access\") pod \"installer-5-master-0\" (UID: \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 20 08:50:46.344977 master-0 kubenswrapper[18707]: I0320 08:50:46.344873 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/84b73fa6-b86b-4b65-826c-8f139d45c3d4-var-lock\") pod \"installer-5-master-0\" (UID: \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 20 08:50:46.345431 master-0 kubenswrapper[18707]: I0320 08:50:46.345089 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/84b73fa6-b86b-4b65-826c-8f139d45c3d4-var-lock\") pod \"installer-5-master-0\" (UID: \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 20 08:50:46.345517 master-0 kubenswrapper[18707]: I0320 08:50:46.345405 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84b73fa6-b86b-4b65-826c-8f139d45c3d4-kube-api-access\") pod \"installer-5-master-0\" (UID: \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 20 08:50:46.345779 master-0 kubenswrapper[18707]: I0320 08:50:46.345721 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/84b73fa6-b86b-4b65-826c-8f139d45c3d4-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 20 08:50:46.345927 master-0 kubenswrapper[18707]: I0320 08:50:46.345865 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/84b73fa6-b86b-4b65-826c-8f139d45c3d4-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 20 08:50:46.377582 master-0 kubenswrapper[18707]: I0320 08:50:46.377395 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84b73fa6-b86b-4b65-826c-8f139d45c3d4-kube-api-access\") pod \"installer-5-master-0\" (UID: \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 20 08:50:46.459846 master-0 kubenswrapper[18707]: I0320 08:50:46.459762 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 20 08:50:46.951273 master-0 kubenswrapper[18707]: I0320 08:50:46.951100 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 20 08:50:46.960062 master-0 kubenswrapper[18707]: W0320 08:50:46.960013 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod84b73fa6_b86b_4b65_826c_8f139d45c3d4.slice/crio-2a20bf4774a7b180c7af035a4c5c7b9a0d86fc11337ff4d8a5f5e1af2ac72be1 WatchSource:0}: Error finding container 2a20bf4774a7b180c7af035a4c5c7b9a0d86fc11337ff4d8a5f5e1af2ac72be1: Status 404 returned error can't find the container with id 2a20bf4774a7b180c7af035a4c5c7b9a0d86fc11337ff4d8a5f5e1af2ac72be1 Mar 20 08:50:47.434835 master-0 kubenswrapper[18707]: I0320 08:50:47.434757 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"84b73fa6-b86b-4b65-826c-8f139d45c3d4","Type":"ContainerStarted","Data":"31d897c04efafa88ef2e878d1dccbafeded7d576383b85f8be0ba8ef160099c3"} Mar 20 08:50:47.436465 master-0 kubenswrapper[18707]: I0320 08:50:47.435400 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"84b73fa6-b86b-4b65-826c-8f139d45c3d4","Type":"ContainerStarted","Data":"2a20bf4774a7b180c7af035a4c5c7b9a0d86fc11337ff4d8a5f5e1af2ac72be1"} Mar 20 08:50:47.456429 master-0 kubenswrapper[18707]: I0320 08:50:47.456248 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-master-0" podStartSLOduration=2.456222495 podStartE2EDuration="2.456222495s" podCreationTimestamp="2026-03-20 08:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:50:47.455047453 +0000 UTC m=+592.611227809" watchObservedRunningTime="2026-03-20 08:50:47.456222495 +0000 UTC m=+592.612402861" Mar 20 08:50:55.560536 master-0 kubenswrapper[18707]: I0320 08:50:55.560453 18707 scope.go:117] "RemoveContainer" containerID="1e47e53c73caf9294d7cc552d435de2bba3335889632bb5e47c9437fae1ad38a" Mar 20 08:50:55.577960 master-0 kubenswrapper[18707]: I0320 08:50:55.577851 18707 scope.go:117] "RemoveContainer" containerID="9109a59c4f2c870fa24f5897d11d6ce0d82c7037a7c29d3008301ef58dbfb30e" Mar 20 08:50:55.593472 master-0 kubenswrapper[18707]: I0320 08:50:55.593367 18707 scope.go:117] "RemoveContainer" containerID="a1bf05287a33269822d24c09f055433696b780b4f7f403c232dda246f6cacc28" Mar 20 08:50:55.607597 master-0 kubenswrapper[18707]: I0320 08:50:55.607496 18707 scope.go:117] "RemoveContainer" containerID="c71b79252c7c80b19481c7db2c5281ad6c66edb0a67d74d3f77f82c8ec887429" Mar 20 08:50:55.635594 master-0 kubenswrapper[18707]: I0320 08:50:55.635097 18707 scope.go:117] "RemoveContainer" containerID="ef595b86616a9d59758e3bb52cb2c40900df30c7a5c7962dd70066b8351abfcc" Mar 20 08:50:55.660124 master-0 kubenswrapper[18707]: I0320 08:50:55.660021 18707 scope.go:117] "RemoveContainer" containerID="bf080bd2aa0228ff67560c7f40f2f04126f362d98ad14ce5d016639c7975d354" Mar 20 08:50:55.679517 master-0 
kubenswrapper[18707]: I0320 08:50:55.679438 18707 scope.go:117] "RemoveContainer" containerID="cbc84cc16fdf36c19a2ec8ddb8d8c567dd3a2aefe284ff96c39ef8fc1be8eb8d" Mar 20 08:51:07.585377 master-0 kubenswrapper[18707]: I0320 08:51:07.585280 18707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 20 08:51:07.586473 master-0 kubenswrapper[18707]: I0320 08:51:07.585845 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://665625ddaf4d7d5a13e6f9aa415e12a52677c52b3254cb6bcb690bbf3d2cdd27" gracePeriod=30 Mar 20 08:51:07.586473 master-0 kubenswrapper[18707]: I0320 08:51:07.586244 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" containerID="cri-o://26d1f23f09ec46d0564e314771aedd57d50a0394449491bf05654764fec7468d" gracePeriod=30 Mar 20 08:51:07.586473 master-0 kubenswrapper[18707]: I0320 08:51:07.586389 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://61bf9561fb6a2bfd80add2b9b814a82fa5954086996b7e4feb2d7aa26a526193" gracePeriod=30 Mar 20 08:51:07.586832 master-0 kubenswrapper[18707]: I0320 08:51:07.586610 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager" 
containerID="cri-o://871676bb611c7b17e30bacdec9ff5c25cf4c40cdada2c8d2a4f54b77ebf11820" gracePeriod=30 Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: I0320 08:51:07.587602 18707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: E0320 08:51:07.587983 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: I0320 08:51:07.588003 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: E0320 08:51:07.588022 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: I0320 08:51:07.588030 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: E0320 08:51:07.588041 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: I0320 08:51:07.588048 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: E0320 08:51:07.588078 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager-recovery-controller" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: I0320 08:51:07.588087 18707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager-recovery-controller" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: E0320 08:51:07.588099 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: I0320 08:51:07.588106 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: E0320 08:51:07.588114 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: I0320 08:51:07.588120 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: E0320 08:51:07.588145 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager-cert-syncer" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: I0320 08:51:07.588152 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager-cert-syncer" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: E0320 08:51:07.588162 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager-cert-syncer" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: I0320 08:51:07.588169 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager-cert-syncer" Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: E0320 08:51:07.588198 18707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager"
Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: I0320 08:51:07.588206 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager"
Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: I0320 08:51:07.588373 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager"
Mar 20 08:51:07.588350 master-0 kubenswrapper[18707]: I0320 08:51:07.588397 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller"
Mar 20 08:51:07.590378 master-0 kubenswrapper[18707]: I0320 08:51:07.588418 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager-cert-syncer"
Mar 20 08:51:07.590378 master-0 kubenswrapper[18707]: I0320 08:51:07.588431 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller"
Mar 20 08:51:07.590378 master-0 kubenswrapper[18707]: I0320 08:51:07.588442 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller"
Mar 20 08:51:07.590378 master-0 kubenswrapper[18707]: I0320 08:51:07.588454 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller"
Mar 20 08:51:07.590378 master-0 kubenswrapper[18707]: I0320 08:51:07.588467 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager"
Mar 20 08:51:07.590378 master-0 kubenswrapper[18707]: I0320 08:51:07.588481 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager-recovery-controller"
Mar 20 08:51:07.590378 master-0 kubenswrapper[18707]: I0320 08:51:07.588496 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller"
Mar 20 08:51:07.590378 master-0 kubenswrapper[18707]: I0320 08:51:07.588506 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager-cert-syncer"
Mar 20 08:51:07.590378 master-0 kubenswrapper[18707]: E0320 08:51:07.588663 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller"
Mar 20 08:51:07.590378 master-0 kubenswrapper[18707]: I0320 08:51:07.588675 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller"
Mar 20 08:51:07.590378 master-0 kubenswrapper[18707]: E0320 08:51:07.588695 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller"
Mar 20 08:51:07.590378 master-0 kubenswrapper[18707]: I0320 08:51:07.588703 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2028761b8522f874dcebf13c4683d033" containerName="cluster-policy-controller"
Mar 20 08:51:07.590378 master-0 kubenswrapper[18707]: I0320 08:51:07.588870 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2028761b8522f874dcebf13c4683d033" containerName="kube-controller-manager"
Mar 20 08:51:07.662278 master-0 kubenswrapper[18707]: I0320 08:51:07.662168 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7953905b6830b623394f78c614eeb251-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"7953905b6830b623394f78c614eeb251\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:51:07.662540 master-0 kubenswrapper[18707]: I0320 08:51:07.662417 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7953905b6830b623394f78c614eeb251-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"7953905b6830b623394f78c614eeb251\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:51:07.765279 master-0 kubenswrapper[18707]: I0320 08:51:07.765210 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7953905b6830b623394f78c614eeb251-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"7953905b6830b623394f78c614eeb251\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:51:07.766547 master-0 kubenswrapper[18707]: I0320 08:51:07.766503 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7953905b6830b623394f78c614eeb251-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"7953905b6830b623394f78c614eeb251\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:51:07.766721 master-0 kubenswrapper[18707]: I0320 08:51:07.766690 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7953905b6830b623394f78c614eeb251-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"7953905b6830b623394f78c614eeb251\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:51:07.766926 master-0 kubenswrapper[18707]: I0320 08:51:07.766900 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7953905b6830b623394f78c614eeb251-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"7953905b6830b623394f78c614eeb251\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:51:07.878809 master-0 kubenswrapper[18707]: I0320 08:51:07.878676 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager-cert-syncer/1.log"
Mar 20 08:51:07.879397 master-0 kubenswrapper[18707]: I0320 08:51:07.879317 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/cluster-policy-controller/3.log"
Mar 20 08:51:07.880375 master-0 kubenswrapper[18707]: I0320 08:51:07.880353 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log"
Mar 20 08:51:07.881311 master-0 kubenswrapper[18707]: I0320 08:51:07.881263 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager-cert-syncer/0.log"
Mar 20 08:51:07.881454 master-0 kubenswrapper[18707]: I0320 08:51:07.881429 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:51:07.885883 master-0 kubenswrapper[18707]: I0320 08:51:07.885835 18707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="2028761b8522f874dcebf13c4683d033" podUID="7953905b6830b623394f78c614eeb251"
Mar 20 08:51:07.968557 master-0 kubenswrapper[18707]: I0320 08:51:07.968515 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-resource-dir\") pod \"2028761b8522f874dcebf13c4683d033\" (UID: \"2028761b8522f874dcebf13c4683d033\") "
Mar 20 08:51:07.968750 master-0 kubenswrapper[18707]: I0320 08:51:07.968624 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-cert-dir\") pod \"2028761b8522f874dcebf13c4683d033\" (UID: \"2028761b8522f874dcebf13c4683d033\") "
Mar 20 08:51:07.968869 master-0 kubenswrapper[18707]: I0320 08:51:07.968844 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "2028761b8522f874dcebf13c4683d033" (UID: "2028761b8522f874dcebf13c4683d033"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:51:07.968910 master-0 kubenswrapper[18707]: I0320 08:51:07.968842 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "2028761b8522f874dcebf13c4683d033" (UID: "2028761b8522f874dcebf13c4683d033"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:51:08.070038 master-0 kubenswrapper[18707]: I0320 08:51:08.069997 18707 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:51:08.070038 master-0 kubenswrapper[18707]: I0320 08:51:08.070030 18707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2028761b8522f874dcebf13c4683d033-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:51:08.625365 master-0 kubenswrapper[18707]: I0320 08:51:08.625272 18707 generic.go:334] "Generic (PLEG): container finished" podID="d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4" containerID="e5b832789574933e49e27a75531482aec0f71ec35968658bf7bdedc31d2d16cf" exitCode=0
Mar 20 08:51:08.625365 master-0 kubenswrapper[18707]: I0320 08:51:08.625370 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4","Type":"ContainerDied","Data":"e5b832789574933e49e27a75531482aec0f71ec35968658bf7bdedc31d2d16cf"}
Mar 20 08:51:08.630672 master-0 kubenswrapper[18707]: I0320 08:51:08.630608 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager-cert-syncer/1.log"
Mar 20 08:51:08.632084 master-0 kubenswrapper[18707]: I0320 08:51:08.632002 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/cluster-policy-controller/3.log"
Mar 20 08:51:08.633850 master-0 kubenswrapper[18707]: I0320 08:51:08.633814 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager/1.log"
Mar 20 08:51:08.635784 master-0 kubenswrapper[18707]: I0320 08:51:08.635727 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager-cert-syncer/0.log"
Mar 20 08:51:08.635973 master-0 kubenswrapper[18707]: I0320 08:51:08.635814 18707 generic.go:334] "Generic (PLEG): container finished" podID="2028761b8522f874dcebf13c4683d033" containerID="26d1f23f09ec46d0564e314771aedd57d50a0394449491bf05654764fec7468d" exitCode=0
Mar 20 08:51:08.635973 master-0 kubenswrapper[18707]: I0320 08:51:08.635937 18707 generic.go:334] "Generic (PLEG): container finished" podID="2028761b8522f874dcebf13c4683d033" containerID="61bf9561fb6a2bfd80add2b9b814a82fa5954086996b7e4feb2d7aa26a526193" exitCode=2
Mar 20 08:51:08.635973 master-0 kubenswrapper[18707]: I0320 08:51:08.635967 18707 generic.go:334] "Generic (PLEG): container finished" podID="2028761b8522f874dcebf13c4683d033" containerID="871676bb611c7b17e30bacdec9ff5c25cf4c40cdada2c8d2a4f54b77ebf11820" exitCode=0
Mar 20 08:51:08.636357 master-0 kubenswrapper[18707]: I0320 08:51:08.635991 18707 generic.go:334] "Generic (PLEG): container finished" podID="2028761b8522f874dcebf13c4683d033" containerID="665625ddaf4d7d5a13e6f9aa415e12a52677c52b3254cb6bcb690bbf3d2cdd27" exitCode=0
Mar 20 08:51:08.636357 master-0 kubenswrapper[18707]: I0320 08:51:08.636012 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:51:08.636357 master-0 kubenswrapper[18707]: I0320 08:51:08.636040 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="760e18ae3700547257879a5def4bd8a3e6845f819368981331d76400fe97af9e"
Mar 20 08:51:08.636357 master-0 kubenswrapper[18707]: I0320 08:51:08.635946 18707 scope.go:117] "RemoveContainer" containerID="1a645c4dd6894d87bc5827ce1df71595038461daa2bda7238648a01805dbab0e"
Mar 20 08:51:08.662291 master-0 kubenswrapper[18707]: I0320 08:51:08.662132 18707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="2028761b8522f874dcebf13c4683d033" podUID="7953905b6830b623394f78c614eeb251"
Mar 20 08:51:08.674807 master-0 kubenswrapper[18707]: I0320 08:51:08.674687 18707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="2028761b8522f874dcebf13c4683d033" podUID="7953905b6830b623394f78c614eeb251"
Mar 20 08:51:08.678623 master-0 kubenswrapper[18707]: I0320 08:51:08.677537 18707 scope.go:117] "RemoveContainer" containerID="cfd1a328fad8a1fbf228c2c6250b1d586490a1a703c23535a3eb7dcf1bbe5867"
Mar 20 08:51:08.695451 master-0 kubenswrapper[18707]: I0320 08:51:08.695399 18707 scope.go:117] "RemoveContainer" containerID="a800488fb62a072a6848e70ff9d43658046c14f0dd3e6ca8078acf4e79046779"
Mar 20 08:51:09.104629 master-0 kubenswrapper[18707]: I0320 08:51:09.104551 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2028761b8522f874dcebf13c4683d033" path="/var/lib/kubelet/pods/2028761b8522f874dcebf13c4683d033/volumes"
Mar 20 08:51:09.650848 master-0 kubenswrapper[18707]: I0320 08:51:09.650738 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2028761b8522f874dcebf13c4683d033/kube-controller-manager-cert-syncer/1.log"
Mar 20 08:51:09.946451 master-0 kubenswrapper[18707]: I0320 08:51:09.946368 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 20 08:51:10.104877 master-0 kubenswrapper[18707]: I0320 08:51:10.104788 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-kube-api-access\") pod \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\" (UID: \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\") "
Mar 20 08:51:10.105222 master-0 kubenswrapper[18707]: I0320 08:51:10.104932 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-kubelet-dir\") pod \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\" (UID: \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\") "
Mar 20 08:51:10.105222 master-0 kubenswrapper[18707]: I0320 08:51:10.105010 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4" (UID: "d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:51:10.105922 master-0 kubenswrapper[18707]: I0320 08:51:10.105841 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-var-lock\") pod \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\" (UID: \"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4\") "
Mar 20 08:51:10.106594 master-0 kubenswrapper[18707]: I0320 08:51:10.106560 18707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:51:10.106892 master-0 kubenswrapper[18707]: I0320 08:51:10.106859 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-var-lock" (OuterVolumeSpecName: "var-lock") pod "d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4" (UID: "d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:51:10.109546 master-0 kubenswrapper[18707]: I0320 08:51:10.109502 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4" (UID: "d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:51:10.208845 master-0 kubenswrapper[18707]: I0320 08:51:10.208657 18707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 20 08:51:10.208845 master-0 kubenswrapper[18707]: I0320 08:51:10.208729 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 20 08:51:10.661265 master-0 kubenswrapper[18707]: I0320 08:51:10.661143 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4","Type":"ContainerDied","Data":"8fe8dfc5e8cfa16e3d49c6c2f6fac118fe2c31451336dba895802d42aa6be26b"}
Mar 20 08:51:10.661265 master-0 kubenswrapper[18707]: I0320 08:51:10.661271 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fe8dfc5e8cfa16e3d49c6c2f6fac118fe2c31451336dba895802d42aa6be26b"
Mar 20 08:51:10.661265 master-0 kubenswrapper[18707]: I0320 08:51:10.661208 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 20 08:51:16.935527 master-0 kubenswrapper[18707]: I0320 08:51:16.935346 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 20 08:51:16.941093 master-0 kubenswrapper[18707]: I0320 08:51:16.939826 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"installer-3-master-0\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") " pod="openshift-kube-apiserver/installer-3-master-0"
Mar 20 08:51:17.036832 master-0 kubenswrapper[18707]: I0320 08:51:17.036761 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") pod \"d245e5b2-a30d-45c8-9b79-6e8096765c14\" (UID: \"d245e5b2-a30d-45c8-9b79-6e8096765c14\") "
Mar 20 08:51:17.043484 master-0 kubenswrapper[18707]: I0320 08:51:17.043423 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d245e5b2-a30d-45c8-9b79-6e8096765c14" (UID: "d245e5b2-a30d-45c8-9b79-6e8096765c14"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:51:17.139834 master-0 kubenswrapper[18707]: I0320 08:51:17.139711 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d245e5b2-a30d-45c8-9b79-6e8096765c14-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 20 08:51:22.093798 master-0 kubenswrapper[18707]: I0320 08:51:22.093460 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:51:22.120410 master-0 kubenswrapper[18707]: I0320 08:51:22.117586 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="411fa221-8961-4a5b-ba94-a27ccbb555b1"
Mar 20 08:51:22.120410 master-0 kubenswrapper[18707]: I0320 08:51:22.117633 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="411fa221-8961-4a5b-ba94-a27ccbb555b1"
Mar 20 08:51:22.141373 master-0 kubenswrapper[18707]: I0320 08:51:22.141278 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 20 08:51:22.141711 master-0 kubenswrapper[18707]: I0320 08:51:22.141454 18707 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:51:22.148518 master-0 kubenswrapper[18707]: I0320 08:51:22.147790 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 20 08:51:22.169578 master-0 kubenswrapper[18707]: I0320 08:51:22.169493 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:51:22.176006 master-0 kubenswrapper[18707]: I0320 08:51:22.175944 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 20 08:51:22.763419 master-0 kubenswrapper[18707]: I0320 08:51:22.763309 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"7953905b6830b623394f78c614eeb251","Type":"ContainerStarted","Data":"eee3c019520db05254889557f747480d6daa08c439db039b3084cb22d52227b7"}
Mar 20 08:51:22.763419 master-0 kubenswrapper[18707]: I0320 08:51:22.763393 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"7953905b6830b623394f78c614eeb251","Type":"ContainerStarted","Data":"d633def5644d2939cf76b3ff1ca50d431eefd60fe6e3c53fdac7aa954b91c3d0"}
Mar 20 08:51:22.763419 master-0 kubenswrapper[18707]: I0320 08:51:22.763408 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"7953905b6830b623394f78c614eeb251","Type":"ContainerStarted","Data":"47a0bc110bb8e180c0d8aec527c63a230ff8e1ff6f1bcadeae851148a7655c28"}
Mar 20 08:51:23.793155 master-0 kubenswrapper[18707]: I0320 08:51:23.792448 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"7953905b6830b623394f78c614eeb251","Type":"ContainerStarted","Data":"4f6ae1640c74b96a0df0629b9b63fe65ebaa1ff15f95802689673b759615b737"}
Mar 20 08:51:23.793155 master-0 kubenswrapper[18707]: I0320 08:51:23.792527 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"7953905b6830b623394f78c614eeb251","Type":"ContainerStarted","Data":"eaf8f3bc5689088b8fa94827b5652f8c3849d77cca5fef2e6de82d175214a2f2"}
Mar 20 08:51:23.820844 master-0 kubenswrapper[18707]: I0320 08:51:23.820743 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=1.820720369 podStartE2EDuration="1.820720369s" podCreationTimestamp="2026-03-20 08:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:23.81271223 +0000 UTC m=+628.968892606" watchObservedRunningTime="2026-03-20 08:51:23.820720369 +0000 UTC m=+628.976900725"
Mar 20 08:51:25.396002 master-0 kubenswrapper[18707]: E0320 08:51:25.395881 18707 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file"
Mar 20 08:51:25.397962 master-0 kubenswrapper[18707]: I0320 08:51:25.397858 18707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 20 08:51:25.398458 master-0 kubenswrapper[18707]: E0320 08:51:25.398384 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4" containerName="installer"
Mar 20 08:51:25.398458 master-0 kubenswrapper[18707]: I0320 08:51:25.398431 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4" containerName="installer"
Mar 20 08:51:25.399054 master-0 kubenswrapper[18707]: I0320 08:51:25.398690 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1ed9ccc-e56d-4aab-9aab-1d1a45594dc4" containerName="installer"
Mar 20 08:51:25.399408 master-0 kubenswrapper[18707]: I0320 08:51:25.399356 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:51:25.399513 master-0 kubenswrapper[18707]: I0320 08:51:25.399442 18707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 20 08:51:25.400007 master-0 kubenswrapper[18707]: I0320 08:51:25.399950 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver" containerID="cri-o://16441156b7e8bbe4b02a4b00165bba66f63737e9ccdc2639e50919dc2f550bf4" gracePeriod=15
Mar 20 08:51:25.400104 master-0 kubenswrapper[18707]: I0320 08:51:25.400026 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-syncer" containerID="cri-o://133aaf528ab1e0887a626b89e3622758b8188d7b50da183f7f2755440645c3ac" gracePeriod=15
Mar 20 08:51:25.400104 master-0 kubenswrapper[18707]: I0320 08:51:25.400050 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b0f5e22bf5db5b532cc2a9b7e1994228d722c79a9f2d60d4ddc02358bee2cb82" gracePeriod=15
Mar 20 08:51:25.400280 master-0 kubenswrapper[18707]: I0320 08:51:25.399962 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://963bd0525df3985d6f43b1fe1bd02659a99df0c473266a6b7445de535552a1f3" gracePeriod=15
Mar 20 08:51:25.400280 master-0 kubenswrapper[18707]: I0320 08:51:25.400057 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c2a566d1a6044b46ff0beec324066fa5e7cc44891867becc4439f58df2d6446e" gracePeriod=15
Mar 20 08:51:25.403818 master-0 kubenswrapper[18707]: I0320 08:51:25.403760 18707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 20 08:51:25.404539 master-0 kubenswrapper[18707]: E0320 08:51:25.404486 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints"
Mar 20 08:51:25.404539 master-0 kubenswrapper[18707]: I0320 08:51:25.404513 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints"
Mar 20 08:51:25.404539 master-0 kubenswrapper[18707]: E0320 08:51:25.404527 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-insecure-readyz"
Mar 20 08:51:25.404539 master-0 kubenswrapper[18707]: I0320 08:51:25.404533 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-insecure-readyz"
Mar 20 08:51:25.404539 master-0 kubenswrapper[18707]: E0320 08:51:25.404551 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-syncer"
Mar 20 08:51:25.404899 master-0 kubenswrapper[18707]: I0320 08:51:25.404560 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-syncer"
Mar 20 08:51:25.404899 master-0 kubenswrapper[18707]: E0320 08:51:25.404572 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 08:51:25.404899 master-0 kubenswrapper[18707]: I0320 08:51:25.404578 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 08:51:25.404899 master-0 kubenswrapper[18707]: E0320 08:51:25.404590 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="setup"
Mar 20 08:51:25.404899 master-0 kubenswrapper[18707]: I0320 08:51:25.404599 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="setup"
Mar 20 08:51:25.404899 master-0 kubenswrapper[18707]: E0320 08:51:25.404634 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver"
Mar 20 08:51:25.404899 master-0 kubenswrapper[18707]: I0320 08:51:25.404640 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver"
Mar 20 08:51:25.404899 master-0 kubenswrapper[18707]: I0320 08:51:25.404812 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 08:51:25.404899 master-0 kubenswrapper[18707]: I0320 08:51:25.404826 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-insecure-readyz"
Mar 20 08:51:25.404899 master-0 kubenswrapper[18707]: I0320 08:51:25.404838 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 08:51:25.404899 master-0 kubenswrapper[18707]: I0320 08:51:25.404856 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-syncer"
Mar 20 08:51:25.404899 master-0 kubenswrapper[18707]: I0320 08:51:25.404870 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints"
Mar 20 08:51:25.404899 master-0 kubenswrapper[18707]: I0320 08:51:25.404876 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver"
Mar 20 08:51:25.413533 master-0 kubenswrapper[18707]: E0320 08:51:25.405016 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 08:51:25.413533 master-0 kubenswrapper[18707]: I0320 08:51:25.405026 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 08:51:25.592226 master-0 kubenswrapper[18707]: I0320 08:51:25.592115 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:51:25.592226 master-0 kubenswrapper[18707]: I0320 08:51:25.592202 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:51:25.592540 master-0 kubenswrapper[18707]: I0320 08:51:25.592417 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:51:25.592623 master-0 kubenswrapper[18707]: I0320 08:51:25.592579 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:51:25.592623 master-0 kubenswrapper[18707]: I0320 08:51:25.592613 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:51:25.592768 master-0 kubenswrapper[18707]: I0320 08:51:25.592648 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:51:25.592768 master-0 kubenswrapper[18707]: I0320 08:51:25.592681 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:51:25.592768 master-0 kubenswrapper[18707]: I0320 08:51:25.592762 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:51:25.611793 master-0 kubenswrapper[18707]: E0320 08:51:25.611657 18707 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:51:25.694065 master-0 kubenswrapper[18707]: I0320 08:51:25.693848 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:51:25.694065 master-0 kubenswrapper[18707]: I0320 08:51:25.693920 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:51:25.694065 master-0 kubenswrapper[18707]: I0320 08:51:25.693972 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:51:25.694065 master-0 kubenswrapper[18707]: I0320 08:51:25.693975 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 20 08:51:25.694065 master-0 kubenswrapper[18707]: I0320 08:51:25.694018 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:51:25.694065 master-0 kubenswrapper[18707]: I0320 08:51:25.694065 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:51:25.695098 master-0 kubenswrapper[18707]: I0320 08:51:25.694230 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:51:25.695098 master-0 kubenswrapper[18707]: I0320 08:51:25.694378 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-lock\")
pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:51:25.695098 master-0 kubenswrapper[18707]: I0320 08:51:25.694455 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:25.695098 master-0 kubenswrapper[18707]: I0320 08:51:25.694509 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:51:25.695098 master-0 kubenswrapper[18707]: I0320 08:51:25.694514 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:51:25.695098 master-0 kubenswrapper[18707]: I0320 08:51:25.694548 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:51:25.695098 master-0 kubenswrapper[18707]: I0320 08:51:25.694554 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:51:25.695098 master-0 kubenswrapper[18707]: I0320 08:51:25.694690 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:25.695098 master-0 kubenswrapper[18707]: I0320 08:51:25.694756 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:51:25.695098 master-0 kubenswrapper[18707]: I0320 08:51:25.694961 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5f502b117c7c8479f7f20848a50fec0-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"d5f502b117c7c8479f7f20848a50fec0\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:25.819221 master-0 kubenswrapper[18707]: I0320 08:51:25.819110 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-cert-syncer/0.log" Mar 20 08:51:25.820740 master-0 kubenswrapper[18707]: I0320 08:51:25.820673 18707 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="963bd0525df3985d6f43b1fe1bd02659a99df0c473266a6b7445de535552a1f3" exitCode=0 Mar 20 08:51:25.820740 master-0 kubenswrapper[18707]: 
I0320 08:51:25.820738 18707 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="c2a566d1a6044b46ff0beec324066fa5e7cc44891867becc4439f58df2d6446e" exitCode=0 Mar 20 08:51:25.820868 master-0 kubenswrapper[18707]: I0320 08:51:25.820755 18707 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="b0f5e22bf5db5b532cc2a9b7e1994228d722c79a9f2d60d4ddc02358bee2cb82" exitCode=0 Mar 20 08:51:25.820868 master-0 kubenswrapper[18707]: I0320 08:51:25.820773 18707 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="133aaf528ab1e0887a626b89e3622758b8188d7b50da183f7f2755440645c3ac" exitCode=2 Mar 20 08:51:25.820868 master-0 kubenswrapper[18707]: I0320 08:51:25.820790 18707 scope.go:117] "RemoveContainer" containerID="980a74af7dbc007260e7377ea4cb1edcafe4c9568ad57a168d88500b7bd91f2e" Mar 20 08:51:25.824040 master-0 kubenswrapper[18707]: I0320 08:51:25.823987 18707 generic.go:334] "Generic (PLEG): container finished" podID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" containerID="31d897c04efafa88ef2e878d1dccbafeded7d576383b85f8be0ba8ef160099c3" exitCode=0 Mar 20 08:51:25.824117 master-0 kubenswrapper[18707]: I0320 08:51:25.824047 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"84b73fa6-b86b-4b65-826c-8f139d45c3d4","Type":"ContainerDied","Data":"31d897c04efafa88ef2e878d1dccbafeded7d576383b85f8be0ba8ef160099c3"} Mar 20 08:51:25.825936 master-0 kubenswrapper[18707]: I0320 08:51:25.825822 18707 status_manager.go:851] "Failed to get status for pod" podUID="7d5ce05b3d592e63f1f92202d52b9635" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:25.827582 master-0 
kubenswrapper[18707]: I0320 08:51:25.827509 18707 status_manager.go:851] "Failed to get status for pod" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:25.913803 master-0 kubenswrapper[18707]: I0320 08:51:25.913034 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:51:25.948898 master-0 kubenswrapper[18707]: W0320 08:51:25.948745 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85632c1cec8974aa874834e4cfff4c77.slice/crio-d4da16b0583e4e49b798b3e3a05db5a67106e210e24d0e88fa9b10672c383cef WatchSource:0}: Error finding container d4da16b0583e4e49b798b3e3a05db5a67106e210e24d0e88fa9b10672c383cef: Status 404 returned error can't find the container with id d4da16b0583e4e49b798b3e3a05db5a67106e210e24d0e88fa9b10672c383cef Mar 20 08:51:25.954889 master-0 kubenswrapper[18707]: E0320 08:51:25.954640 18707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e80963af5f8d7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:85632c1cec8974aa874834e4cfff4c77,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:51:25.953202391 +0000 UTC m=+631.109382747,LastTimestamp:2026-03-20 08:51:25.953202391 +0000 UTC m=+631.109382747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:51:26.836130 master-0 kubenswrapper[18707]: I0320 08:51:26.836067 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-cert-syncer/0.log" Mar 20 08:51:26.838515 master-0 kubenswrapper[18707]: I0320 08:51:26.838439 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"85632c1cec8974aa874834e4cfff4c77","Type":"ContainerStarted","Data":"c541ca942411b60e404f791f689440307713f1cb9fe9db3a23ad1c24be33a6e7"} Mar 20 08:51:26.838589 master-0 kubenswrapper[18707]: I0320 08:51:26.838520 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"85632c1cec8974aa874834e4cfff4c77","Type":"ContainerStarted","Data":"d4da16b0583e4e49b798b3e3a05db5a67106e210e24d0e88fa9b10672c383cef"} Mar 20 08:51:26.839955 master-0 kubenswrapper[18707]: E0320 08:51:26.839913 18707 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:51:26.840006 master-0 kubenswrapper[18707]: I0320 08:51:26.839910 18707 status_manager.go:851] "Failed to get status for pod" podUID="7d5ce05b3d592e63f1f92202d52b9635" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:26.840559 master-0 kubenswrapper[18707]: I0320 08:51:26.840520 18707 status_manager.go:851] "Failed to get status for pod" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:27.195960 master-0 kubenswrapper[18707]: I0320 08:51:27.195909 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 20 08:51:27.197127 master-0 kubenswrapper[18707]: I0320 08:51:27.197059 18707 status_manager.go:851] "Failed to get status for pod" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:27.222869 master-0 kubenswrapper[18707]: I0320 08:51:27.222804 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/84b73fa6-b86b-4b65-826c-8f139d45c3d4-kubelet-dir\") pod \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\" (UID: \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\") " Mar 20 08:51:27.223136 master-0 kubenswrapper[18707]: I0320 08:51:27.222944 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84b73fa6-b86b-4b65-826c-8f139d45c3d4-kube-api-access\") pod \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\" (UID: 
\"84b73fa6-b86b-4b65-826c-8f139d45c3d4\") " Mar 20 08:51:27.223136 master-0 kubenswrapper[18707]: I0320 08:51:27.223013 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84b73fa6-b86b-4b65-826c-8f139d45c3d4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "84b73fa6-b86b-4b65-826c-8f139d45c3d4" (UID: "84b73fa6-b86b-4b65-826c-8f139d45c3d4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:51:27.223136 master-0 kubenswrapper[18707]: I0320 08:51:27.223091 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/84b73fa6-b86b-4b65-826c-8f139d45c3d4-var-lock\") pod \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\" (UID: \"84b73fa6-b86b-4b65-826c-8f139d45c3d4\") " Mar 20 08:51:27.223658 master-0 kubenswrapper[18707]: I0320 08:51:27.223375 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84b73fa6-b86b-4b65-826c-8f139d45c3d4-var-lock" (OuterVolumeSpecName: "var-lock") pod "84b73fa6-b86b-4b65-826c-8f139d45c3d4" (UID: "84b73fa6-b86b-4b65-826c-8f139d45c3d4"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:51:27.223658 master-0 kubenswrapper[18707]: I0320 08:51:27.223613 18707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/84b73fa6-b86b-4b65-826c-8f139d45c3d4-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 20 08:51:27.223658 master-0 kubenswrapper[18707]: I0320 08:51:27.223634 18707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/84b73fa6-b86b-4b65-826c-8f139d45c3d4-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:51:27.226396 master-0 kubenswrapper[18707]: I0320 08:51:27.226353 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b73fa6-b86b-4b65-826c-8f139d45c3d4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "84b73fa6-b86b-4b65-826c-8f139d45c3d4" (UID: "84b73fa6-b86b-4b65-826c-8f139d45c3d4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:51:27.325325 master-0 kubenswrapper[18707]: I0320 08:51:27.325246 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84b73fa6-b86b-4b65-826c-8f139d45c3d4-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 20 08:51:27.830790 master-0 kubenswrapper[18707]: I0320 08:51:27.830731 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-cert-syncer/0.log" Mar 20 08:51:27.832142 master-0 kubenswrapper[18707]: I0320 08:51:27.832088 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:27.833728 master-0 kubenswrapper[18707]: I0320 08:51:27.833659 18707 status_manager.go:851] "Failed to get status for pod" podUID="7d5ce05b3d592e63f1f92202d52b9635" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:27.834699 master-0 kubenswrapper[18707]: I0320 08:51:27.834639 18707 status_manager.go:851] "Failed to get status for pod" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:27.841210 master-0 kubenswrapper[18707]: I0320 08:51:27.841152 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") pod \"7d5ce05b3d592e63f1f92202d52b9635\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " Mar 20 08:51:27.841610 master-0 kubenswrapper[18707]: I0320 08:51:27.841268 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "7d5ce05b3d592e63f1f92202d52b9635" (UID: "7d5ce05b3d592e63f1f92202d52b9635"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:51:27.841610 master-0 kubenswrapper[18707]: I0320 08:51:27.841403 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") pod \"7d5ce05b3d592e63f1f92202d52b9635\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " Mar 20 08:51:27.841610 master-0 kubenswrapper[18707]: I0320 08:51:27.841475 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7d5ce05b3d592e63f1f92202d52b9635" (UID: "7d5ce05b3d592e63f1f92202d52b9635"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:51:27.841832 master-0 kubenswrapper[18707]: I0320 08:51:27.841798 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") pod \"7d5ce05b3d592e63f1f92202d52b9635\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " Mar 20 08:51:27.841902 master-0 kubenswrapper[18707]: I0320 08:51:27.841868 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "7d5ce05b3d592e63f1f92202d52b9635" (UID: "7d5ce05b3d592e63f1f92202d52b9635"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:51:27.842717 master-0 kubenswrapper[18707]: I0320 08:51:27.842634 18707 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:51:27.842717 master-0 kubenswrapper[18707]: I0320 08:51:27.842710 18707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:51:27.842863 master-0 kubenswrapper[18707]: I0320 08:51:27.842770 18707 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:51:27.848909 master-0 kubenswrapper[18707]: I0320 08:51:27.848857 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"84b73fa6-b86b-4b65-826c-8f139d45c3d4","Type":"ContainerDied","Data":"2a20bf4774a7b180c7af035a4c5c7b9a0d86fc11337ff4d8a5f5e1af2ac72be1"} Mar 20 08:51:27.849013 master-0 kubenswrapper[18707]: I0320 08:51:27.848910 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a20bf4774a7b180c7af035a4c5c7b9a0d86fc11337ff4d8a5f5e1af2ac72be1" Mar 20 08:51:27.849013 master-0 kubenswrapper[18707]: I0320 08:51:27.848911 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 20 08:51:27.854847 master-0 kubenswrapper[18707]: I0320 08:51:27.854805 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-cert-syncer/0.log" Mar 20 08:51:27.856851 master-0 kubenswrapper[18707]: I0320 08:51:27.856796 18707 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="16441156b7e8bbe4b02a4b00165bba66f63737e9ccdc2639e50919dc2f550bf4" exitCode=0 Mar 20 08:51:27.856929 master-0 kubenswrapper[18707]: I0320 08:51:27.856880 18707 scope.go:117] "RemoveContainer" containerID="963bd0525df3985d6f43b1fe1bd02659a99df0c473266a6b7445de535552a1f3" Mar 20 08:51:27.857001 master-0 kubenswrapper[18707]: I0320 08:51:27.856965 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:27.873201 master-0 kubenswrapper[18707]: I0320 08:51:27.873112 18707 status_manager.go:851] "Failed to get status for pod" podUID="7d5ce05b3d592e63f1f92202d52b9635" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:27.874113 master-0 kubenswrapper[18707]: I0320 08:51:27.874054 18707 status_manager.go:851] "Failed to get status for pod" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:27.884481 master-0 kubenswrapper[18707]: I0320 08:51:27.884434 18707 scope.go:117] "RemoveContainer" 
containerID="c2a566d1a6044b46ff0beec324066fa5e7cc44891867becc4439f58df2d6446e" Mar 20 08:51:27.887611 master-0 kubenswrapper[18707]: I0320 08:51:27.887525 18707 status_manager.go:851] "Failed to get status for pod" podUID="7d5ce05b3d592e63f1f92202d52b9635" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:27.888577 master-0 kubenswrapper[18707]: I0320 08:51:27.888507 18707 status_manager.go:851] "Failed to get status for pod" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:27.905876 master-0 kubenswrapper[18707]: I0320 08:51:27.905804 18707 scope.go:117] "RemoveContainer" containerID="b0f5e22bf5db5b532cc2a9b7e1994228d722c79a9f2d60d4ddc02358bee2cb82" Mar 20 08:51:27.922027 master-0 kubenswrapper[18707]: I0320 08:51:27.921969 18707 scope.go:117] "RemoveContainer" containerID="133aaf528ab1e0887a626b89e3622758b8188d7b50da183f7f2755440645c3ac" Mar 20 08:51:27.935737 master-0 kubenswrapper[18707]: I0320 08:51:27.935692 18707 scope.go:117] "RemoveContainer" containerID="16441156b7e8bbe4b02a4b00165bba66f63737e9ccdc2639e50919dc2f550bf4" Mar 20 08:51:27.952758 master-0 kubenswrapper[18707]: I0320 08:51:27.952674 18707 scope.go:117] "RemoveContainer" containerID="1409bbcde5e79bac3a67951727093fc0f8d50edb82f42aa3f8fe494469470b11" Mar 20 08:51:27.969194 master-0 kubenswrapper[18707]: I0320 08:51:27.969133 18707 scope.go:117] "RemoveContainer" containerID="963bd0525df3985d6f43b1fe1bd02659a99df0c473266a6b7445de535552a1f3" Mar 20 08:51:27.969719 master-0 kubenswrapper[18707]: E0320 08:51:27.969639 18707 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"963bd0525df3985d6f43b1fe1bd02659a99df0c473266a6b7445de535552a1f3\": container with ID starting with 963bd0525df3985d6f43b1fe1bd02659a99df0c473266a6b7445de535552a1f3 not found: ID does not exist" containerID="963bd0525df3985d6f43b1fe1bd02659a99df0c473266a6b7445de535552a1f3" Mar 20 08:51:27.969793 master-0 kubenswrapper[18707]: I0320 08:51:27.969743 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963bd0525df3985d6f43b1fe1bd02659a99df0c473266a6b7445de535552a1f3"} err="failed to get container status \"963bd0525df3985d6f43b1fe1bd02659a99df0c473266a6b7445de535552a1f3\": rpc error: code = NotFound desc = could not find container \"963bd0525df3985d6f43b1fe1bd02659a99df0c473266a6b7445de535552a1f3\": container with ID starting with 963bd0525df3985d6f43b1fe1bd02659a99df0c473266a6b7445de535552a1f3 not found: ID does not exist" Mar 20 08:51:27.969842 master-0 kubenswrapper[18707]: I0320 08:51:27.969801 18707 scope.go:117] "RemoveContainer" containerID="c2a566d1a6044b46ff0beec324066fa5e7cc44891867becc4439f58df2d6446e" Mar 20 08:51:27.970397 master-0 kubenswrapper[18707]: E0320 08:51:27.970333 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2a566d1a6044b46ff0beec324066fa5e7cc44891867becc4439f58df2d6446e\": container with ID starting with c2a566d1a6044b46ff0beec324066fa5e7cc44891867becc4439f58df2d6446e not found: ID does not exist" containerID="c2a566d1a6044b46ff0beec324066fa5e7cc44891867becc4439f58df2d6446e" Mar 20 08:51:27.970468 master-0 kubenswrapper[18707]: I0320 08:51:27.970415 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2a566d1a6044b46ff0beec324066fa5e7cc44891867becc4439f58df2d6446e"} err="failed to get container status \"c2a566d1a6044b46ff0beec324066fa5e7cc44891867becc4439f58df2d6446e\": rpc error: 
code = NotFound desc = could not find container \"c2a566d1a6044b46ff0beec324066fa5e7cc44891867becc4439f58df2d6446e\": container with ID starting with c2a566d1a6044b46ff0beec324066fa5e7cc44891867becc4439f58df2d6446e not found: ID does not exist" Mar 20 08:51:27.970508 master-0 kubenswrapper[18707]: I0320 08:51:27.970479 18707 scope.go:117] "RemoveContainer" containerID="b0f5e22bf5db5b532cc2a9b7e1994228d722c79a9f2d60d4ddc02358bee2cb82" Mar 20 08:51:27.971129 master-0 kubenswrapper[18707]: E0320 08:51:27.971091 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0f5e22bf5db5b532cc2a9b7e1994228d722c79a9f2d60d4ddc02358bee2cb82\": container with ID starting with b0f5e22bf5db5b532cc2a9b7e1994228d722c79a9f2d60d4ddc02358bee2cb82 not found: ID does not exist" containerID="b0f5e22bf5db5b532cc2a9b7e1994228d722c79a9f2d60d4ddc02358bee2cb82" Mar 20 08:51:27.971170 master-0 kubenswrapper[18707]: I0320 08:51:27.971127 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0f5e22bf5db5b532cc2a9b7e1994228d722c79a9f2d60d4ddc02358bee2cb82"} err="failed to get container status \"b0f5e22bf5db5b532cc2a9b7e1994228d722c79a9f2d60d4ddc02358bee2cb82\": rpc error: code = NotFound desc = could not find container \"b0f5e22bf5db5b532cc2a9b7e1994228d722c79a9f2d60d4ddc02358bee2cb82\": container with ID starting with b0f5e22bf5db5b532cc2a9b7e1994228d722c79a9f2d60d4ddc02358bee2cb82 not found: ID does not exist" Mar 20 08:51:27.971170 master-0 kubenswrapper[18707]: I0320 08:51:27.971149 18707 scope.go:117] "RemoveContainer" containerID="133aaf528ab1e0887a626b89e3622758b8188d7b50da183f7f2755440645c3ac" Mar 20 08:51:27.971656 master-0 kubenswrapper[18707]: E0320 08:51:27.971609 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"133aaf528ab1e0887a626b89e3622758b8188d7b50da183f7f2755440645c3ac\": container 
with ID starting with 133aaf528ab1e0887a626b89e3622758b8188d7b50da183f7f2755440645c3ac not found: ID does not exist" containerID="133aaf528ab1e0887a626b89e3622758b8188d7b50da183f7f2755440645c3ac" Mar 20 08:51:27.971914 master-0 kubenswrapper[18707]: I0320 08:51:27.971664 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"133aaf528ab1e0887a626b89e3622758b8188d7b50da183f7f2755440645c3ac"} err="failed to get container status \"133aaf528ab1e0887a626b89e3622758b8188d7b50da183f7f2755440645c3ac\": rpc error: code = NotFound desc = could not find container \"133aaf528ab1e0887a626b89e3622758b8188d7b50da183f7f2755440645c3ac\": container with ID starting with 133aaf528ab1e0887a626b89e3622758b8188d7b50da183f7f2755440645c3ac not found: ID does not exist" Mar 20 08:51:27.971914 master-0 kubenswrapper[18707]: I0320 08:51:27.971908 18707 scope.go:117] "RemoveContainer" containerID="16441156b7e8bbe4b02a4b00165bba66f63737e9ccdc2639e50919dc2f550bf4" Mar 20 08:51:27.972387 master-0 kubenswrapper[18707]: E0320 08:51:27.972324 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16441156b7e8bbe4b02a4b00165bba66f63737e9ccdc2639e50919dc2f550bf4\": container with ID starting with 16441156b7e8bbe4b02a4b00165bba66f63737e9ccdc2639e50919dc2f550bf4 not found: ID does not exist" containerID="16441156b7e8bbe4b02a4b00165bba66f63737e9ccdc2639e50919dc2f550bf4" Mar 20 08:51:27.972446 master-0 kubenswrapper[18707]: I0320 08:51:27.972388 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16441156b7e8bbe4b02a4b00165bba66f63737e9ccdc2639e50919dc2f550bf4"} err="failed to get container status \"16441156b7e8bbe4b02a4b00165bba66f63737e9ccdc2639e50919dc2f550bf4\": rpc error: code = NotFound desc = could not find container \"16441156b7e8bbe4b02a4b00165bba66f63737e9ccdc2639e50919dc2f550bf4\": container with ID starting with 
16441156b7e8bbe4b02a4b00165bba66f63737e9ccdc2639e50919dc2f550bf4 not found: ID does not exist" Mar 20 08:51:27.972446 master-0 kubenswrapper[18707]: I0320 08:51:27.972417 18707 scope.go:117] "RemoveContainer" containerID="1409bbcde5e79bac3a67951727093fc0f8d50edb82f42aa3f8fe494469470b11" Mar 20 08:51:27.972777 master-0 kubenswrapper[18707]: E0320 08:51:27.972739 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1409bbcde5e79bac3a67951727093fc0f8d50edb82f42aa3f8fe494469470b11\": container with ID starting with 1409bbcde5e79bac3a67951727093fc0f8d50edb82f42aa3f8fe494469470b11 not found: ID does not exist" containerID="1409bbcde5e79bac3a67951727093fc0f8d50edb82f42aa3f8fe494469470b11" Mar 20 08:51:27.972818 master-0 kubenswrapper[18707]: I0320 08:51:27.972774 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1409bbcde5e79bac3a67951727093fc0f8d50edb82f42aa3f8fe494469470b11"} err="failed to get container status \"1409bbcde5e79bac3a67951727093fc0f8d50edb82f42aa3f8fe494469470b11\": rpc error: code = NotFound desc = could not find container \"1409bbcde5e79bac3a67951727093fc0f8d50edb82f42aa3f8fe494469470b11\": container with ID starting with 1409bbcde5e79bac3a67951727093fc0f8d50edb82f42aa3f8fe494469470b11 not found: ID does not exist" Mar 20 08:51:29.105459 master-0 kubenswrapper[18707]: I0320 08:51:29.105375 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d5ce05b3d592e63f1f92202d52b9635" path="/var/lib/kubelet/pods/7d5ce05b3d592e63f1f92202d52b9635/volumes" Mar 20 08:51:30.709986 master-0 kubenswrapper[18707]: E0320 08:51:30.709892 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:30.711321 master-0 
kubenswrapper[18707]: E0320 08:51:30.711263 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:30.712259 master-0 kubenswrapper[18707]: E0320 08:51:30.712156 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:30.713035 master-0 kubenswrapper[18707]: E0320 08:51:30.712998 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:30.713858 master-0 kubenswrapper[18707]: E0320 08:51:30.713812 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:30.713938 master-0 kubenswrapper[18707]: I0320 08:51:30.713867 18707 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 08:51:30.714798 master-0 kubenswrapper[18707]: E0320 08:51:30.714749 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 20 08:51:30.915872 master-0 kubenswrapper[18707]: E0320 08:51:30.915801 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 20 08:51:31.318370 master-0 kubenswrapper[18707]: E0320 08:51:31.318275 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 20 08:51:32.119571 master-0 kubenswrapper[18707]: E0320 08:51:32.119488 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 20 08:51:32.171208 master-0 kubenswrapper[18707]: I0320 08:51:32.171120 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:51:32.171208 master-0 kubenswrapper[18707]: I0320 08:51:32.171204 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:51:32.171208 master-0 kubenswrapper[18707]: I0320 08:51:32.171222 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:51:32.171672 master-0 kubenswrapper[18707]: I0320 08:51:32.171235 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:51:32.179667 master-0 kubenswrapper[18707]: I0320 08:51:32.179599 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:51:32.180639 master-0 kubenswrapper[18707]: I0320 08:51:32.180616 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:51:32.181805 master-0 kubenswrapper[18707]: I0320 08:51:32.181622 18707 status_manager.go:851] "Failed to get status for pod" podUID="7953905b6830b623394f78c614eeb251" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:32.182436 master-0 kubenswrapper[18707]: I0320 08:51:32.182381 18707 status_manager.go:851] "Failed to get status for pod" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:32.183616 master-0 kubenswrapper[18707]: I0320 08:51:32.183567 18707 status_manager.go:851] "Failed to get status for pod" podUID="7953905b6830b623394f78c614eeb251" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:32.184511 master-0 kubenswrapper[18707]: I0320 08:51:32.184471 18707 status_manager.go:851] "Failed to get status for pod" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: 
connect: connection refused" Mar 20 08:51:32.253725 master-0 kubenswrapper[18707]: E0320 08:51:32.253423 18707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e80963af5f8d7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:85632c1cec8974aa874834e4cfff4c77,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-20 08:51:25.953202391 +0000 UTC m=+631.109382747,LastTimestamp:2026-03-20 08:51:25.953202391 +0000 UTC m=+631.109382747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 20 08:51:32.925447 master-0 kubenswrapper[18707]: I0320 08:51:32.925381 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:51:32.926384 master-0 kubenswrapper[18707]: I0320 08:51:32.926308 18707 status_manager.go:851] "Failed to get status for pod" podUID="7953905b6830b623394f78c614eeb251" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:32.927142 master-0 kubenswrapper[18707]: 
I0320 08:51:32.927092 18707 status_manager.go:851] "Failed to get status for pod" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:32.927502 master-0 kubenswrapper[18707]: I0320 08:51:32.927468 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:51:32.928487 master-0 kubenswrapper[18707]: I0320 08:51:32.928429 18707 status_manager.go:851] "Failed to get status for pod" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:32.929226 master-0 kubenswrapper[18707]: I0320 08:51:32.929162 18707 status_manager.go:851] "Failed to get status for pod" podUID="7953905b6830b623394f78c614eeb251" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:33.721105 master-0 kubenswrapper[18707]: E0320 08:51:33.720984 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 20 08:51:35.100075 master-0 kubenswrapper[18707]: I0320 08:51:35.099985 18707 status_manager.go:851] "Failed to get status for pod" podUID="7953905b6830b623394f78c614eeb251" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:35.101406 master-0 kubenswrapper[18707]: I0320 08:51:35.101354 18707 status_manager.go:851] "Failed to get status for pod" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:36.922333 master-0 kubenswrapper[18707]: E0320 08:51:36.922254 18707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 20 08:51:38.093662 master-0 kubenswrapper[18707]: I0320 08:51:38.093581 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:38.095404 master-0 kubenswrapper[18707]: I0320 08:51:38.095285 18707 status_manager.go:851] "Failed to get status for pod" podUID="7953905b6830b623394f78c614eeb251" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:38.096829 master-0 kubenswrapper[18707]: I0320 08:51:38.096736 18707 status_manager.go:851] "Failed to get status for pod" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:38.116823 master-0 kubenswrapper[18707]: I0320 08:51:38.116666 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="10f6b2bf-da60-45b1-b95a-76acdc7f1e38" Mar 20 08:51:38.116823 master-0 kubenswrapper[18707]: I0320 08:51:38.116712 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="10f6b2bf-da60-45b1-b95a-76acdc7f1e38" Mar 20 08:51:38.118680 master-0 kubenswrapper[18707]: E0320 08:51:38.118586 18707 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:38.119455 master-0 kubenswrapper[18707]: I0320 08:51:38.119411 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:38.155756 master-0 kubenswrapper[18707]: W0320 08:51:38.155652 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5f502b117c7c8479f7f20848a50fec0.slice/crio-8f453f6d2df8cd756bf279ae5d27bb09d1dc137056b5e5eb7b5bb582b149aac6 WatchSource:0}: Error finding container 8f453f6d2df8cd756bf279ae5d27bb09d1dc137056b5e5eb7b5bb582b149aac6: Status 404 returned error can't find the container with id 8f453f6d2df8cd756bf279ae5d27bb09d1dc137056b5e5eb7b5bb582b149aac6 Mar 20 08:51:38.982127 master-0 kubenswrapper[18707]: I0320 08:51:38.982049 18707 generic.go:334] "Generic (PLEG): container finished" podID="d5f502b117c7c8479f7f20848a50fec0" containerID="f7ba9cb9d8d6984a97789c05f08a034f0374d374fa1bae228802f13ba40a61d6" exitCode=0 Mar 20 08:51:38.982127 master-0 kubenswrapper[18707]: I0320 08:51:38.982111 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerDied","Data":"f7ba9cb9d8d6984a97789c05f08a034f0374d374fa1bae228802f13ba40a61d6"} Mar 20 08:51:38.982585 master-0 kubenswrapper[18707]: I0320 08:51:38.982153 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerStarted","Data":"8f453f6d2df8cd756bf279ae5d27bb09d1dc137056b5e5eb7b5bb582b149aac6"} Mar 20 08:51:38.982585 master-0 kubenswrapper[18707]: I0320 08:51:38.982493 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="10f6b2bf-da60-45b1-b95a-76acdc7f1e38" Mar 20 08:51:38.982585 master-0 kubenswrapper[18707]: I0320 08:51:38.982507 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
podUID="10f6b2bf-da60-45b1-b95a-76acdc7f1e38" Mar 20 08:51:38.983208 master-0 kubenswrapper[18707]: I0320 08:51:38.983144 18707 status_manager.go:851] "Failed to get status for pod" podUID="7953905b6830b623394f78c614eeb251" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:38.983208 master-0 kubenswrapper[18707]: E0320 08:51:38.983144 18707 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:38.983932 master-0 kubenswrapper[18707]: I0320 08:51:38.983898 18707 status_manager.go:851] "Failed to get status for pod" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 20 08:51:40.006664 master-0 kubenswrapper[18707]: I0320 08:51:40.005036 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerStarted","Data":"6389a4757acd7099b65cad7481f70972c29cee5c9944636ac70010e4d6606da2"} Mar 20 08:51:40.006664 master-0 kubenswrapper[18707]: I0320 08:51:40.005109 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerStarted","Data":"1f915e89ee347124365a33a114a99dfbea48fc0752c91f32a6d34ffd8e3d6177"} Mar 20 08:51:40.006664 master-0 
kubenswrapper[18707]: I0320 08:51:40.005122 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerStarted","Data":"a0951c3dde1304b7b53e5c05b703cd1bd1624b8825affb320b191ca880106e57"} Mar 20 08:51:41.016507 master-0 kubenswrapper[18707]: I0320 08:51:41.016421 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerStarted","Data":"de28dd2da12954fa71d54c2af21e5033027ad2a1ad3c7a3b08c3825faa9eb87a"} Mar 20 08:51:41.016507 master-0 kubenswrapper[18707]: I0320 08:51:41.016493 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"d5f502b117c7c8479f7f20848a50fec0","Type":"ContainerStarted","Data":"3c22db1b68fc11fdd9c5e9a7e18e1fce72973212b710f6b354cc52b3f3761d52"} Mar 20 08:51:41.017350 master-0 kubenswrapper[18707]: I0320 08:51:41.016997 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:41.017350 master-0 kubenswrapper[18707]: I0320 08:51:41.017294 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="10f6b2bf-da60-45b1-b95a-76acdc7f1e38" Mar 20 08:51:41.017446 master-0 kubenswrapper[18707]: I0320 08:51:41.017367 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="10f6b2bf-da60-45b1-b95a-76acdc7f1e38" Mar 20 08:51:43.120506 master-0 kubenswrapper[18707]: I0320 08:51:43.120425 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:43.120506 master-0 kubenswrapper[18707]: I0320 08:51:43.120497 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:43.129856 master-0 kubenswrapper[18707]: I0320 08:51:43.129790 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:46.045493 master-0 kubenswrapper[18707]: I0320 08:51:46.045438 18707 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 20 08:51:46.122871 master-0 kubenswrapper[18707]: I0320 08:51:46.122811 18707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="d5f502b117c7c8479f7f20848a50fec0" podUID="1b44706d-8ad0-4010-bf74-5c7cc732f79e" Mar 20 08:51:47.066176 master-0 kubenswrapper[18707]: I0320 08:51:47.066113 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="10f6b2bf-da60-45b1-b95a-76acdc7f1e38" Mar 20 08:51:47.066176 master-0 kubenswrapper[18707]: I0320 08:51:47.066160 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="10f6b2bf-da60-45b1-b95a-76acdc7f1e38" Mar 20 08:51:47.070008 master-0 kubenswrapper[18707]: I0320 08:51:47.069913 18707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="d5f502b117c7c8479f7f20848a50fec0" podUID="1b44706d-8ad0-4010-bf74-5c7cc732f79e" Mar 20 08:51:55.753419 master-0 kubenswrapper[18707]: I0320 08:51:55.753341 18707 scope.go:117] "RemoveContainer" containerID="665625ddaf4d7d5a13e6f9aa415e12a52677c52b3254cb6bcb690bbf3d2cdd27" Mar 20 08:51:55.778704 master-0 kubenswrapper[18707]: I0320 08:51:55.778610 18707 scope.go:117] "RemoveContainer" containerID="871676bb611c7b17e30bacdec9ff5c25cf4c40cdada2c8d2a4f54b77ebf11820" Mar 20 08:51:55.823089 master-0 kubenswrapper[18707]: I0320 08:51:55.822992 
18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 08:51:56.055395 master-0 kubenswrapper[18707]: I0320 08:51:56.055273 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 20 08:51:56.466345 master-0 kubenswrapper[18707]: I0320 08:51:56.466170 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 08:51:56.596025 master-0 kubenswrapper[18707]: I0320 08:51:56.595977 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-anne24rv49795" Mar 20 08:51:56.976262 master-0 kubenswrapper[18707]: I0320 08:51:56.976125 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 08:51:57.147465 master-0 kubenswrapper[18707]: I0320 08:51:57.147398 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 08:51:57.177443 master-0 kubenswrapper[18707]: I0320 08:51:57.177397 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 08:51:57.351169 master-0 kubenswrapper[18707]: I0320 08:51:57.351060 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 20 08:51:57.509550 master-0 kubenswrapper[18707]: I0320 08:51:57.509285 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 08:51:57.649027 master-0 kubenswrapper[18707]: I0320 08:51:57.648869 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 20 08:51:57.710474 master-0 kubenswrapper[18707]: I0320 
08:51:57.710387 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 20 08:51:57.722209 master-0 kubenswrapper[18707]: I0320 08:51:57.722140 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 08:51:57.771599 master-0 kubenswrapper[18707]: I0320 08:51:57.771526 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 08:51:57.800664 master-0 kubenswrapper[18707]: I0320 08:51:57.800597 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 20 08:51:57.843444 master-0 kubenswrapper[18707]: I0320 08:51:57.843341 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 08:51:57.844698 master-0 kubenswrapper[18707]: I0320 08:51:57.844654 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 20 08:51:58.071886 master-0 kubenswrapper[18707]: I0320 08:51:58.071794 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 08:51:58.258780 master-0 kubenswrapper[18707]: I0320 08:51:58.258688 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 08:51:58.276100 master-0 kubenswrapper[18707]: I0320 08:51:58.275971 18707 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 08:51:58.288675 master-0 kubenswrapper[18707]: I0320 08:51:58.288582 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 20 08:51:58.288841 master-0 kubenswrapper[18707]: 
I0320 08:51:58.288694 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 20 08:51:58.288841 master-0 kubenswrapper[18707]: I0320 08:51:58.288734 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-dcb9594d9-wlht7"] Mar 20 08:51:58.334925 master-0 kubenswrapper[18707]: I0320 08:51:58.334677 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=12.334645337 podStartE2EDuration="12.334645337s" podCreationTimestamp="2026-03-20 08:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:51:58.327343268 +0000 UTC m=+663.483523624" watchObservedRunningTime="2026-03-20 08:51:58.334645337 +0000 UTC m=+663.490825703" Mar 20 08:51:58.386957 master-0 kubenswrapper[18707]: I0320 08:51:58.386861 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 08:51:58.403081 master-0 kubenswrapper[18707]: I0320 08:51:58.402991 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 08:51:58.798447 master-0 kubenswrapper[18707]: I0320 08:51:58.798365 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 20 08:51:59.072589 master-0 kubenswrapper[18707]: I0320 08:51:59.072357 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 08:51:59.113097 master-0 kubenswrapper[18707]: I0320 08:51:59.113004 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 08:51:59.287989 master-0 kubenswrapper[18707]: I0320 08:51:59.287885 
18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 20 08:51:59.342985 master-0 kubenswrapper[18707]: I0320 08:51:59.342809 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 08:51:59.454289 master-0 kubenswrapper[18707]: I0320 08:51:59.454153 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 08:51:59.461456 master-0 kubenswrapper[18707]: I0320 08:51:59.461415 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-v9mgl" Mar 20 08:51:59.587329 master-0 kubenswrapper[18707]: I0320 08:51:59.587228 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 08:51:59.594280 master-0 kubenswrapper[18707]: I0320 08:51:59.594142 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 08:51:59.667485 master-0 kubenswrapper[18707]: I0320 08:51:59.667384 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 08:51:59.675290 master-0 kubenswrapper[18707]: I0320 08:51:59.675176 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 20 08:51:59.685122 master-0 kubenswrapper[18707]: I0320 08:51:59.685020 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 08:51:59.753109 master-0 kubenswrapper[18707]: I0320 08:51:59.753038 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 08:51:59.872475 master-0 kubenswrapper[18707]: I0320 
08:51:59.872300 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 20 08:51:59.905480 master-0 kubenswrapper[18707]: I0320 08:51:59.905365 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 20 08:51:59.936700 master-0 kubenswrapper[18707]: I0320 08:51:59.936611 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 20 08:51:59.973869 master-0 kubenswrapper[18707]: I0320 08:51:59.973777 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 20 08:52:00.000176 master-0 kubenswrapper[18707]: I0320 08:52:00.000103 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 08:52:00.108976 master-0 kubenswrapper[18707]: I0320 08:52:00.108870 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 08:52:00.253994 master-0 kubenswrapper[18707]: I0320 08:52:00.253830 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 20 08:52:00.295567 master-0 kubenswrapper[18707]: I0320 08:52:00.295406 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 20 08:52:00.303381 master-0 kubenswrapper[18707]: I0320 08:52:00.303266 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 20 08:52:00.350042 master-0 kubenswrapper[18707]: I0320 08:52:00.349947 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 20 08:52:00.411930 master-0 kubenswrapper[18707]: I0320 08:52:00.411860 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 20 08:52:00.487065 master-0 kubenswrapper[18707]: I0320 08:52:00.486967 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 20 08:52:00.497515 master-0 kubenswrapper[18707]: I0320 08:52:00.497399 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 08:52:00.519820 master-0 kubenswrapper[18707]: I0320 08:52:00.519634 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 08:52:00.579442 master-0 kubenswrapper[18707]: I0320 08:52:00.579372 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 08:52:00.711550 master-0 kubenswrapper[18707]: I0320 08:52:00.711481 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 20 08:52:00.712994 master-0 kubenswrapper[18707]: I0320 08:52:00.712920 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-4smb9"
Mar 20 08:52:00.733924 master-0 kubenswrapper[18707]: I0320 08:52:00.733847 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-qvkkb"
Mar 20 08:52:01.176126 master-0 kubenswrapper[18707]: I0320 08:52:01.176049 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 20 08:52:01.206595 master-0 kubenswrapper[18707]: I0320 08:52:01.206500 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 20 08:52:01.268083 master-0 kubenswrapper[18707]: I0320 08:52:01.267996 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 08:52:01.326716 master-0 kubenswrapper[18707]: I0320 08:52:01.326551 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 20 08:52:01.444797 master-0 kubenswrapper[18707]: I0320 08:52:01.444585 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 20 08:52:01.483565 master-0 kubenswrapper[18707]: I0320 08:52:01.483462 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 20 08:52:01.527119 master-0 kubenswrapper[18707]: I0320 08:52:01.527043 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 08:52:01.631966 master-0 kubenswrapper[18707]: I0320 08:52:01.631893 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs"
Mar 20 08:52:01.656393 master-0 kubenswrapper[18707]: I0320 08:52:01.656327 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 20 08:52:01.696422 master-0 kubenswrapper[18707]: I0320 08:52:01.695984 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 08:52:01.718874 master-0 kubenswrapper[18707]: I0320 08:52:01.718762 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 20 08:52:01.730843 master-0 kubenswrapper[18707]: I0320 08:52:01.730779 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 08:52:01.763122 master-0 kubenswrapper[18707]: I0320 08:52:01.763025 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 20 08:52:01.794672 master-0 kubenswrapper[18707]: I0320 08:52:01.794583 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 08:52:01.865431 master-0 kubenswrapper[18707]: I0320 08:52:01.865281 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls"
Mar 20 08:52:01.932141 master-0 kubenswrapper[18707]: I0320 08:52:01.932075 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 20 08:52:01.995578 master-0 kubenswrapper[18707]: I0320 08:52:01.995414 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 20 08:52:02.090718 master-0 kubenswrapper[18707]: I0320 08:52:02.090617 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-rfqnk"
Mar 20 08:52:02.150748 master-0 kubenswrapper[18707]: I0320 08:52:02.150663 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 08:52:02.167333 master-0 kubenswrapper[18707]: I0320 08:52:02.167259 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 20 08:52:02.197893 master-0 kubenswrapper[18707]: I0320 08:52:02.197837 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 20 08:52:02.250224 master-0 kubenswrapper[18707]: I0320 08:52:02.250038 18707 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 20 08:52:02.452529 master-0 kubenswrapper[18707]: I0320 08:52:02.452456 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 20 08:52:02.543218 master-0 kubenswrapper[18707]: I0320 08:52:02.542682 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-vd4cn"
Mar 20 08:52:02.621353 master-0 kubenswrapper[18707]: I0320 08:52:02.621289 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 08:52:02.628974 master-0 kubenswrapper[18707]: I0320 08:52:02.628928 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 20 08:52:02.638287 master-0 kubenswrapper[18707]: I0320 08:52:02.638218 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 20 08:52:02.691346 master-0 kubenswrapper[18707]: I0320 08:52:02.691157 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 20 08:52:02.720654 master-0 kubenswrapper[18707]: I0320 08:52:02.720558 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 20 08:52:02.765670 master-0 kubenswrapper[18707]: I0320 08:52:02.765599 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 08:52:02.766046 master-0 kubenswrapper[18707]: I0320 08:52:02.765834 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 20 08:52:02.854377 master-0 kubenswrapper[18707]: I0320 08:52:02.854298 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 20 08:52:02.893383 master-0 kubenswrapper[18707]: I0320 08:52:02.893293 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 20 08:52:02.917104 master-0 kubenswrapper[18707]: I0320 08:52:02.916616 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 08:52:02.921268 master-0 kubenswrapper[18707]: I0320 08:52:02.921222 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 08:52:02.965715 master-0 kubenswrapper[18707]: I0320 08:52:02.965644 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 20 08:52:02.993437 master-0 kubenswrapper[18707]: I0320 08:52:02.993373 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 08:52:03.062670 master-0 kubenswrapper[18707]: I0320 08:52:03.062599 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 20 08:52:03.125717 master-0 kubenswrapper[18707]: I0320 08:52:03.125584 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:52:03.129278 master-0 kubenswrapper[18707]: I0320 08:52:03.129261 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 20 08:52:03.144584 master-0 kubenswrapper[18707]: I0320 08:52:03.143941 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 20 08:52:03.145735 master-0 kubenswrapper[18707]: I0320 08:52:03.145703 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 08:52:03.176949 master-0 kubenswrapper[18707]: I0320 08:52:03.176901 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 20 08:52:03.211660 master-0 kubenswrapper[18707]: I0320 08:52:03.211615 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 20 08:52:03.232211 master-0 kubenswrapper[18707]: I0320 08:52:03.232139 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 08:52:03.401355 master-0 kubenswrapper[18707]: I0320 08:52:03.401223 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 20 08:52:03.431437 master-0 kubenswrapper[18707]: I0320 08:52:03.431327 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 08:52:03.455952 master-0 kubenswrapper[18707]: I0320 08:52:03.455871 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 20 08:52:03.549760 master-0 kubenswrapper[18707]: I0320 08:52:03.549661 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 20 08:52:03.687159 master-0 kubenswrapper[18707]: I0320 08:52:03.686967 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 08:52:03.696934 master-0 kubenswrapper[18707]: I0320 08:52:03.696844 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 08:52:03.829136 master-0 kubenswrapper[18707]: I0320 08:52:03.829063 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 20 08:52:03.829415 master-0 kubenswrapper[18707]: I0320 08:52:03.829208 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 08:52:03.857902 master-0 kubenswrapper[18707]: I0320 08:52:03.857813 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-2trhv"
Mar 20 08:52:03.888469 master-0 kubenswrapper[18707]: I0320 08:52:03.888407 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 20 08:52:03.896988 master-0 kubenswrapper[18707]: I0320 08:52:03.896926 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client"
Mar 20 08:52:03.899158 master-0 kubenswrapper[18707]: I0320 08:52:03.898506 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-wq6zb"
Mar 20 08:52:03.969375 master-0 kubenswrapper[18707]: I0320 08:52:03.969264 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle"
Mar 20 08:52:03.979839 master-0 kubenswrapper[18707]: I0320 08:52:03.979762 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 20 08:52:03.985093 master-0 kubenswrapper[18707]: I0320 08:52:03.985022 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 20 08:52:03.991505 master-0 kubenswrapper[18707]: I0320 08:52:03.991435 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 08:52:04.012898 master-0 kubenswrapper[18707]: I0320 08:52:04.012813 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 20 08:52:04.042838 master-0 kubenswrapper[18707]: I0320 08:52:04.042795 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 08:52:04.091444 master-0 kubenswrapper[18707]: I0320 08:52:04.090980 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 20 08:52:04.195600 master-0 kubenswrapper[18707]: I0320 08:52:04.195529 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 08:52:04.301695 master-0 kubenswrapper[18707]: I0320 08:52:04.301631 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 08:52:04.323594 master-0 kubenswrapper[18707]: I0320 08:52:04.323526 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-js69c"
Mar 20 08:52:04.329603 master-0 kubenswrapper[18707]: I0320 08:52:04.329547 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 08:52:04.471848 master-0 kubenswrapper[18707]: I0320 08:52:04.471803 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 20 08:52:04.473449 master-0 kubenswrapper[18707]: I0320 08:52:04.473142 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 08:52:04.518879 master-0 kubenswrapper[18707]: I0320 08:52:04.518071 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 08:52:04.560980 master-0 kubenswrapper[18707]: I0320 08:52:04.560930 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 20 08:52:04.563682 master-0 kubenswrapper[18707]: I0320 08:52:04.563648 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 08:52:04.645128 master-0 kubenswrapper[18707]: I0320 08:52:04.645055 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 08:52:04.651319 master-0 kubenswrapper[18707]: I0320 08:52:04.651229 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 20 08:52:04.707176 master-0 kubenswrapper[18707]: I0320 08:52:04.706721 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config"
Mar 20 08:52:04.707176 master-0 kubenswrapper[18707]: I0320 08:52:04.706913 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 20 08:52:04.847213 master-0 kubenswrapper[18707]: I0320 08:52:04.846999 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 08:52:04.848983 master-0 kubenswrapper[18707]: I0320 08:52:04.848943 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 20 08:52:04.962586 master-0 kubenswrapper[18707]: I0320 08:52:04.962499 18707 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 20 08:52:04.971427 master-0 kubenswrapper[18707]: I0320 08:52:04.971378 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 20 08:52:05.026229 master-0 kubenswrapper[18707]: I0320 08:52:05.026152 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 20 08:52:05.031225 master-0 kubenswrapper[18707]: I0320 08:52:05.031179 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 20 08:52:05.036722 master-0 kubenswrapper[18707]: I0320 08:52:05.036672 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 08:52:05.126917 master-0 kubenswrapper[18707]: I0320 08:52:05.126765 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 20 08:52:05.195223 master-0 kubenswrapper[18707]: I0320 08:52:05.195111 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 08:52:05.204667 master-0 kubenswrapper[18707]: I0320 08:52:05.204594 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 08:52:05.316851 master-0 kubenswrapper[18707]: I0320 08:52:05.316763 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-7d577"
Mar 20 08:52:05.334962 master-0 kubenswrapper[18707]: I0320 08:52:05.334864 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 08:52:05.382698 master-0 kubenswrapper[18707]: I0320 08:52:05.382516 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 08:52:05.447455 master-0 kubenswrapper[18707]: I0320 08:52:05.447391 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 20 08:52:05.485220 master-0 kubenswrapper[18707]: I0320 08:52:05.485136 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 20 08:52:05.666616 master-0 kubenswrapper[18707]: I0320 08:52:05.666429 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 08:52:05.677046 master-0 kubenswrapper[18707]: I0320 08:52:05.676986 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 08:52:05.692847 master-0 kubenswrapper[18707]: I0320 08:52:05.692780 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 20 08:52:05.798408 master-0 kubenswrapper[18707]: I0320 08:52:05.798314 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 08:52:05.853735 master-0 kubenswrapper[18707]: I0320 08:52:05.853671 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 20 08:52:05.933387 master-0 kubenswrapper[18707]: I0320 08:52:05.933232 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 20 08:52:06.002549 master-0 kubenswrapper[18707]: I0320 08:52:06.002449 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 20 08:52:06.046922 master-0 kubenswrapper[18707]: I0320 08:52:06.046836 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 20 08:52:06.064766 master-0 kubenswrapper[18707]: I0320 08:52:06.064700 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 08:52:06.077333 master-0 kubenswrapper[18707]: I0320 08:52:06.077248 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-whhgj"
Mar 20 08:52:06.126227 master-0 kubenswrapper[18707]: I0320 08:52:06.125558 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 20 08:52:06.126227 master-0 kubenswrapper[18707]: I0320 08:52:06.125769 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 20 08:52:06.151228 master-0 kubenswrapper[18707]: I0320 08:52:06.147216 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 08:52:06.165337 master-0 kubenswrapper[18707]: I0320 08:52:06.165279 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-bksjt"
Mar 20 08:52:06.251096 master-0 kubenswrapper[18707]: I0320 08:52:06.250953 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 20 08:52:06.317301 master-0 kubenswrapper[18707]: I0320 08:52:06.317243 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 20 08:52:06.365104 master-0 kubenswrapper[18707]: I0320 08:52:06.365049 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 20 08:52:06.379708 master-0 kubenswrapper[18707]: I0320 08:52:06.379643 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 20 08:52:06.384559 master-0 kubenswrapper[18707]: I0320 08:52:06.384535 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 08:52:06.450068 master-0 kubenswrapper[18707]: I0320 08:52:06.450009 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 20 08:52:06.491280 master-0 kubenswrapper[18707]: I0320 08:52:06.491221 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 20 08:52:06.494237 master-0 kubenswrapper[18707]: I0320 08:52:06.494173 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 08:52:06.496105 master-0 kubenswrapper[18707]: I0320 08:52:06.496079 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 08:52:06.530256 master-0 kubenswrapper[18707]: I0320 08:52:06.530088 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 20 08:52:06.531494 master-0 kubenswrapper[18707]: I0320 08:52:06.531449 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 20 08:52:06.593426 master-0 kubenswrapper[18707]: I0320 08:52:06.593352 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 20 08:52:06.621548 master-0 kubenswrapper[18707]: I0320 08:52:06.621486 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 08:52:06.689131 master-0 kubenswrapper[18707]: I0320 08:52:06.689076 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 20 08:52:06.750200 master-0 kubenswrapper[18707]: I0320 08:52:06.750021 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 20 08:52:06.846843 master-0 kubenswrapper[18707]: I0320 08:52:06.846747 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 20 08:52:06.904525 master-0 kubenswrapper[18707]: I0320 08:52:06.904453 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 20 08:52:06.950104 master-0 kubenswrapper[18707]: I0320 08:52:06.949885 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 20 08:52:06.967733 master-0 kubenswrapper[18707]: I0320 08:52:06.967659 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 20 08:52:06.971155 master-0 kubenswrapper[18707]: I0320 08:52:06.971092 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 20 08:52:06.988052 master-0 kubenswrapper[18707]: I0320 08:52:06.987971 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-h4mf9"
Mar 20 08:52:07.005293 master-0 kubenswrapper[18707]: I0320 08:52:07.004742 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-kv9b9"
Mar 20 08:52:07.094839 master-0 kubenswrapper[18707]: I0320 08:52:07.093229 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 20 08:52:07.177342 master-0 kubenswrapper[18707]: I0320 08:52:07.177152 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 20 08:52:07.194851 master-0 kubenswrapper[18707]: I0320 08:52:07.194769 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-15j6gm4rb8461"
Mar 20 08:52:07.278508 master-0 kubenswrapper[18707]: I0320 08:52:07.278355 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 08:52:07.343986 master-0 kubenswrapper[18707]: I0320 08:52:07.343903 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 20 08:52:07.369247 master-0 kubenswrapper[18707]: I0320 08:52:07.369145 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 20 08:52:07.384167 master-0 kubenswrapper[18707]: I0320 08:52:07.384106 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 08:52:07.456059 master-0 kubenswrapper[18707]: I0320 08:52:07.454154 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 20 08:52:07.573324 master-0 kubenswrapper[18707]: I0320 08:52:07.573238 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 20 08:52:07.574713 master-0 kubenswrapper[18707]: I0320 08:52:07.574672 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 20 08:52:07.576428 master-0 kubenswrapper[18707]: I0320 08:52:07.576390 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 08:52:07.615150 master-0 kubenswrapper[18707]: I0320 08:52:07.615063 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 20 08:52:07.649291 master-0 kubenswrapper[18707]: I0320 08:52:07.649230 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 08:52:07.683683 master-0 kubenswrapper[18707]: I0320 08:52:07.683616 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 20 08:52:07.684001 master-0 kubenswrapper[18707]: I0320 08:52:07.683800 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 20 08:52:07.684001 master-0 kubenswrapper[18707]: I0320 08:52:07.683813 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 20 08:52:07.819413 master-0 kubenswrapper[18707]: I0320 08:52:07.819340 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 20 08:52:07.887085 master-0 kubenswrapper[18707]: I0320 08:52:07.887008 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 20 08:52:07.915050 master-0 kubenswrapper[18707]: I0320 08:52:07.914993 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 20 08:52:07.944574 master-0 kubenswrapper[18707]: I0320 08:52:07.944471 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 20 08:52:07.956005 master-0 kubenswrapper[18707]: I0320 08:52:07.955938 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 08:52:07.986042 master-0 kubenswrapper[18707]: I0320 08:52:07.985958 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-wftxn"
Mar 20 08:52:08.126443 master-0 kubenswrapper[18707]: I0320 08:52:08.126266 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 08:52:08.180278 master-0 kubenswrapper[18707]: I0320 08:52:08.179071 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 20 08:52:08.187250 master-0 kubenswrapper[18707]: I0320 08:52:08.187164 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 20 08:52:08.202940 master-0 kubenswrapper[18707]: I0320 08:52:08.202870 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 20 08:52:08.335242 master-0 kubenswrapper[18707]: I0320 08:52:08.335124 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 08:52:08.360745 master-0 kubenswrapper[18707]: I0320 08:52:08.360662 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 08:52:08.410561 master-0 kubenswrapper[18707]: I0320 08:52:08.410404 18707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 20 08:52:08.410857 master-0 kubenswrapper[18707]: I0320 08:52:08.410751 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="85632c1cec8974aa874834e4cfff4c77" containerName="startup-monitor" containerID="cri-o://c541ca942411b60e404f791f689440307713f1cb9fe9db3a23ad1c24be33a6e7" gracePeriod=5
Mar 20 08:52:08.448071 master-0 kubenswrapper[18707]: I0320 08:52:08.448013 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 20 08:52:08.462913 master-0 kubenswrapper[18707]: I0320 08:52:08.462866 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 08:52:08.523119 master-0 kubenswrapper[18707]: I0320 08:52:08.523013 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 20 08:52:08.551695 master-0 kubenswrapper[18707]: I0320 08:52:08.551645 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 20 08:52:08.574807 master-0 kubenswrapper[18707]: I0320 08:52:08.574758 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 20 08:52:08.604910 master-0 kubenswrapper[18707]: I0320 08:52:08.604846 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 20 08:52:08.628868 master-0 kubenswrapper[18707]: I0320 08:52:08.628799 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 20 08:52:08.646735 master-0 kubenswrapper[18707]: I0320 08:52:08.646661 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 20 08:52:08.647135 master-0 kubenswrapper[18707]: I0320 08:52:08.647070 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 20 08:52:08.648453 master-0 kubenswrapper[18707]: I0320 08:52:08.648387 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 20 08:52:08.679467 master-0 kubenswrapper[18707]: I0320 08:52:08.679299 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 20 08:52:08.736218 master-0 kubenswrapper[18707]: I0320 08:52:08.736143 18707 reflector.go:368]
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 08:52:08.846122 master-0 kubenswrapper[18707]: I0320 08:52:08.846061 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 08:52:08.859818 master-0 kubenswrapper[18707]: I0320 08:52:08.859723 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 20 08:52:08.863161 master-0 kubenswrapper[18707]: I0320 08:52:08.863123 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 08:52:08.902577 master-0 kubenswrapper[18707]: I0320 08:52:08.902508 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 08:52:09.033489 master-0 kubenswrapper[18707]: I0320 08:52:09.033330 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 08:52:09.095531 master-0 kubenswrapper[18707]: I0320 08:52:09.095469 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 08:52:09.227836 master-0 kubenswrapper[18707]: I0320 08:52:09.227764 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 08:52:09.321727 master-0 kubenswrapper[18707]: I0320 08:52:09.321692 18707 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 08:52:09.353041 master-0 kubenswrapper[18707]: I0320 08:52:09.353003 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 20 08:52:09.396466 master-0 kubenswrapper[18707]: I0320 08:52:09.396380 18707 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 08:52:09.495374 master-0 kubenswrapper[18707]: I0320 08:52:09.495275 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-62tl6" Mar 20 08:52:09.573436 master-0 kubenswrapper[18707]: I0320 08:52:09.573287 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 20 08:52:09.666549 master-0 kubenswrapper[18707]: I0320 08:52:09.666477 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 20 08:52:09.700266 master-0 kubenswrapper[18707]: I0320 08:52:09.700219 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 08:52:09.792624 master-0 kubenswrapper[18707]: I0320 08:52:09.792548 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 08:52:09.803331 master-0 kubenswrapper[18707]: I0320 08:52:09.803259 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 08:52:09.876576 master-0 kubenswrapper[18707]: I0320 08:52:09.876442 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 08:52:09.876985 master-0 kubenswrapper[18707]: I0320 08:52:09.876576 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Mar 20 08:52:09.912912 master-0 kubenswrapper[18707]: I0320 08:52:09.912845 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 08:52:09.930542 master-0 kubenswrapper[18707]: I0320 08:52:09.930472 18707 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 08:52:09.976473 master-0 kubenswrapper[18707]: I0320 08:52:09.976429 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 20 08:52:10.005270 master-0 kubenswrapper[18707]: I0320 08:52:10.005219 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 08:52:10.182611 master-0 kubenswrapper[18707]: I0320 08:52:10.182433 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 08:52:10.237418 master-0 kubenswrapper[18707]: I0320 08:52:10.237337 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 08:52:10.297783 master-0 kubenswrapper[18707]: I0320 08:52:10.297709 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 08:52:10.300354 master-0 kubenswrapper[18707]: I0320 08:52:10.300298 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 08:52:10.342941 master-0 kubenswrapper[18707]: I0320 08:52:10.342874 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 08:52:10.521934 master-0 kubenswrapper[18707]: I0320 08:52:10.521746 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38" Mar 20 08:52:10.576736 master-0 kubenswrapper[18707]: I0320 08:52:10.576657 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 08:52:10.714491 master-0 kubenswrapper[18707]: I0320 08:52:10.714385 18707 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 08:52:10.801170 master-0 kubenswrapper[18707]: I0320 08:52:10.801083 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 08:52:10.888469 master-0 kubenswrapper[18707]: I0320 08:52:10.888400 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 08:52:10.949350 master-0 kubenswrapper[18707]: I0320 08:52:10.949171 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 08:52:10.964593 master-0 kubenswrapper[18707]: I0320 08:52:10.964510 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 08:52:11.103773 master-0 kubenswrapper[18707]: I0320 08:52:11.102413 18707 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 08:52:11.167838 master-0 kubenswrapper[18707]: I0320 08:52:11.167772 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 08:52:11.169076 master-0 kubenswrapper[18707]: I0320 08:52:11.169033 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 20 08:52:11.190375 master-0 kubenswrapper[18707]: I0320 08:52:11.190300 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 08:52:11.253336 master-0 kubenswrapper[18707]: I0320 08:52:11.253260 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 08:52:11.264818 master-0 kubenswrapper[18707]: I0320 08:52:11.264761 18707 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 20 08:52:11.290917 master-0 kubenswrapper[18707]: I0320 08:52:11.290842 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 08:52:11.307287 master-0 kubenswrapper[18707]: I0320 08:52:11.307225 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 08:52:11.312356 master-0 kubenswrapper[18707]: I0320 08:52:11.312305 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 08:52:11.455649 master-0 kubenswrapper[18707]: I0320 08:52:11.455463 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-scgh2" Mar 20 08:52:11.482254 master-0 kubenswrapper[18707]: I0320 08:52:11.482173 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 08:52:11.491889 master-0 kubenswrapper[18707]: I0320 08:52:11.489340 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-d7bxn" Mar 20 08:52:11.538379 master-0 kubenswrapper[18707]: I0320 08:52:11.538291 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 08:52:11.588121 master-0 kubenswrapper[18707]: I0320 08:52:11.588056 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 20 08:52:11.739915 master-0 kubenswrapper[18707]: I0320 08:52:11.739773 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 08:52:11.744015 
master-0 kubenswrapper[18707]: I0320 08:52:11.743983 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 08:52:12.032889 master-0 kubenswrapper[18707]: I0320 08:52:12.032170 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 20 08:52:12.321957 master-0 kubenswrapper[18707]: I0320 08:52:12.321880 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 20 08:52:13.064924 master-0 kubenswrapper[18707]: I0320 08:52:13.064842 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 20 08:52:13.252291 master-0 kubenswrapper[18707]: I0320 08:52:13.252150 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 08:52:13.504533 master-0 kubenswrapper[18707]: I0320 08:52:13.504443 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 08:52:13.738868 master-0 kubenswrapper[18707]: I0320 08:52:13.738717 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 08:52:14.001599 master-0 kubenswrapper[18707]: I0320 08:52:14.001443 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_85632c1cec8974aa874834e4cfff4c77/startup-monitor/0.log" Mar 20 08:52:14.001599 master-0 kubenswrapper[18707]: I0320 08:52:14.001584 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:52:14.042101 master-0 kubenswrapper[18707]: I0320 08:52:14.042031 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-n6dht" Mar 20 08:52:14.111944 master-0 kubenswrapper[18707]: I0320 08:52:14.111859 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-manifests\") pod \"85632c1cec8974aa874834e4cfff4c77\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " Mar 20 08:52:14.112601 master-0 kubenswrapper[18707]: I0320 08:52:14.111993 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-manifests" (OuterVolumeSpecName: "manifests") pod "85632c1cec8974aa874834e4cfff4c77" (UID: "85632c1cec8974aa874834e4cfff4c77"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:52:14.112601 master-0 kubenswrapper[18707]: I0320 08:52:14.112032 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-pod-resource-dir\") pod \"85632c1cec8974aa874834e4cfff4c77\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " Mar 20 08:52:14.112601 master-0 kubenswrapper[18707]: I0320 08:52:14.112228 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-lock\") pod \"85632c1cec8974aa874834e4cfff4c77\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " Mar 20 08:52:14.112601 master-0 kubenswrapper[18707]: I0320 08:52:14.112303 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-log\") pod \"85632c1cec8974aa874834e4cfff4c77\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " Mar 20 08:52:14.112601 master-0 kubenswrapper[18707]: I0320 08:52:14.112335 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-resource-dir\") pod \"85632c1cec8974aa874834e4cfff4c77\" (UID: \"85632c1cec8974aa874834e4cfff4c77\") " Mar 20 08:52:14.112601 master-0 kubenswrapper[18707]: I0320 08:52:14.112459 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-lock" (OuterVolumeSpecName: "var-lock") pod "85632c1cec8974aa874834e4cfff4c77" (UID: "85632c1cec8974aa874834e4cfff4c77"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:52:14.112601 master-0 kubenswrapper[18707]: I0320 08:52:14.112521 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-log" (OuterVolumeSpecName: "var-log") pod "85632c1cec8974aa874834e4cfff4c77" (UID: "85632c1cec8974aa874834e4cfff4c77"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:52:14.112837 master-0 kubenswrapper[18707]: I0320 08:52:14.112585 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "85632c1cec8974aa874834e4cfff4c77" (UID: "85632c1cec8974aa874834e4cfff4c77"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:52:14.113165 master-0 kubenswrapper[18707]: I0320 08:52:14.113128 18707 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-manifests\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:14.113165 master-0 kubenswrapper[18707]: I0320 08:52:14.113155 18707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:14.113353 master-0 kubenswrapper[18707]: I0320 08:52:14.113168 18707 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-var-log\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:14.113353 master-0 kubenswrapper[18707]: I0320 08:52:14.113198 18707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-resource-dir\") on node \"master-0\" 
DevicePath \"\"" Mar 20 08:52:14.122553 master-0 kubenswrapper[18707]: I0320 08:52:14.122498 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "85632c1cec8974aa874834e4cfff4c77" (UID: "85632c1cec8974aa874834e4cfff4c77"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:52:14.215156 master-0 kubenswrapper[18707]: I0320 08:52:14.215093 18707 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/85632c1cec8974aa874834e4cfff4c77-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:14.232804 master-0 kubenswrapper[18707]: I0320 08:52:14.232743 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 08:52:14.315172 master-0 kubenswrapper[18707]: I0320 08:52:14.315099 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_85632c1cec8974aa874834e4cfff4c77/startup-monitor/0.log" Mar 20 08:52:14.315539 master-0 kubenswrapper[18707]: I0320 08:52:14.315249 18707 generic.go:334] "Generic (PLEG): container finished" podID="85632c1cec8974aa874834e4cfff4c77" containerID="c541ca942411b60e404f791f689440307713f1cb9fe9db3a23ad1c24be33a6e7" exitCode=137 Mar 20 08:52:14.315539 master-0 kubenswrapper[18707]: I0320 08:52:14.315330 18707 scope.go:117] "RemoveContainer" containerID="c541ca942411b60e404f791f689440307713f1cb9fe9db3a23ad1c24be33a6e7" Mar 20 08:52:14.315693 master-0 kubenswrapper[18707]: I0320 08:52:14.315625 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 20 08:52:14.335991 master-0 kubenswrapper[18707]: I0320 08:52:14.335929 18707 scope.go:117] "RemoveContainer" containerID="c541ca942411b60e404f791f689440307713f1cb9fe9db3a23ad1c24be33a6e7" Mar 20 08:52:14.336829 master-0 kubenswrapper[18707]: E0320 08:52:14.336775 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c541ca942411b60e404f791f689440307713f1cb9fe9db3a23ad1c24be33a6e7\": container with ID starting with c541ca942411b60e404f791f689440307713f1cb9fe9db3a23ad1c24be33a6e7 not found: ID does not exist" containerID="c541ca942411b60e404f791f689440307713f1cb9fe9db3a23ad1c24be33a6e7" Mar 20 08:52:14.336918 master-0 kubenswrapper[18707]: I0320 08:52:14.336851 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c541ca942411b60e404f791f689440307713f1cb9fe9db3a23ad1c24be33a6e7"} err="failed to get container status \"c541ca942411b60e404f791f689440307713f1cb9fe9db3a23ad1c24be33a6e7\": rpc error: code = NotFound desc = could not find container \"c541ca942411b60e404f791f689440307713f1cb9fe9db3a23ad1c24be33a6e7\": container with ID starting with c541ca942411b60e404f791f689440307713f1cb9fe9db3a23ad1c24be33a6e7 not found: ID does not exist" Mar 20 08:52:14.665346 master-0 kubenswrapper[18707]: I0320 08:52:14.663883 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-frnfd" Mar 20 08:52:14.778080 master-0 kubenswrapper[18707]: I0320 08:52:14.776898 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 08:52:15.106508 master-0 kubenswrapper[18707]: I0320 08:52:15.106457 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85632c1cec8974aa874834e4cfff4c77" 
path="/var/lib/kubelet/pods/85632c1cec8974aa874834e4cfff4c77/volumes" Mar 20 08:52:23.325351 master-0 kubenswrapper[18707]: I0320 08:52:23.325276 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" podUID="ecd2e0e2-a8c1-42bf-8637-8999030075f1" containerName="oauth-openshift" containerID="cri-o://1c142f6df3e02bb51343a9d48043cfbb00fa968fa9a477f02c864cdd8a5a9600" gracePeriod=15 Mar 20 08:52:23.825143 master-0 kubenswrapper[18707]: I0320 08:52:23.825063 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:52:23.874219 master-0 kubenswrapper[18707]: I0320 08:52:23.874133 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-router-certs\") pod \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " Mar 20 08:52:23.874511 master-0 kubenswrapper[18707]: I0320 08:52:23.874237 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4sz4\" (UniqueName: \"kubernetes.io/projected/ecd2e0e2-a8c1-42bf-8637-8999030075f1-kube-api-access-c4sz4\") pod \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " Mar 20 08:52:23.874511 master-0 kubenswrapper[18707]: I0320 08:52:23.874277 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-error\") pod \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " Mar 20 08:52:23.874511 master-0 kubenswrapper[18707]: I0320 08:52:23.874300 18707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-service-ca\") pod \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " Mar 20 08:52:23.874511 master-0 kubenswrapper[18707]: I0320 08:52:23.874344 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-login\") pod \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " Mar 20 08:52:23.874511 master-0 kubenswrapper[18707]: I0320 08:52:23.874361 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-ocp-branding-template\") pod \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " Mar 20 08:52:23.874511 master-0 kubenswrapper[18707]: I0320 08:52:23.874383 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-session\") pod \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " Mar 20 08:52:23.874511 master-0 kubenswrapper[18707]: I0320 08:52:23.874413 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecd2e0e2-a8c1-42bf-8637-8999030075f1-audit-dir\") pod \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " Mar 20 08:52:23.874511 master-0 kubenswrapper[18707]: I0320 08:52:23.874462 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-trusted-ca-bundle\") pod \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " Mar 20 08:52:23.874798 master-0 kubenswrapper[18707]: I0320 08:52:23.874528 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-provider-selection\") pod \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " Mar 20 08:52:23.874798 master-0 kubenswrapper[18707]: I0320 08:52:23.874547 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-audit-policies\") pod \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " Mar 20 08:52:23.874798 master-0 kubenswrapper[18707]: I0320 08:52:23.874574 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-serving-cert\") pod \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " Mar 20 08:52:23.874798 master-0 kubenswrapper[18707]: I0320 08:52:23.874592 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-cliconfig\") pod \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\" (UID: \"ecd2e0e2-a8c1-42bf-8637-8999030075f1\") " Mar 20 08:52:23.876928 master-0 kubenswrapper[18707]: I0320 08:52:23.875346 18707 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/ecd2e0e2-a8c1-42bf-8637-8999030075f1-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ecd2e0e2-a8c1-42bf-8637-8999030075f1" (UID: "ecd2e0e2-a8c1-42bf-8637-8999030075f1"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:52:23.876928 master-0 kubenswrapper[18707]: I0320 08:52:23.875522 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ecd2e0e2-a8c1-42bf-8637-8999030075f1" (UID: "ecd2e0e2-a8c1-42bf-8637-8999030075f1"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:52:23.876928 master-0 kubenswrapper[18707]: I0320 08:52:23.875548 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ecd2e0e2-a8c1-42bf-8637-8999030075f1" (UID: "ecd2e0e2-a8c1-42bf-8637-8999030075f1"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:52:23.876928 master-0 kubenswrapper[18707]: I0320 08:52:23.875787 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ecd2e0e2-a8c1-42bf-8637-8999030075f1" (UID: "ecd2e0e2-a8c1-42bf-8637-8999030075f1"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:52:23.876928 master-0 kubenswrapper[18707]: I0320 08:52:23.876228 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ecd2e0e2-a8c1-42bf-8637-8999030075f1" (UID: "ecd2e0e2-a8c1-42bf-8637-8999030075f1"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:52:23.879135 master-0 kubenswrapper[18707]: I0320 08:52:23.878362 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ecd2e0e2-a8c1-42bf-8637-8999030075f1" (UID: "ecd2e0e2-a8c1-42bf-8637-8999030075f1"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:52:23.879135 master-0 kubenswrapper[18707]: I0320 08:52:23.878401 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ecd2e0e2-a8c1-42bf-8637-8999030075f1" (UID: "ecd2e0e2-a8c1-42bf-8637-8999030075f1"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:52:23.879135 master-0 kubenswrapper[18707]: I0320 08:52:23.879018 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ecd2e0e2-a8c1-42bf-8637-8999030075f1" (UID: "ecd2e0e2-a8c1-42bf-8637-8999030075f1"). 
InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:52:23.882135 master-0 kubenswrapper[18707]: I0320 08:52:23.879343 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ecd2e0e2-a8c1-42bf-8637-8999030075f1" (UID: "ecd2e0e2-a8c1-42bf-8637-8999030075f1"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:52:23.882135 master-0 kubenswrapper[18707]: I0320 08:52:23.879400 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd2e0e2-a8c1-42bf-8637-8999030075f1-kube-api-access-c4sz4" (OuterVolumeSpecName: "kube-api-access-c4sz4") pod "ecd2e0e2-a8c1-42bf-8637-8999030075f1" (UID: "ecd2e0e2-a8c1-42bf-8637-8999030075f1"). InnerVolumeSpecName "kube-api-access-c4sz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:52:23.882135 master-0 kubenswrapper[18707]: I0320 08:52:23.879598 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ecd2e0e2-a8c1-42bf-8637-8999030075f1" (UID: "ecd2e0e2-a8c1-42bf-8637-8999030075f1"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:52:23.882135 master-0 kubenswrapper[18707]: I0320 08:52:23.880653 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ecd2e0e2-a8c1-42bf-8637-8999030075f1" (UID: "ecd2e0e2-a8c1-42bf-8637-8999030075f1"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:52:23.882135 master-0 kubenswrapper[18707]: I0320 08:52:23.881893 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ecd2e0e2-a8c1-42bf-8637-8999030075f1" (UID: "ecd2e0e2-a8c1-42bf-8637-8999030075f1"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:52:23.976734 master-0 kubenswrapper[18707]: I0320 08:52:23.976639 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:23.976734 master-0 kubenswrapper[18707]: I0320 08:52:23.976704 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:23.976734 master-0 kubenswrapper[18707]: I0320 08:52:23.976716 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:23.976734 master-0 kubenswrapper[18707]: I0320 08:52:23.976733 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4sz4\" (UniqueName: \"kubernetes.io/projected/ecd2e0e2-a8c1-42bf-8637-8999030075f1-kube-api-access-c4sz4\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:23.976734 master-0 kubenswrapper[18707]: I0320 08:52:23.976748 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:23.976734 master-0 kubenswrapper[18707]: I0320 08:52:23.976757 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:23.976734 master-0 kubenswrapper[18707]: I0320 
08:52:23.976768 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:23.977363 master-0 kubenswrapper[18707]: I0320 08:52:23.976781 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:23.977363 master-0 kubenswrapper[18707]: I0320 08:52:23.976796 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:23.977363 master-0 kubenswrapper[18707]: I0320 08:52:23.976808 18707 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecd2e0e2-a8c1-42bf-8637-8999030075f1-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:23.977363 master-0 kubenswrapper[18707]: I0320 08:52:23.976819 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:23.977363 master-0 kubenswrapper[18707]: I0320 08:52:23.976828 18707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecd2e0e2-a8c1-42bf-8637-8999030075f1-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:23.977363 master-0 kubenswrapper[18707]: I0320 08:52:23.976839 18707 reconciler_common.go:293] "Volume detached for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecd2e0e2-a8c1-42bf-8637-8999030075f1-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 20 08:52:24.402792 master-0 kubenswrapper[18707]: I0320 08:52:24.402707 18707 generic.go:334] "Generic (PLEG): container finished" podID="ecd2e0e2-a8c1-42bf-8637-8999030075f1" containerID="1c142f6df3e02bb51343a9d48043cfbb00fa968fa9a477f02c864cdd8a5a9600" exitCode=0 Mar 20 08:52:24.403874 master-0 kubenswrapper[18707]: I0320 08:52:24.402847 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" Mar 20 08:52:24.404693 master-0 kubenswrapper[18707]: I0320 08:52:24.404029 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" event={"ID":"ecd2e0e2-a8c1-42bf-8637-8999030075f1","Type":"ContainerDied","Data":"1c142f6df3e02bb51343a9d48043cfbb00fa968fa9a477f02c864cdd8a5a9600"} Mar 20 08:52:24.405449 master-0 kubenswrapper[18707]: I0320 08:52:24.405402 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-dcb9594d9-wlht7" event={"ID":"ecd2e0e2-a8c1-42bf-8637-8999030075f1","Type":"ContainerDied","Data":"ceb3ffd79d8b395b6e52a5b41ec3cd1c27194d6f64a2f6a384db2267eb73bcfa"} Mar 20 08:52:24.408979 master-0 kubenswrapper[18707]: I0320 08:52:24.406294 18707 scope.go:117] "RemoveContainer" containerID="1c142f6df3e02bb51343a9d48043cfbb00fa968fa9a477f02c864cdd8a5a9600" Mar 20 08:52:24.431645 master-0 kubenswrapper[18707]: I0320 08:52:24.431586 18707 scope.go:117] "RemoveContainer" containerID="1c142f6df3e02bb51343a9d48043cfbb00fa968fa9a477f02c864cdd8a5a9600" Mar 20 08:52:24.432371 master-0 kubenswrapper[18707]: E0320 08:52:24.432094 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c142f6df3e02bb51343a9d48043cfbb00fa968fa9a477f02c864cdd8a5a9600\": container 
with ID starting with 1c142f6df3e02bb51343a9d48043cfbb00fa968fa9a477f02c864cdd8a5a9600 not found: ID does not exist" containerID="1c142f6df3e02bb51343a9d48043cfbb00fa968fa9a477f02c864cdd8a5a9600" Mar 20 08:52:24.432371 master-0 kubenswrapper[18707]: I0320 08:52:24.432148 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c142f6df3e02bb51343a9d48043cfbb00fa968fa9a477f02c864cdd8a5a9600"} err="failed to get container status \"1c142f6df3e02bb51343a9d48043cfbb00fa968fa9a477f02c864cdd8a5a9600\": rpc error: code = NotFound desc = could not find container \"1c142f6df3e02bb51343a9d48043cfbb00fa968fa9a477f02c864cdd8a5a9600\": container with ID starting with 1c142f6df3e02bb51343a9d48043cfbb00fa968fa9a477f02c864cdd8a5a9600 not found: ID does not exist" Mar 20 08:52:24.529977 master-0 kubenswrapper[18707]: I0320 08:52:24.529878 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-dcb9594d9-wlht7"] Mar 20 08:52:24.617575 master-0 kubenswrapper[18707]: I0320 08:52:24.617473 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-dcb9594d9-wlht7"] Mar 20 08:52:25.102786 master-0 kubenswrapper[18707]: I0320 08:52:25.102743 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecd2e0e2-a8c1-42bf-8637-8999030075f1" path="/var/lib/kubelet/pods/ecd2e0e2-a8c1-42bf-8637-8999030075f1/volumes" Mar 20 08:52:28.414449 master-0 kubenswrapper[18707]: I0320 08:52:28.413141 18707 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-mvn4t container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" start-of-body= Mar 20 08:52:28.414449 master-0 kubenswrapper[18707]: I0320 08:52:28.413860 18707 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" podUID="acb704a9-6c8d-4378-ae93-e7095b1fce85" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" Mar 20 08:52:28.448217 master-0 kubenswrapper[18707]: I0320 08:52:28.447452 18707 generic.go:334] "Generic (PLEG): container finished" podID="acb704a9-6c8d-4378-ae93-e7095b1fce85" containerID="d5730ce99ce84daa469646e6254d205d4c489fad363bcb80a9e011d6d98bb833" exitCode=0 Mar 20 08:52:28.448217 master-0 kubenswrapper[18707]: I0320 08:52:28.447518 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" event={"ID":"acb704a9-6c8d-4378-ae93-e7095b1fce85","Type":"ContainerDied","Data":"d5730ce99ce84daa469646e6254d205d4c489fad363bcb80a9e011d6d98bb833"} Mar 20 08:52:28.448217 master-0 kubenswrapper[18707]: I0320 08:52:28.447574 18707 scope.go:117] "RemoveContainer" containerID="4edf8229d20f78af8d6c9bd895781409ea67ff831e7f63eef258c5dd26b7d38d" Mar 20 08:52:28.448217 master-0 kubenswrapper[18707]: I0320 08:52:28.448137 18707 scope.go:117] "RemoveContainer" containerID="d5730ce99ce84daa469646e6254d205d4c489fad363bcb80a9e011d6d98bb833" Mar 20 08:52:28.448574 master-0 kubenswrapper[18707]: E0320 08:52:28.448387 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-89ccd998f-mvn4t_openshift-marketplace(acb704a9-6c8d-4378-ae93-e7095b1fce85)\"" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" podUID="acb704a9-6c8d-4378-ae93-e7095b1fce85" Mar 20 08:52:31.580299 master-0 kubenswrapper[18707]: I0320 08:52:31.580245 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 
08:52:31.581859 master-0 kubenswrapper[18707]: I0320 08:52:31.581780 18707 scope.go:117] "RemoveContainer" containerID="d5730ce99ce84daa469646e6254d205d4c489fad363bcb80a9e011d6d98bb833" Mar 20 08:52:31.582510 master-0 kubenswrapper[18707]: E0320 08:52:31.582466 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-89ccd998f-mvn4t_openshift-marketplace(acb704a9-6c8d-4378-ae93-e7095b1fce85)\"" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" podUID="acb704a9-6c8d-4378-ae93-e7095b1fce85" Mar 20 08:52:38.410895 master-0 kubenswrapper[18707]: I0320 08:52:38.410507 18707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:52:38.412636 master-0 kubenswrapper[18707]: I0320 08:52:38.411655 18707 scope.go:117] "RemoveContainer" containerID="d5730ce99ce84daa469646e6254d205d4c489fad363bcb80a9e011d6d98bb833" Mar 20 08:52:39.543235 master-0 kubenswrapper[18707]: I0320 08:52:39.543142 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" event={"ID":"acb704a9-6c8d-4378-ae93-e7095b1fce85","Type":"ContainerStarted","Data":"dce0dde7f3c728ea8ee2f22184c1e8ecc97c19fa7b0e44f68d94cfbdecd46f44"} Mar 20 08:52:39.544230 master-0 kubenswrapper[18707]: I0320 08:52:39.544201 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:52:39.547595 master-0 kubenswrapper[18707]: I0320 08:52:39.547551 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-mvn4t" Mar 20 08:52:49.414752 master-0 kubenswrapper[18707]: I0320 08:52:49.414634 18707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/downloads-66b8ffb895-dgq8g"] Mar 20 08:52:49.415459 master-0 kubenswrapper[18707]: E0320 08:52:49.415111 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85632c1cec8974aa874834e4cfff4c77" containerName="startup-monitor" Mar 20 08:52:49.415459 master-0 kubenswrapper[18707]: I0320 08:52:49.415132 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="85632c1cec8974aa874834e4cfff4c77" containerName="startup-monitor" Mar 20 08:52:49.415459 master-0 kubenswrapper[18707]: E0320 08:52:49.415166 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" containerName="installer" Mar 20 08:52:49.415459 master-0 kubenswrapper[18707]: I0320 08:52:49.415174 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" containerName="installer" Mar 20 08:52:49.415459 master-0 kubenswrapper[18707]: E0320 08:52:49.415228 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd2e0e2-a8c1-42bf-8637-8999030075f1" containerName="oauth-openshift" Mar 20 08:52:49.415459 master-0 kubenswrapper[18707]: I0320 08:52:49.415238 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd2e0e2-a8c1-42bf-8637-8999030075f1" containerName="oauth-openshift" Mar 20 08:52:49.415459 master-0 kubenswrapper[18707]: I0320 08:52:49.415404 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="85632c1cec8974aa874834e4cfff4c77" containerName="startup-monitor" Mar 20 08:52:49.415459 master-0 kubenswrapper[18707]: I0320 08:52:49.415422 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b73fa6-b86b-4b65-826c-8f139d45c3d4" containerName="installer" Mar 20 08:52:49.415459 master-0 kubenswrapper[18707]: I0320 08:52:49.415440 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd2e0e2-a8c1-42bf-8637-8999030075f1" containerName="oauth-openshift" Mar 20 08:52:49.416201 master-0 
kubenswrapper[18707]: I0320 08:52:49.416151 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-66b8ffb895-dgq8g" Mar 20 08:52:49.418763 master-0 kubenswrapper[18707]: I0320 08:52:49.418702 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-zkhzs" Mar 20 08:52:49.419516 master-0 kubenswrapper[18707]: I0320 08:52:49.419481 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 08:52:49.419691 master-0 kubenswrapper[18707]: I0320 08:52:49.419665 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 08:52:49.427861 master-0 kubenswrapper[18707]: I0320 08:52:49.427809 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-58857c5dc9-9wf6k"] Mar 20 08:52:49.429334 master-0 kubenswrapper[18707]: I0320 08:52:49.429306 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58857c5dc9-9wf6k" Mar 20 08:52:49.432268 master-0 kubenswrapper[18707]: I0320 08:52:49.432201 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-psbxb" Mar 20 08:52:49.432477 master-0 kubenswrapper[18707]: I0320 08:52:49.432289 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 08:52:49.432477 master-0 kubenswrapper[18707]: I0320 08:52:49.432221 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 08:52:49.432697 master-0 kubenswrapper[18707]: I0320 08:52:49.432665 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 08:52:49.435391 master-0 kubenswrapper[18707]: I0320 08:52:49.435349 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 08:52:49.438518 master-0 kubenswrapper[18707]: I0320 08:52:49.438461 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 08:52:49.438808 master-0 kubenswrapper[18707]: I0320 08:52:49.438780 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-69c949f678-v7wb6"] Mar 20 08:52:49.440455 master-0 kubenswrapper[18707]: I0320 08:52:49.440414 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-69c949f678-v7wb6" Mar 20 08:52:49.447415 master-0 kubenswrapper[18707]: I0320 08:52:49.447290 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 20 08:52:49.450159 master-0 kubenswrapper[18707]: I0320 08:52:49.450098 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 08:52:49.454585 master-0 kubenswrapper[18707]: I0320 08:52:49.452515 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.455305 master-0 kubenswrapper[18707]: I0320 08:52:49.455262 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 20 08:52:49.455684 master-0 kubenswrapper[18707]: I0320 08:52:49.455527 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-fofds5kbvc1lq" Mar 20 08:52:49.455810 master-0 kubenswrapper[18707]: I0320 08:52:49.455751 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 20 08:52:49.455982 master-0 kubenswrapper[18707]: I0320 08:52:49.455953 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 20 08:52:49.456080 master-0 kubenswrapper[18707]: I0320 08:52:49.456048 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 20 08:52:49.456367 master-0 kubenswrapper[18707]: I0320 08:52:49.456138 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 20 08:52:49.456367 master-0 kubenswrapper[18707]: I0320 08:52:49.456329 18707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"prometheus-k8s" Mar 20 08:52:49.456680 master-0 kubenswrapper[18707]: I0320 08:52:49.456641 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 20 08:52:49.463155 master-0 kubenswrapper[18707]: I0320 08:52:49.463095 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-ffb45cdb5-7r4lm"] Mar 20 08:52:49.464232 master-0 kubenswrapper[18707]: I0320 08:52:49.463935 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 20 08:52:49.465365 master-0 kubenswrapper[18707]: I0320 08:52:49.465333 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ffb45cdb5-7r4lm" Mar 20 08:52:49.465852 master-0 kubenswrapper[18707]: I0320 08:52:49.465538 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 20 08:52:49.465852 master-0 kubenswrapper[18707]: I0320 08:52:49.465562 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 20 08:52:49.467156 master-0 kubenswrapper[18707]: I0320 08:52:49.467104 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 20 08:52:49.468820 master-0 kubenswrapper[18707]: I0320 08:52:49.468799 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"] Mar 20 08:52:49.469969 master-0 kubenswrapper[18707]: I0320 08:52:49.469947 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.474830 master-0 kubenswrapper[18707]: I0320 08:52:49.474808 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 08:52:49.480551 master-0 kubenswrapper[18707]: I0320 08:52:49.476481 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 20 08:52:49.482665 master-0 kubenswrapper[18707]: I0320 08:52:49.475027 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 08:52:49.482665 master-0 kubenswrapper[18707]: I0320 08:52:49.475223 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 08:52:49.482665 master-0 kubenswrapper[18707]: I0320 08:52:49.475374 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 08:52:49.482665 master-0 kubenswrapper[18707]: I0320 08:52:49.475420 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 08:52:49.482665 master-0 kubenswrapper[18707]: I0320 08:52:49.475702 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 08:52:49.482665 master-0 kubenswrapper[18707]: I0320 08:52:49.476350 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 08:52:49.482665 master-0 kubenswrapper[18707]: I0320 08:52:49.476395 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 08:52:49.482665 master-0 kubenswrapper[18707]: I0320 08:52:49.476720 18707 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 08:52:49.482665 master-0 kubenswrapper[18707]: I0320 08:52:49.482162 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-scgh2" Mar 20 08:52:49.482665 master-0 kubenswrapper[18707]: I0320 08:52:49.476983 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 08:52:49.482665 master-0 kubenswrapper[18707]: I0320 08:52:49.477671 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 08:52:49.489615 master-0 kubenswrapper[18707]: I0320 08:52:49.489448 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 08:52:49.515471 master-0 kubenswrapper[18707]: I0320 08:52:49.515117 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-66b8ffb895-dgq8g"] Mar 20 08:52:49.516699 master-0 kubenswrapper[18707]: I0320 08:52:49.516681 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58857c5dc9-9wf6k"] Mar 20 08:52:49.516993 master-0 kubenswrapper[18707]: I0320 08:52:49.516828 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:52:49.526870 master-0 kubenswrapper[18707]: I0320 08:52:49.526814 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 20 08:52:49.527207 master-0 kubenswrapper[18707]: I0320 08:52:49.527168 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 20 08:52:49.529678 master-0 kubenswrapper[18707]: I0320 08:52:49.529565 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 08:52:49.530172 master-0 kubenswrapper[18707]: I0320 08:52:49.530147 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 20 08:52:49.534923 master-0 kubenswrapper[18707]: I0320 08:52:49.534886 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"] Mar 20 08:52:49.536796 master-0 kubenswrapper[18707]: I0320 08:52:49.536754 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:52:49.537534 master-0 kubenswrapper[18707]: I0320 08:52:49.537502 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 
08:52:49.537614 master-0 kubenswrapper[18707]: I0320 08:52:49.537553 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/13126442-5154-471a-97f7-fa6d917c1ba1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:52:49.537614 master-0 kubenswrapper[18707]: I0320 08:52:49.537584 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb5l6\" (UniqueName: \"kubernetes.io/projected/13126442-5154-471a-97f7-fa6d917c1ba1-kube-api-access-hb5l6\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:52:49.537716 master-0 kubenswrapper[18707]: I0320 08:52:49.537642 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-web-config\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:52:49.538375 master-0 kubenswrapper[18707]: I0320 08:52:49.538350 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-config-volume\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 08:52:49.538452 master-0 kubenswrapper[18707]: I0320 08:52:49.538393 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/13126442-5154-471a-97f7-fa6d917c1ba1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.538452 master-0 kubenswrapper[18707]: I0320 08:52:49.538437 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.538654 master-0 kubenswrapper[18707]: I0320 08:52:49.538457 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjgk6\" (UniqueName: \"kubernetes.io/projected/dfad7071-5d2b-4ffb-a260-0e084ef08c99-kube-api-access-zjgk6\") pod \"downloads-66b8ffb895-dgq8g\" (UID: \"dfad7071-5d2b-4ffb-a260-0e084ef08c99\") " pod="openshift-console/downloads-66b8ffb895-dgq8g"
Mar 20 08:52:49.538654 master-0 kubenswrapper[18707]: I0320 08:52:49.538491 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.539059 master-0 kubenswrapper[18707]: I0320 08:52:49.538924 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/13126442-5154-471a-97f7-fa6d917c1ba1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.539131 master-0 kubenswrapper[18707]: I0320 08:52:49.539095 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/13126442-5154-471a-97f7-fa6d917c1ba1-config-out\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.539174 master-0 kubenswrapper[18707]: I0320 08:52:49.539143 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13126442-5154-471a-97f7-fa6d917c1ba1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.541733 master-0 kubenswrapper[18707]: I0320 08:52:49.541616 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Mar 20 08:52:49.541830 master-0 kubenswrapper[18707]: I0320 08:52:49.541549 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Mar 20 08:52:49.542350 master-0 kubenswrapper[18707]: I0320 08:52:49.542307 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 20 08:52:49.543595 master-0 kubenswrapper[18707]: I0320 08:52:49.543553 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 20 08:52:49.543710 master-0 kubenswrapper[18707]: I0320 08:52:49.543648 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 20 08:52:49.550368 master-0 kubenswrapper[18707]: I0320 08:52:49.550322 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 20 08:52:49.552066 master-0 kubenswrapper[18707]: I0320 08:52:49.552030 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 20 08:52:49.559548 master-0 kubenswrapper[18707]: I0320 08:52:49.558907 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ffb45cdb5-7r4lm"]
Mar 20 08:52:49.575463 master-0 kubenswrapper[18707]: I0320 08:52:49.575409 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-69c949f678-v7wb6"]
Mar 20 08:52:49.643040 master-0 kubenswrapper[18707]: I0320 08:52:49.642951 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-web-config\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.643040 master-0 kubenswrapper[18707]: I0320 08:52:49.643042 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.643513 master-0 kubenswrapper[18707]: I0320 08:52:49.643079 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2mgj\" (UniqueName: \"kubernetes.io/projected/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-kube-api-access-f2mgj\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.643513 master-0 kubenswrapper[18707]: I0320 08:52:49.643113 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.643513 master-0 kubenswrapper[18707]: I0320 08:52:49.643141 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-config-out\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.643513 master-0 kubenswrapper[18707]: I0320 08:52:49.643221 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-user-template-error\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.643513 master-0 kubenswrapper[18707]: I0320 08:52:49.643305 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.643513 master-0 kubenswrapper[18707]: I0320 08:52:49.643378 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:49.643513 master-0 kubenswrapper[18707]: I0320 08:52:49.643415 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-user-template-login\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.643513 master-0 kubenswrapper[18707]: I0320 08:52:49.643454 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-config-volume\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.643513 master-0 kubenswrapper[18707]: I0320 08:52:49.643479 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-config\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:49.643513 master-0 kubenswrapper[18707]: I0320 08:52:49.643504 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-oauth-serving-cert\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:49.644123 master-0 kubenswrapper[18707]: I0320 08:52:49.643529 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89b00efb-dd09-42fd-824e-e5e317962bb3-audit-dir\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.644123 master-0 kubenswrapper[18707]: I0320 08:52:49.643595 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.644123 master-0 kubenswrapper[18707]: I0320 08:52:49.643621 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.644123 master-0 kubenswrapper[18707]: I0320 08:52:49.643657 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.644456 master-0 kubenswrapper[18707]: I0320 08:52:49.644274 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.644456 master-0 kubenswrapper[18707]: I0320 08:52:49.644309 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.644456 master-0 kubenswrapper[18707]: I0320 08:52:49.644342 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/13126442-5154-471a-97f7-fa6d917c1ba1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.644456 master-0 kubenswrapper[18707]: I0320 08:52:49.644371 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:52:49.644456 master-0 kubenswrapper[18707]: I0320 08:52:49.644394 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-trusted-ca-bundle\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:49.644456 master-0 kubenswrapper[18707]: I0320 08:52:49.644428 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g865c\" (UniqueName: \"kubernetes.io/projected/89b00efb-dd09-42fd-824e-e5e317962bb3-kube-api-access-g865c\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.644776 master-0 kubenswrapper[18707]: I0320 08:52:49.644526 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.644776 master-0 kubenswrapper[18707]: I0320 08:52:49.644552 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-console-config\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:52:49.644776 master-0 kubenswrapper[18707]: I0320 08:52:49.644590 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx59m\" (UniqueName: \"kubernetes.io/projected/a22aca95-adb1-41a5-bf0f-fb10574ad5a5-kube-api-access-wx59m\") pod \"multus-admission-controller-69c949f678-v7wb6\" (UID: \"a22aca95-adb1-41a5-bf0f-fb10574ad5a5\") " pod="openshift-multus/multus-admission-controller-69c949f678-v7wb6"
Mar 20 08:52:49.644776 master-0 kubenswrapper[18707]: I0320 08:52:49.644634 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.644776 master-0 kubenswrapper[18707]: I0320 08:52:49.644664 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjgk6\" (UniqueName: \"kubernetes.io/projected/dfad7071-5d2b-4ffb-a260-0e084ef08c99-kube-api-access-zjgk6\") pod \"downloads-66b8ffb895-dgq8g\" (UID: \"dfad7071-5d2b-4ffb-a260-0e084ef08c99\") " pod="openshift-console/downloads-66b8ffb895-dgq8g"
Mar 20 08:52:49.644776 master-0 kubenswrapper[18707]: I0320 08:52:49.644695 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.644776 master-0 kubenswrapper[18707]: I0320 08:52:49.644752 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-service-ca\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:52:49.645050 master-0 kubenswrapper[18707]: I0320 08:52:49.644782 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-session\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.645050 master-0 kubenswrapper[18707]: I0320 08:52:49.644813 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.645050 master-0 kubenswrapper[18707]: I0320 08:52:49.644841 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.645050 master-0 kubenswrapper[18707]: I0320 08:52:49.644872 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.645050 master-0 kubenswrapper[18707]: I0320 08:52:49.644897 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-config\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.645050 master-0 kubenswrapper[18707]: I0320 08:52:49.644944 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/13126442-5154-471a-97f7-fa6d917c1ba1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.645050 master-0 kubenswrapper[18707]: I0320 08:52:49.644972 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.645050 master-0 kubenswrapper[18707]: I0320 08:52:49.645008 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-router-certs\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.645050 master-0 kubenswrapper[18707]: I0320 08:52:49.645044 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/13126442-5154-471a-97f7-fa6d917c1ba1-config-out\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.645336 master-0 kubenswrapper[18707]: I0320 08:52:49.645074 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13126442-5154-471a-97f7-fa6d917c1ba1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.645336 master-0 kubenswrapper[18707]: I0320 08:52:49.645103 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-oauth-config\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:49.645336 master-0 kubenswrapper[18707]: I0320 08:52:49.645127 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.645336 master-0 kubenswrapper[18707]: I0320 08:52:49.645159 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.645336 master-0 kubenswrapper[18707]: I0320 08:52:49.645209 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a22aca95-adb1-41a5-bf0f-fb10574ad5a5-webhook-certs\") pod \"multus-admission-controller-69c949f678-v7wb6\" (UID: \"a22aca95-adb1-41a5-bf0f-fb10574ad5a5\") " pod="openshift-multus/multus-admission-controller-69c949f678-v7wb6"
Mar 20 08:52:49.645336 master-0 kubenswrapper[18707]: I0320 08:52:49.645245 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.645336 master-0 kubenswrapper[18707]: I0320 08:52:49.645270 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-service-ca\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.645336 master-0 kubenswrapper[18707]: I0320 08:52:49.645312 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.645336 master-0 kubenswrapper[18707]: I0320 08:52:49.645338 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-oauth-config\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:52:49.645613 master-0 kubenswrapper[18707]: I0320 08:52:49.645374 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.645613 master-0 kubenswrapper[18707]: I0320 08:52:49.645404 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwk2g\" (UniqueName: \"kubernetes.io/projected/2ec18453-e818-426d-b2d5-e8364dc2067b-kube-api-access-nwk2g\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:52:49.645613 master-0 kubenswrapper[18707]: I0320 08:52:49.645434 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.645613 master-0 kubenswrapper[18707]: I0320 08:52:49.645468 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-oauth-serving-cert\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:52:49.645613 master-0 kubenswrapper[18707]: I0320 08:52:49.645494 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzkd2\" (UniqueName: \"kubernetes.io/projected/4a7bfb28-46ef-431e-8f40-da7e723613d6-kube-api-access-dzkd2\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:49.645613 master-0 kubenswrapper[18707]: I0320 08:52:49.645517 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/89b00efb-dd09-42fd-824e-e5e317962bb3-audit-policies\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.645613 master-0 kubenswrapper[18707]: I0320 08:52:49.645552 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/13126442-5154-471a-97f7-fa6d917c1ba1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.645613 master-0 kubenswrapper[18707]: I0320 08:52:49.645584 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.645613 master-0 kubenswrapper[18707]: I0320 08:52:49.645612 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb5l6\" (UniqueName: \"kubernetes.io/projected/13126442-5154-471a-97f7-fa6d917c1ba1-kube-api-access-hb5l6\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.645931 master-0 kubenswrapper[18707]: I0320 08:52:49.645644 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.645931 master-0 kubenswrapper[18707]: I0320 08:52:49.645670 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-web-config\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.645931 master-0 kubenswrapper[18707]: I0320 08:52:49.645698 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-service-ca\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:49.646868 master-0 kubenswrapper[18707]: I0320 08:52:49.646813 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-web-config\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.647059 master-0 kubenswrapper[18707]: I0320 08:52:49.646924 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/13126442-5154-471a-97f7-fa6d917c1ba1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.648172 master-0 kubenswrapper[18707]: I0320 08:52:49.648091 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/13126442-5154-471a-97f7-fa6d917c1ba1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.648661 master-0 kubenswrapper[18707]: I0320 08:52:49.648635 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13126442-5154-471a-97f7-fa6d917c1ba1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.650230 master-0 kubenswrapper[18707]: I0320 08:52:49.649856 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.650476 master-0 kubenswrapper[18707]: I0320 08:52:49.650398 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.650476 master-0 kubenswrapper[18707]: I0320 08:52:49.650437 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-config-volume\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.651272 master-0 kubenswrapper[18707]: I0320 08:52:49.651244 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/13126442-5154-471a-97f7-fa6d917c1ba1-config-out\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.651551 master-0 kubenswrapper[18707]: I0320 08:52:49.651505 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/13126442-5154-471a-97f7-fa6d917c1ba1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.653254 master-0 kubenswrapper[18707]: I0320 08:52:49.653148 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.658259 master-0 kubenswrapper[18707]: I0320 08:52:49.656556 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/13126442-5154-471a-97f7-fa6d917c1ba1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.664886 master-0 kubenswrapper[18707]: I0320 08:52:49.664814 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjgk6\" (UniqueName: \"kubernetes.io/projected/dfad7071-5d2b-4ffb-a260-0e084ef08c99-kube-api-access-zjgk6\") pod \"downloads-66b8ffb895-dgq8g\" (UID: \"dfad7071-5d2b-4ffb-a260-0e084ef08c99\") " pod="openshift-console/downloads-66b8ffb895-dgq8g"
Mar 20 08:52:49.665458 master-0 kubenswrapper[18707]: I0320 08:52:49.665392 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb5l6\" (UniqueName: \"kubernetes.io/projected/13126442-5154-471a-97f7-fa6d917c1ba1-kube-api-access-hb5l6\") pod \"alertmanager-main-0\" (UID: \"13126442-5154-471a-97f7-fa6d917c1ba1\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:49.744315 master-0 kubenswrapper[18707]: I0320 08:52:49.744238 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-66b8ffb895-dgq8g"
Mar 20 08:52:49.747660 master-0 kubenswrapper[18707]: I0320 08:52:49.747589 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-oauth-serving-cert\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:52:49.747981 master-0 kubenswrapper[18707]: I0320 08:52:49.747928 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzkd2\" (UniqueName: \"kubernetes.io/projected/4a7bfb28-46ef-431e-8f40-da7e723613d6-kube-api-access-dzkd2\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:49.748051 master-0 kubenswrapper[18707]: I0320 08:52:49.748025 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/89b00efb-dd09-42fd-824e-e5e317962bb3-audit-policies\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.748112 master-0 kubenswrapper[18707]: I0320 08:52:49.748088 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.748156 master-0 kubenswrapper[18707]: I0320 08:52:49.748126 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.748156 master-0 kubenswrapper[18707]: I0320 08:52:49.748149 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-web-config\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.748274 master-0 kubenswrapper[18707]: I0320 08:52:49.748205 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-service-ca\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:49.748274 master-0 kubenswrapper[18707]: I0320 08:52:49.748250 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.748274 master-0 kubenswrapper[18707]: I0320 08:52:49.748271 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2mgj\" (UniqueName: \"kubernetes.io/projected/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-kube-api-access-f2mgj\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.748380 master-0 kubenswrapper[18707]: I0320 08:52:49.748292 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.748380 master-0 kubenswrapper[18707]: I0320 08:52:49.748359 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-config-out\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.748452 master-0 kubenswrapper[18707]: I0320 08:52:49.748387 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-user-template-error\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.748452 master-0 kubenswrapper[18707]: I0320 08:52:49.748413 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.748452 master-0 kubenswrapper[18707]: I0320 08:52:49.748440 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:49.748923 master-0 kubenswrapper[18707]: I0320
08:52:49.748885 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-oauth-serving-cert\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm" Mar 20 08:52:49.749351 master-0 kubenswrapper[18707]: I0320 08:52:49.749308 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/89b00efb-dd09-42fd-824e-e5e317962bb3-audit-policies\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.749576 master-0 kubenswrapper[18707]: I0320 08:52:49.749545 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.749928 master-0 kubenswrapper[18707]: E0320 08:52:49.749898 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found Mar 20 08:52:49.750006 master-0 kubenswrapper[18707]: E0320 08:52:49.749989 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert podName:4a7bfb28-46ef-431e-8f40-da7e723613d6 nodeName:}" failed. No retries permitted until 2026-03-20 08:52:50.249965124 +0000 UTC m=+715.406145480 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert") pod "console-58857c5dc9-9wf6k" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6") : secret "console-serving-cert" not found Mar 20 08:52:49.750170 master-0 kubenswrapper[18707]: I0320 08:52:49.750135 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.750255 master-0 kubenswrapper[18707]: I0320 08:52:49.750229 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-user-template-login\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.750306 master-0 kubenswrapper[18707]: I0320 08:52:49.750272 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-config\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k" Mar 20 08:52:49.750353 master-0 kubenswrapper[18707]: I0320 08:52:49.750304 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-oauth-serving-cert\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k" Mar 20 08:52:49.750353 
master-0 kubenswrapper[18707]: I0320 08:52:49.750312 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.750353 master-0 kubenswrapper[18707]: I0320 08:52:49.750327 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89b00efb-dd09-42fd-824e-e5e317962bb3-audit-dir\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.750476 master-0 kubenswrapper[18707]: I0320 08:52:49.750378 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.750476 master-0 kubenswrapper[18707]: I0320 08:52:49.750419 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.750476 master-0 kubenswrapper[18707]: I0320 08:52:49.750460 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.750601 master-0 kubenswrapper[18707]: I0320 08:52:49.750490 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.750601 master-0 kubenswrapper[18707]: I0320 08:52:49.750518 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.750601 master-0 kubenswrapper[18707]: I0320 08:52:49.750553 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm" Mar 20 08:52:49.750601 master-0 kubenswrapper[18707]: I0320 08:52:49.750583 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-trusted-ca-bundle\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k" Mar 20 08:52:49.750758 master-0 kubenswrapper[18707]: I0320 08:52:49.750624 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g865c\" (UniqueName: \"kubernetes.io/projected/89b00efb-dd09-42fd-824e-e5e317962bb3-kube-api-access-g865c\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.750758 master-0 kubenswrapper[18707]: I0320 08:52:49.750669 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.750758 master-0 kubenswrapper[18707]: I0320 08:52:49.750697 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-console-config\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm" Mar 20 08:52:49.750758 master-0 kubenswrapper[18707]: I0320 08:52:49.750736 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx59m\" (UniqueName: \"kubernetes.io/projected/a22aca95-adb1-41a5-bf0f-fb10574ad5a5-kube-api-access-wx59m\") pod \"multus-admission-controller-69c949f678-v7wb6\" (UID: \"a22aca95-adb1-41a5-bf0f-fb10574ad5a5\") " pod="openshift-multus/multus-admission-controller-69c949f678-v7wb6" Mar 20 08:52:49.750927 master-0 kubenswrapper[18707]: I0320 08:52:49.750795 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " 
pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.750927 master-0 kubenswrapper[18707]: I0320 08:52:49.750824 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-session\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.750927 master-0 kubenswrapper[18707]: I0320 08:52:49.750852 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-service-ca\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm" Mar 20 08:52:49.750927 master-0 kubenswrapper[18707]: I0320 08:52:49.750853 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-service-ca\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k" Mar 20 08:52:49.750927 master-0 kubenswrapper[18707]: I0320 08:52:49.750910 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.751154 master-0 kubenswrapper[18707]: I0320 08:52:49.750943 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.751518 master-0 kubenswrapper[18707]: I0320 08:52:49.751485 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-service-ca\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm" Mar 20 08:52:49.752158 master-0 kubenswrapper[18707]: I0320 08:52:49.752124 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.752333 master-0 kubenswrapper[18707]: E0320 08:52:49.752281 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found Mar 20 08:52:49.752407 master-0 kubenswrapper[18707]: E0320 08:52:49.752386 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert podName:2ec18453-e818-426d-b2d5-e8364dc2067b nodeName:}" failed. No retries permitted until 2026-03-20 08:52:50.252356113 +0000 UTC m=+715.408536689 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert") pod "console-ffb45cdb5-7r4lm" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b") : secret "console-serving-cert" not found Mar 20 08:52:49.752909 master-0 kubenswrapper[18707]: I0320 08:52:49.752835 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.754450 master-0 kubenswrapper[18707]: I0320 08:52:49.754132 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-config\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.754450 master-0 kubenswrapper[18707]: I0320 08:52:49.754149 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.754450 master-0 kubenswrapper[18707]: I0320 08:52:49.754436 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.754736 master-0 kubenswrapper[18707]: I0320 08:52:49.754470 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-router-certs\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.754736 master-0 kubenswrapper[18707]: I0320 08:52:49.754513 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-oauth-config\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k" Mar 20 08:52:49.754736 master-0 kubenswrapper[18707]: I0320 08:52:49.754532 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.754736 master-0 kubenswrapper[18707]: I0320 08:52:49.754539 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-trusted-ca-bundle\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k" Mar 20 08:52:49.755063 master-0 kubenswrapper[18707]: I0320 08:52:49.754885 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-config\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k" Mar 20 08:52:49.755063 master-0 
kubenswrapper[18707]: I0320 08:52:49.754943 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.755063 master-0 kubenswrapper[18707]: I0320 08:52:49.755002 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a22aca95-adb1-41a5-bf0f-fb10574ad5a5-webhook-certs\") pod \"multus-admission-controller-69c949f678-v7wb6\" (UID: \"a22aca95-adb1-41a5-bf0f-fb10574ad5a5\") " pod="openshift-multus/multus-admission-controller-69c949f678-v7wb6" Mar 20 08:52:49.755272 master-0 kubenswrapper[18707]: I0320 08:52:49.755060 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-service-ca\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.755272 master-0 kubenswrapper[18707]: I0320 08:52:49.755109 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.755272 master-0 kubenswrapper[18707]: I0320 08:52:49.755147 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-oauth-config\") pod 
\"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm" Mar 20 08:52:49.755272 master-0 kubenswrapper[18707]: I0320 08:52:49.755212 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwk2g\" (UniqueName: \"kubernetes.io/projected/2ec18453-e818-426d-b2d5-e8364dc2067b-kube-api-access-nwk2g\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm" Mar 20 08:52:49.755272 master-0 kubenswrapper[18707]: I0320 08:52:49.755234 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.757042 master-0 kubenswrapper[18707]: I0320 08:52:49.755860 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-console-config\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm" Mar 20 08:52:49.757042 master-0 kubenswrapper[18707]: I0320 08:52:49.755994 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-user-template-error\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.758242 master-0 kubenswrapper[18707]: I0320 08:52:49.757564 18707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-session\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.758242 master-0 kubenswrapper[18707]: I0320 08:52:49.757618 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89b00efb-dd09-42fd-824e-e5e317962bb3-audit-dir\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.758242 master-0 kubenswrapper[18707]: I0320 08:52:49.758013 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.759572 master-0 kubenswrapper[18707]: I0320 08:52:49.759375 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.760016 master-0 kubenswrapper[18707]: I0320 08:52:49.759948 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-config-out\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:52:49.760700 master-0 kubenswrapper[18707]: I0320 
08:52:49.760646 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-user-template-login\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.761416 master-0 kubenswrapper[18707]: I0320 08:52:49.761367 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-service-ca\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.762651 master-0 kubenswrapper[18707]: I0320 08:52:49.762625 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-oauth-serving-cert\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k" Mar 20 08:52:49.763391 master-0 kubenswrapper[18707]: I0320 08:52:49.763345 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" Mar 20 08:52:49.763391 master-0 kubenswrapper[18707]: I0320 08:52:49.763363 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-oauth-config\") pod 
\"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:49.764263 master-0 kubenswrapper[18707]: I0320 08:52:49.764091 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.764540 master-0 kubenswrapper[18707]: I0320 08:52:49.764504 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.764810 master-0 kubenswrapper[18707]: I0320 08:52:49.764775 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.765696 master-0 kubenswrapper[18707]: I0320 08:52:49.765642 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.767700 master-0 kubenswrapper[18707]: I0320 08:52:49.767664 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-web-config\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.768126 master-0 kubenswrapper[18707]: I0320 08:52:49.768104 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.768386 master-0 kubenswrapper[18707]: I0320 08:52:49.768340 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.773091 master-0 kubenswrapper[18707]: I0320 08:52:49.773044 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.773091 master-0 kubenswrapper[18707]: I0320 08:52:49.773084 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.773091 master-0 kubenswrapper[18707]: I0320 08:52:49.773091 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.773259 master-0 kubenswrapper[18707]: I0320 08:52:49.773120 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-oauth-config\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:52:49.774406 master-0 kubenswrapper[18707]: I0320 08:52:49.774350 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.774488 master-0 kubenswrapper[18707]: I0320 08:52:49.774448 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/89b00efb-dd09-42fd-824e-e5e317962bb3-v4-0-config-system-router-certs\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.774824 master-0 kubenswrapper[18707]: I0320 08:52:49.774792 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a22aca95-adb1-41a5-bf0f-fb10574ad5a5-webhook-certs\") pod \"multus-admission-controller-69c949f678-v7wb6\" (UID: \"a22aca95-adb1-41a5-bf0f-fb10574ad5a5\") " pod="openshift-multus/multus-admission-controller-69c949f678-v7wb6"
Mar 20 08:52:49.775640 master-0 kubenswrapper[18707]: I0320 08:52:49.775597 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2mgj\" (UniqueName: \"kubernetes.io/projected/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-kube-api-access-f2mgj\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.776366 master-0 kubenswrapper[18707]: I0320 08:52:49.776008 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzkd2\" (UniqueName: \"kubernetes.io/projected/4a7bfb28-46ef-431e-8f40-da7e723613d6-kube-api-access-dzkd2\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:49.777215 master-0 kubenswrapper[18707]: I0320 08:52:49.777118 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx59m\" (UniqueName: \"kubernetes.io/projected/a22aca95-adb1-41a5-bf0f-fb10574ad5a5-kube-api-access-wx59m\") pod \"multus-admission-controller-69c949f678-v7wb6\" (UID: \"a22aca95-adb1-41a5-bf0f-fb10574ad5a5\") " pod="openshift-multus/multus-admission-controller-69c949f678-v7wb6"
Mar 20 08:52:49.777937 master-0 kubenswrapper[18707]: I0320 08:52:49.777853 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ae74b21-9a7d-4dda-9c35-65dd5b27dec6-config\") pod \"prometheus-k8s-0\" (UID: \"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.779176 master-0 kubenswrapper[18707]: I0320 08:52:49.779142 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g865c\" (UniqueName: \"kubernetes.io/projected/89b00efb-dd09-42fd-824e-e5e317962bb3-kube-api-access-g865c\") pod \"oauth-openshift-8684f7dbf-tqtg9\" (UID: \"89b00efb-dd09-42fd-824e-e5e317962bb3\") " pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.784546 master-0 kubenswrapper[18707]: I0320 08:52:49.784472 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwk2g\" (UniqueName: \"kubernetes.io/projected/2ec18453-e818-426d-b2d5-e8364dc2067b-kube-api-access-nwk2g\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:52:49.831217 master-0 kubenswrapper[18707]: I0320 08:52:49.831141 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-69c949f678-v7wb6"
Mar 20 08:52:49.846048 master-0 kubenswrapper[18707]: I0320 08:52:49.845974 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:49.880813 master-0 kubenswrapper[18707]: I0320 08:52:49.880747 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:49.907487 master-0 kubenswrapper[18707]: I0320 08:52:49.907435 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 20 08:52:50.198955 master-0 kubenswrapper[18707]: I0320 08:52:50.198867 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-66b8ffb895-dgq8g"]
Mar 20 08:52:50.268730 master-0 kubenswrapper[18707]: I0320 08:52:50.268649 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:50.268986 master-0 kubenswrapper[18707]: I0320 08:52:50.268744 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:52:50.269076 master-0 kubenswrapper[18707]: E0320 08:52:50.268968 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 20 08:52:50.269076 master-0 kubenswrapper[18707]: E0320 08:52:50.269025 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 20 08:52:50.269150 master-0 kubenswrapper[18707]: E0320 08:52:50.269102 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert podName:2ec18453-e818-426d-b2d5-e8364dc2067b nodeName:}" failed. No retries permitted until 2026-03-20 08:52:51.269081032 +0000 UTC m=+716.425261398 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert") pod "console-ffb45cdb5-7r4lm" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b") : secret "console-serving-cert" not found
Mar 20 08:52:50.269150 master-0 kubenswrapper[18707]: E0320 08:52:50.269124 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert podName:4a7bfb28-46ef-431e-8f40-da7e723613d6 nodeName:}" failed. No retries permitted until 2026-03-20 08:52:51.269116113 +0000 UTC m=+716.425296479 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert") pod "console-58857c5dc9-9wf6k" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6") : secret "console-serving-cert" not found
Mar 20 08:52:50.320104 master-0 kubenswrapper[18707]: I0320 08:52:50.319578 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-69c949f678-v7wb6"]
Mar 20 08:52:50.398595 master-0 kubenswrapper[18707]: I0320 08:52:50.398531 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"]
Mar 20 08:52:50.407538 master-0 kubenswrapper[18707]: W0320 08:52:50.407321 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89b00efb_dd09_42fd_824e_e5e317962bb3.slice/crio-5be8535d3a69aa4c4fda33268d5d58ff3f354c76e4831f6bc8c9e6d3cc4ccc00 WatchSource:0}: Error finding container 5be8535d3a69aa4c4fda33268d5d58ff3f354c76e4831f6bc8c9e6d3cc4ccc00: Status 404 returned error can't find the container with id 5be8535d3a69aa4c4fda33268d5d58ff3f354c76e4831f6bc8c9e6d3cc4ccc00
Mar 20 08:52:50.414204 master-0 kubenswrapper[18707]: I0320 08:52:50.413908 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 20 08:52:50.437205 master-0 kubenswrapper[18707]: I0320 08:52:50.436900 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 20 08:52:50.443250 master-0 kubenswrapper[18707]: W0320 08:52:50.443176 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ae74b21_9a7d_4dda_9c35_65dd5b27dec6.slice/crio-c9fecd51cd1bb333a080655a521e4a7562ea36599e9333ee6cff88674b2b93ad WatchSource:0}: Error finding container c9fecd51cd1bb333a080655a521e4a7562ea36599e9333ee6cff88674b2b93ad: Status 404 returned error can't find the container with id c9fecd51cd1bb333a080655a521e4a7562ea36599e9333ee6cff88674b2b93ad
Mar 20 08:52:50.640777 master-0 kubenswrapper[18707]: I0320 08:52:50.640679 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-66b8ffb895-dgq8g" event={"ID":"dfad7071-5d2b-4ffb-a260-0e084ef08c99","Type":"ContainerStarted","Data":"269020740f75311facef978836805818cdaf19341bb554090b9970bd227e72e0"}
Mar 20 08:52:50.645304 master-0 kubenswrapper[18707]: I0320 08:52:50.645247 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6","Type":"ContainerStarted","Data":"c9fecd51cd1bb333a080655a521e4a7562ea36599e9333ee6cff88674b2b93ad"}
Mar 20 08:52:50.651926 master-0 kubenswrapper[18707]: I0320 08:52:50.648118 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"13126442-5154-471a-97f7-fa6d917c1ba1","Type":"ContainerStarted","Data":"af5d2d9a7527297673e732df0c5415945eaf4d5258f3963ba204837a7b0751ac"}
Mar 20 08:52:50.651926 master-0 kubenswrapper[18707]: I0320 08:52:50.651598 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69c949f678-v7wb6" event={"ID":"a22aca95-adb1-41a5-bf0f-fb10574ad5a5","Type":"ContainerStarted","Data":"ada34b71113c3ad789390c2efea4cbc9e25545443c3248e63b0f34880a70e910"}
Mar 20 08:52:50.651926 master-0 kubenswrapper[18707]: I0320 08:52:50.651632 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69c949f678-v7wb6" event={"ID":"a22aca95-adb1-41a5-bf0f-fb10574ad5a5","Type":"ContainerStarted","Data":"7c0af0f8a6222b24a586ec05b889daf4c8e86f120c485c00f94a2fb3a35bbee7"}
Mar 20 08:52:50.656256 master-0 kubenswrapper[18707]: I0320 08:52:50.656215 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" event={"ID":"89b00efb-dd09-42fd-824e-e5e317962bb3","Type":"ContainerStarted","Data":"5be8535d3a69aa4c4fda33268d5d58ff3f354c76e4831f6bc8c9e6d3cc4ccc00"}
Mar 20 08:52:51.289663 master-0 kubenswrapper[18707]: I0320 08:52:51.287938 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:52:51.289663 master-0 kubenswrapper[18707]: I0320 08:52:51.288099 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:51.289663 master-0 kubenswrapper[18707]: E0320 08:52:51.288283 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 20 08:52:51.289663 master-0 kubenswrapper[18707]: E0320 08:52:51.288359 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert podName:4a7bfb28-46ef-431e-8f40-da7e723613d6 nodeName:}" failed. No retries permitted until 2026-03-20 08:52:53.288338792 +0000 UTC m=+718.444519138 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert") pod "console-58857c5dc9-9wf6k" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6") : secret "console-serving-cert" not found
Mar 20 08:52:51.289663 master-0 kubenswrapper[18707]: E0320 08:52:51.288796 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 20 08:52:51.289663 master-0 kubenswrapper[18707]: E0320 08:52:51.288821 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert podName:2ec18453-e818-426d-b2d5-e8364dc2067b nodeName:}" failed. No retries permitted until 2026-03-20 08:52:53.288814006 +0000 UTC m=+718.444994362 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert") pod "console-ffb45cdb5-7r4lm" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b") : secret "console-serving-cert" not found
Mar 20 08:52:51.665663 master-0 kubenswrapper[18707]: I0320 08:52:51.665563 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" event={"ID":"89b00efb-dd09-42fd-824e-e5e317962bb3","Type":"ContainerStarted","Data":"2d1d1412c668aee93dabb24fe2bb26b8370272770d0c02a9f84b90e5eb6f2c70"}
Mar 20 08:52:51.666379 master-0 kubenswrapper[18707]: I0320 08:52:51.666288 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:51.678292 master-0 kubenswrapper[18707]: I0320 08:52:51.671712 18707 generic.go:334] "Generic (PLEG): container finished" podID="8ae74b21-9a7d-4dda-9c35-65dd5b27dec6" containerID="296e9d4c4db7a7bcd7ce96b535cbd346a841b666cbd8a63e6042d09386f01c80" exitCode=0
Mar 20 08:52:51.678292 master-0 kubenswrapper[18707]: I0320 08:52:51.671830 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6","Type":"ContainerDied","Data":"296e9d4c4db7a7bcd7ce96b535cbd346a841b666cbd8a63e6042d09386f01c80"}
Mar 20 08:52:51.678292 master-0 kubenswrapper[18707]: I0320 08:52:51.674416 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69c949f678-v7wb6" event={"ID":"a22aca95-adb1-41a5-bf0f-fb10574ad5a5","Type":"ContainerStarted","Data":"1ae2fc9eaf451cbda6bc209370b5f1063a411e46b34f98800bfe4a605f6d407a"}
Mar 20 08:52:51.678292 master-0 kubenswrapper[18707]: I0320 08:52:51.675149 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9"
Mar 20 08:52:51.678292 master-0 kubenswrapper[18707]: I0320 08:52:51.677554 18707 generic.go:334] "Generic (PLEG): container finished" podID="13126442-5154-471a-97f7-fa6d917c1ba1" containerID="13c248c75a30f5da3cb8f6689d7b9c4cbe7c1131d5bc710b0842d84e4ec30edc" exitCode=0
Mar 20 08:52:51.678292 master-0 kubenswrapper[18707]: I0320 08:52:51.677625 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"13126442-5154-471a-97f7-fa6d917c1ba1","Type":"ContainerDied","Data":"13c248c75a30f5da3cb8f6689d7b9c4cbe7c1131d5bc710b0842d84e4ec30edc"}
Mar 20 08:52:51.700333 master-0 kubenswrapper[18707]: I0320 08:52:51.695860 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-8684f7dbf-tqtg9" podStartSLOduration=65.695834015 podStartE2EDuration="1m5.695834015s" podCreationTimestamp="2026-03-20 08:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:52:51.692962312 +0000 UTC m=+716.849142688" watchObservedRunningTime="2026-03-20 08:52:51.695834015 +0000 UTC m=+716.852014381"
Mar 20 08:52:51.785626 master-0 kubenswrapper[18707]: I0320 08:52:51.773379 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-69c949f678-v7wb6" podStartSLOduration=182.773347613 podStartE2EDuration="3m2.773347613s" podCreationTimestamp="2026-03-20 08:49:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:52:51.756772309 +0000 UTC m=+716.912952665" watchObservedRunningTime="2026-03-20 08:52:51.773347613 +0000 UTC m=+716.929527969"
Mar 20 08:52:51.794666 master-0 kubenswrapper[18707]: I0320 08:52:51.794259 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-649577484c-p72cd"]
Mar 20 08:52:51.794666 master-0 kubenswrapper[18707]: I0320 08:52:51.794556 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-649577484c-p72cd" podUID="9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546" containerName="multus-admission-controller" containerID="cri-o://2e6d9acc962059a71851afc29bfffa1cd8b01261ca8f4794a3d9883228efc778" gracePeriod=30
Mar 20 08:52:51.794805 master-0 kubenswrapper[18707]: I0320 08:52:51.794714 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-649577484c-p72cd" podUID="9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546" containerName="kube-rbac-proxy" containerID="cri-o://ac514f213147fbb98ac1f2791391ac7cc4bbf81039d4cbb379d8da7b1cb3d38b" gracePeriod=30
Mar 20 08:52:52.692089 master-0 kubenswrapper[18707]: I0320 08:52:52.692027 18707 generic.go:334] "Generic (PLEG): container finished" podID="9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546" containerID="ac514f213147fbb98ac1f2791391ac7cc4bbf81039d4cbb379d8da7b1cb3d38b" exitCode=0
Mar 20 08:52:52.692834 master-0 kubenswrapper[18707]: I0320 08:52:52.692247 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-649577484c-p72cd" event={"ID":"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546","Type":"ContainerDied","Data":"ac514f213147fbb98ac1f2791391ac7cc4bbf81039d4cbb379d8da7b1cb3d38b"}
Mar 20 08:52:52.696291 master-0 kubenswrapper[18707]: I0320 08:52:52.696261 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6","Type":"ContainerStarted","Data":"a9414f8602c5c74c5735918ab0f34a7c5bddae0d827656ba729433dc4eeeb415"}
Mar 20 08:52:52.696390 master-0 kubenswrapper[18707]: I0320 08:52:52.696377 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6","Type":"ContainerStarted","Data":"da13e0fbaa4c14824038263ad2df44e2e997f48157d5dcf74778b40901609fb7"}
Mar 20 08:52:52.696484 master-0 kubenswrapper[18707]: I0320 08:52:52.696440 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6","Type":"ContainerStarted","Data":"ed370d9b99a88c303165c129ea0491c2b6bcc1855992ccdd8e468cba381b41a3"}
Mar 20 08:52:52.696565 master-0 kubenswrapper[18707]: I0320 08:52:52.696551 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6","Type":"ContainerStarted","Data":"cca52befec32806d12a33e607b8000e1a429e74f84d71ba8fbf464aa30acca32"}
Mar 20 08:52:52.699909 master-0 kubenswrapper[18707]: I0320 08:52:52.699884 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"13126442-5154-471a-97f7-fa6d917c1ba1","Type":"ContainerStarted","Data":"0c733f3a3cf2dfed0d1e3f6275c1e2576b8804f4d1fcfcc0ba95b37b85316a80"}
Mar 20 08:52:52.700045 master-0 kubenswrapper[18707]: I0320 08:52:52.700028 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"13126442-5154-471a-97f7-fa6d917c1ba1","Type":"ContainerStarted","Data":"6447e2c01b1c0dacdde16e8fe3289316b0ebaee4221c27575a93bcba6d676b99"}
Mar 20 08:52:52.700112 master-0 kubenswrapper[18707]: I0320 08:52:52.700099 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"13126442-5154-471a-97f7-fa6d917c1ba1","Type":"ContainerStarted","Data":"14fc51c40b9170818df2bdab3b7e67fe8a80615f38def434ce543f6efb94932e"}
Mar 20 08:52:52.700179 master-0 kubenswrapper[18707]: I0320 08:52:52.700167 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"13126442-5154-471a-97f7-fa6d917c1ba1","Type":"ContainerStarted","Data":"561b933c647d8ad4238daf540ae833d0085a1e672f6cc654404f2ffc828f57ac"}
Mar 20 08:52:53.359205 master-0 kubenswrapper[18707]: I0320 08:52:53.359080 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:53.359205 master-0 kubenswrapper[18707]: I0320 08:52:53.359211 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:52:53.359837 master-0 kubenswrapper[18707]: E0320 08:52:53.359308 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 20 08:52:53.359837 master-0 kubenswrapper[18707]: E0320 08:52:53.359398 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert podName:4a7bfb28-46ef-431e-8f40-da7e723613d6 nodeName:}" failed. No retries permitted until 2026-03-20 08:52:57.359375975 +0000 UTC m=+722.515556331 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert") pod "console-58857c5dc9-9wf6k" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6") : secret "console-serving-cert" not found
Mar 20 08:52:53.359837 master-0 kubenswrapper[18707]: E0320 08:52:53.359308 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 20 08:52:53.359837 master-0 kubenswrapper[18707]: E0320 08:52:53.359463 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert podName:2ec18453-e818-426d-b2d5-e8364dc2067b nodeName:}" failed. No retries permitted until 2026-03-20 08:52:57.359448988 +0000 UTC m=+722.515629344 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert") pod "console-ffb45cdb5-7r4lm" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b") : secret "console-serving-cert" not found
Mar 20 08:52:53.723570 master-0 kubenswrapper[18707]: I0320 08:52:53.723389 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6","Type":"ContainerStarted","Data":"42287087b3746389cbb529e4d2386e3f141ff48ac429f7dc20276ba0077b2907"}
Mar 20 08:52:53.723570 master-0 kubenswrapper[18707]: I0320 08:52:53.723452 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8ae74b21-9a7d-4dda-9c35-65dd5b27dec6","Type":"ContainerStarted","Data":"8be4aaea489635371bdb613eda17e7aca04034743e64a76031cc28ce945a5fb1"}
Mar 20 08:52:53.731075 master-0 kubenswrapper[18707]: I0320 08:52:53.731031 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"13126442-5154-471a-97f7-fa6d917c1ba1","Type":"ContainerStarted","Data":"9c268d5785530b6da7f0a3a4ed0f320b73b554bb0ad60d9760c6c28fd15dab7c"}
Mar 20 08:52:53.731157 master-0 kubenswrapper[18707]: I0320 08:52:53.731084 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"13126442-5154-471a-97f7-fa6d917c1ba1","Type":"ContainerStarted","Data":"a77b27791274bacf72db31833d51548df9d2fa49ba882ba88ebfacb879fac70c"}
Mar 20 08:52:53.766127 master-0 kubenswrapper[18707]: I0320 08:52:53.766025 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=143.765953602 podStartE2EDuration="2m23.765953602s" podCreationTimestamp="2026-03-20 08:50:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:52:53.758529839 +0000 UTC m=+718.914710215" watchObservedRunningTime="2026-03-20 08:52:53.765953602 +0000 UTC m=+718.922133958"
Mar 20 08:52:53.827275 master-0 kubenswrapper[18707]: I0320 08:52:53.827157 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=147.827084711 podStartE2EDuration="2m27.827084711s" podCreationTimestamp="2026-03-20 08:50:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:52:53.805469103 +0000 UTC m=+718.961649459" watchObservedRunningTime="2026-03-20 08:52:53.827084711 +0000 UTC m=+718.983265067"
Mar 20 08:52:54.847835 master-0 kubenswrapper[18707]: I0320 08:52:54.847740 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 08:52:57.445218 master-0 kubenswrapper[18707]: I0320 08:52:57.444411 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:52:57.445218 master-0 kubenswrapper[18707]: I0320 08:52:57.444561 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:52:57.445218 master-0 kubenswrapper[18707]: E0320 08:52:57.444652 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 20 08:52:57.445218 master-0 kubenswrapper[18707]: E0320 08:52:57.444770 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert podName:4a7bfb28-46ef-431e-8f40-da7e723613d6 nodeName:}" failed. No retries permitted until 2026-03-20 08:53:05.444745829 +0000 UTC m=+730.600926185 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert") pod "console-58857c5dc9-9wf6k" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6") : secret "console-serving-cert" not found
Mar 20 08:52:57.445218 master-0 kubenswrapper[18707]: E0320 08:52:57.444869 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 20 08:52:57.445218 master-0 kubenswrapper[18707]: E0320 08:52:57.445010 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert podName:2ec18453-e818-426d-b2d5-e8364dc2067b nodeName:}" failed. No retries permitted until 2026-03-20 08:53:05.444977195 +0000 UTC m=+730.601157721 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert") pod "console-ffb45cdb5-7r4lm" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b") : secret "console-serving-cert" not found
Mar 20 08:53:05.515311 master-0 kubenswrapper[18707]: I0320 08:53:05.513492 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:53:05.515311 master-0 kubenswrapper[18707]: I0320 08:53:05.513689 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:53:05.515311 master-0 kubenswrapper[18707]: E0320 08:53:05.513718 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 20 08:53:05.515311 master-0 kubenswrapper[18707]: E0320 08:53:05.513855 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert podName:4a7bfb28-46ef-431e-8f40-da7e723613d6 nodeName:}" failed. No retries permitted until 2026-03-20 08:53:21.513825116 +0000 UTC m=+746.670005472 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert") pod "console-58857c5dc9-9wf6k" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6") : secret "console-serving-cert" not found
Mar 20 08:53:05.515311 master-0 kubenswrapper[18707]: E0320 08:53:05.514041 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 20 08:53:05.515311 master-0 kubenswrapper[18707]: E0320 08:53:05.514248 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert podName:2ec18453-e818-426d-b2d5-e8364dc2067b nodeName:}" failed. No retries permitted until 2026-03-20 08:53:21.514217397 +0000 UTC m=+746.670397763 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert") pod "console-ffb45cdb5-7r4lm" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b") : secret "console-serving-cert" not found
Mar 20 08:53:21.564632 master-0 kubenswrapper[18707]: I0320 08:53:21.564575 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:53:21.565415 master-0 kubenswrapper[18707]: I0320 08:53:21.564651 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:53:21.565415 master-0 kubenswrapper[18707]: E0320 08:53:21.564766 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 20 08:53:21.565415 master-0 kubenswrapper[18707]: E0320 08:53:21.564842 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert podName:4a7bfb28-46ef-431e-8f40-da7e723613d6 nodeName:}" failed. No retries permitted until 2026-03-20 08:53:53.564821885 +0000 UTC m=+778.721002231 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert") pod "console-58857c5dc9-9wf6k" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6") : secret "console-serving-cert" not found
Mar 20 08:53:21.565415 master-0 kubenswrapper[18707]: E0320 08:53:21.564779 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 20 08:53:21.565415 master-0 kubenswrapper[18707]: E0320 08:53:21.564887 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert podName:2ec18453-e818-426d-b2d5-e8364dc2067b nodeName:}" failed. No retries permitted until 2026-03-20 08:53:53.564875637 +0000 UTC m=+778.721055993 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert") pod "console-ffb45cdb5-7r4lm" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b") : secret "console-serving-cert" not found
Mar 20 08:53:28.731969 master-0 kubenswrapper[18707]: I0320 08:53:28.730996 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-649577484c-p72cd_9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546/multus-admission-controller/0.log"
Mar 20 08:53:28.733039 master-0 kubenswrapper[18707]: I0320 08:53:28.732743 18707 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-admission-controller-649577484c-p72cd" Mar 20 08:53:28.836431 master-0 kubenswrapper[18707]: I0320 08:53:28.835770 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546-webhook-certs\") pod \"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546\" (UID: \"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546\") " Mar 20 08:53:28.836431 master-0 kubenswrapper[18707]: I0320 08:53:28.836011 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sffd6\" (UniqueName: \"kubernetes.io/projected/9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546-kube-api-access-sffd6\") pod \"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546\" (UID: \"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546\") " Mar 20 08:53:28.839016 master-0 kubenswrapper[18707]: I0320 08:53:28.838974 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546" (UID: "9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:53:28.846083 master-0 kubenswrapper[18707]: I0320 08:53:28.845967 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546-kube-api-access-sffd6" (OuterVolumeSpecName: "kube-api-access-sffd6") pod "9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546" (UID: "9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546"). InnerVolumeSpecName "kube-api-access-sffd6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:53:28.938672 master-0 kubenswrapper[18707]: I0320 08:53:28.938596 18707 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546-webhook-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 08:53:28.938672 master-0 kubenswrapper[18707]: I0320 08:53:28.938642 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sffd6\" (UniqueName: \"kubernetes.io/projected/9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546-kube-api-access-sffd6\") on node \"master-0\" DevicePath \"\"" Mar 20 08:53:29.086752 master-0 kubenswrapper[18707]: I0320 08:53:29.086686 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-649577484c-p72cd_9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546/multus-admission-controller/0.log" Mar 20 08:53:29.087024 master-0 kubenswrapper[18707]: I0320 08:53:29.086802 18707 generic.go:334] "Generic (PLEG): container finished" podID="9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546" containerID="2e6d9acc962059a71851afc29bfffa1cd8b01261ca8f4794a3d9883228efc778" exitCode=137 Mar 20 08:53:29.087024 master-0 kubenswrapper[18707]: I0320 08:53:29.086852 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-649577484c-p72cd" event={"ID":"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546","Type":"ContainerDied","Data":"2e6d9acc962059a71851afc29bfffa1cd8b01261ca8f4794a3d9883228efc778"} Mar 20 08:53:29.087024 master-0 kubenswrapper[18707]: I0320 08:53:29.086895 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-649577484c-p72cd" event={"ID":"9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546","Type":"ContainerDied","Data":"70c28deb955d7c56353c6f5f88a7741aa086cb22ed0b1cbdc8f73512b3523f0a"} Mar 20 08:53:29.087024 master-0 kubenswrapper[18707]: I0320 08:53:29.086919 18707 scope.go:117] "RemoveContainer" 
containerID="ac514f213147fbb98ac1f2791391ac7cc4bbf81039d4cbb379d8da7b1cb3d38b" Mar 20 08:53:29.087024 master-0 kubenswrapper[18707]: I0320 08:53:29.086919 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-649577484c-p72cd" Mar 20 08:53:29.117374 master-0 kubenswrapper[18707]: I0320 08:53:29.116839 18707 scope.go:117] "RemoveContainer" containerID="2e6d9acc962059a71851afc29bfffa1cd8b01261ca8f4794a3d9883228efc778" Mar 20 08:53:29.166692 master-0 kubenswrapper[18707]: I0320 08:53:29.166644 18707 scope.go:117] "RemoveContainer" containerID="ac514f213147fbb98ac1f2791391ac7cc4bbf81039d4cbb379d8da7b1cb3d38b" Mar 20 08:53:29.167162 master-0 kubenswrapper[18707]: E0320 08:53:29.167113 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac514f213147fbb98ac1f2791391ac7cc4bbf81039d4cbb379d8da7b1cb3d38b\": container with ID starting with ac514f213147fbb98ac1f2791391ac7cc4bbf81039d4cbb379d8da7b1cb3d38b not found: ID does not exist" containerID="ac514f213147fbb98ac1f2791391ac7cc4bbf81039d4cbb379d8da7b1cb3d38b" Mar 20 08:53:29.167245 master-0 kubenswrapper[18707]: I0320 08:53:29.167179 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac514f213147fbb98ac1f2791391ac7cc4bbf81039d4cbb379d8da7b1cb3d38b"} err="failed to get container status \"ac514f213147fbb98ac1f2791391ac7cc4bbf81039d4cbb379d8da7b1cb3d38b\": rpc error: code = NotFound desc = could not find container \"ac514f213147fbb98ac1f2791391ac7cc4bbf81039d4cbb379d8da7b1cb3d38b\": container with ID starting with ac514f213147fbb98ac1f2791391ac7cc4bbf81039d4cbb379d8da7b1cb3d38b not found: ID does not exist" Mar 20 08:53:29.167245 master-0 kubenswrapper[18707]: I0320 08:53:29.167243 18707 scope.go:117] "RemoveContainer" containerID="2e6d9acc962059a71851afc29bfffa1cd8b01261ca8f4794a3d9883228efc778" Mar 20 08:53:29.168024 
master-0 kubenswrapper[18707]: E0320 08:53:29.167915 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e6d9acc962059a71851afc29bfffa1cd8b01261ca8f4794a3d9883228efc778\": container with ID starting with 2e6d9acc962059a71851afc29bfffa1cd8b01261ca8f4794a3d9883228efc778 not found: ID does not exist" containerID="2e6d9acc962059a71851afc29bfffa1cd8b01261ca8f4794a3d9883228efc778" Mar 20 08:53:29.168082 master-0 kubenswrapper[18707]: I0320 08:53:29.168019 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e6d9acc962059a71851afc29bfffa1cd8b01261ca8f4794a3d9883228efc778"} err="failed to get container status \"2e6d9acc962059a71851afc29bfffa1cd8b01261ca8f4794a3d9883228efc778\": rpc error: code = NotFound desc = could not find container \"2e6d9acc962059a71851afc29bfffa1cd8b01261ca8f4794a3d9883228efc778\": container with ID starting with 2e6d9acc962059a71851afc29bfffa1cd8b01261ca8f4794a3d9883228efc778 not found: ID does not exist" Mar 20 08:53:29.377035 master-0 kubenswrapper[18707]: I0320 08:53:29.376890 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-649577484c-p72cd"] Mar 20 08:53:29.423963 master-0 kubenswrapper[18707]: I0320 08:53:29.423888 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-649577484c-p72cd"] Mar 20 08:53:31.108614 master-0 kubenswrapper[18707]: I0320 08:53:31.108531 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546" path="/var/lib/kubelet/pods/9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546/volumes" Mar 20 08:53:32.115546 master-0 kubenswrapper[18707]: I0320 08:53:32.115468 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-66b8ffb895-dgq8g" 
event={"ID":"dfad7071-5d2b-4ffb-a260-0e084ef08c99","Type":"ContainerStarted","Data":"bd740911ac794c117619b309a32512895e82d28847ef5227be66bd9811032a4b"} Mar 20 08:53:32.116240 master-0 kubenswrapper[18707]: I0320 08:53:32.115804 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-66b8ffb895-dgq8g" Mar 20 08:53:32.118282 master-0 kubenswrapper[18707]: I0320 08:53:32.118174 18707 patch_prober.go:28] interesting pod/downloads-66b8ffb895-dgq8g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.100:8080/\": dial tcp 10.128.0.100:8080: connect: connection refused" start-of-body= Mar 20 08:53:32.118376 master-0 kubenswrapper[18707]: I0320 08:53:32.118327 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-dgq8g" podUID="dfad7071-5d2b-4ffb-a260-0e084ef08c99" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.100:8080/\": dial tcp 10.128.0.100:8080: connect: connection refused" Mar 20 08:53:32.167460 master-0 kubenswrapper[18707]: I0320 08:53:32.167335 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-66b8ffb895-dgq8g" podStartSLOduration=176.195324381 podStartE2EDuration="3m37.167303439s" podCreationTimestamp="2026-03-20 08:49:55 +0000 UTC" firstStartedPulling="2026-03-20 08:52:50.213689116 +0000 UTC m=+715.369869492" lastFinishedPulling="2026-03-20 08:53:31.185668154 +0000 UTC m=+756.341848550" observedRunningTime="2026-03-20 08:53:32.149739496 +0000 UTC m=+757.305919842" watchObservedRunningTime="2026-03-20 08:53:32.167303439 +0000 UTC m=+757.323483805" Mar 20 08:53:33.126809 master-0 kubenswrapper[18707]: I0320 08:53:33.125776 18707 patch_prober.go:28] interesting pod/downloads-66b8ffb895-dgq8g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.128.0.100:8080/\": dial tcp 10.128.0.100:8080: connect: connection refused" start-of-body= Mar 20 08:53:33.126809 master-0 kubenswrapper[18707]: I0320 08:53:33.125843 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-dgq8g" podUID="dfad7071-5d2b-4ffb-a260-0e084ef08c99" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.100:8080/\": dial tcp 10.128.0.100:8080: connect: connection refused" Mar 20 08:53:39.754500 master-0 kubenswrapper[18707]: I0320 08:53:39.754411 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-66b8ffb895-dgq8g" Mar 20 08:53:49.847223 master-0 kubenswrapper[18707]: I0320 08:53:49.847099 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:53:49.882412 master-0 kubenswrapper[18707]: I0320 08:53:49.882334 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:53:50.349073 master-0 kubenswrapper[18707]: I0320 08:53:50.349010 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 08:53:53.579014 master-0 kubenswrapper[18707]: I0320 08:53:53.578936 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm" Mar 20 08:53:53.579767 master-0 kubenswrapper[18707]: I0320 08:53:53.579076 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert\") pod \"console-58857c5dc9-9wf6k\" 
(UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k" Mar 20 08:53:53.579767 master-0 kubenswrapper[18707]: E0320 08:53:53.579231 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found Mar 20 08:53:53.579767 master-0 kubenswrapper[18707]: E0320 08:53:53.579298 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert podName:4a7bfb28-46ef-431e-8f40-da7e723613d6 nodeName:}" failed. No retries permitted until 2026-03-20 08:54:57.579272531 +0000 UTC m=+842.735452887 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert") pod "console-58857c5dc9-9wf6k" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6") : secret "console-serving-cert" not found Mar 20 08:53:53.579767 master-0 kubenswrapper[18707]: E0320 08:53:53.579515 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found Mar 20 08:53:53.579767 master-0 kubenswrapper[18707]: E0320 08:53:53.579677 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert podName:2ec18453-e818-426d-b2d5-e8364dc2067b nodeName:}" failed. No retries permitted until 2026-03-20 08:54:57.57961222 +0000 UTC m=+842.735792626 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert") pod "console-ffb45cdb5-7r4lm" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b") : secret "console-serving-cert" not found Mar 20 08:54:52.480148 master-0 kubenswrapper[18707]: E0320 08:54:52.480042 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[console-serving-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console/console-58857c5dc9-9wf6k" podUID="4a7bfb28-46ef-431e-8f40-da7e723613d6" Mar 20 08:54:52.559447 master-0 kubenswrapper[18707]: E0320 08:54:52.559316 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[console-serving-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console/console-ffb45cdb5-7r4lm" podUID="2ec18453-e818-426d-b2d5-e8364dc2067b" Mar 20 08:54:52.924383 master-0 kubenswrapper[18707]: I0320 08:54:52.924296 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ffb45cdb5-7r4lm" Mar 20 08:54:52.924866 master-0 kubenswrapper[18707]: I0320 08:54:52.924505 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58857c5dc9-9wf6k" Mar 20 08:54:57.678162 master-0 kubenswrapper[18707]: I0320 08:54:57.678088 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert\") pod \"console-58857c5dc9-9wf6k\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") " pod="openshift-console/console-58857c5dc9-9wf6k" Mar 20 08:54:57.678162 master-0 kubenswrapper[18707]: I0320 08:54:57.678169 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert\") pod \"console-ffb45cdb5-7r4lm\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") " pod="openshift-console/console-ffb45cdb5-7r4lm" Mar 20 08:54:57.678939 master-0 kubenswrapper[18707]: E0320 08:54:57.678282 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found Mar 20 08:54:57.678939 master-0 kubenswrapper[18707]: E0320 08:54:57.678339 18707 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found Mar 20 08:54:57.678939 master-0 kubenswrapper[18707]: E0320 08:54:57.678356 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert podName:4a7bfb28-46ef-431e-8f40-da7e723613d6 nodeName:}" failed. No retries permitted until 2026-03-20 08:56:59.678337707 +0000 UTC m=+964.834518063 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert") pod "console-58857c5dc9-9wf6k" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6") : secret "console-serving-cert" not found Mar 20 08:54:57.678939 master-0 kubenswrapper[18707]: E0320 08:54:57.678381 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert podName:2ec18453-e818-426d-b2d5-e8364dc2067b nodeName:}" failed. No retries permitted until 2026-03-20 08:56:59.678367848 +0000 UTC m=+964.834548204 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert") pod "console-ffb45cdb5-7r4lm" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b") : secret "console-serving-cert" not found Mar 20 08:55:41.169175 master-0 kubenswrapper[18707]: I0320 08:55:41.169075 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 20 08:55:41.170041 master-0 kubenswrapper[18707]: E0320 08:55:41.169495 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546" containerName="kube-rbac-proxy" Mar 20 08:55:41.170041 master-0 kubenswrapper[18707]: I0320 08:55:41.169513 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546" containerName="kube-rbac-proxy" Mar 20 08:55:41.170041 master-0 kubenswrapper[18707]: E0320 08:55:41.169562 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546" containerName="multus-admission-controller" Mar 20 08:55:41.170041 master-0 kubenswrapper[18707]: I0320 08:55:41.169571 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546" 
containerName="multus-admission-controller" Mar 20 08:55:41.170041 master-0 kubenswrapper[18707]: I0320 08:55:41.169778 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546" containerName="multus-admission-controller" Mar 20 08:55:41.170041 master-0 kubenswrapper[18707]: I0320 08:55:41.169842 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f2a01e2-89ac-4dd4-bc32-6ad4ae0fc546" containerName="kube-rbac-proxy" Mar 20 08:55:41.170550 master-0 kubenswrapper[18707]: I0320 08:55:41.170519 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 20 08:55:41.173725 master-0 kubenswrapper[18707]: I0320 08:55:41.173626 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 08:55:41.193265 master-0 kubenswrapper[18707]: I0320 08:55:41.187255 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 20 08:55:41.199737 master-0 kubenswrapper[18707]: I0320 08:55:41.199679 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0846d28f-9252-4245-983c-c550050ba908-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"0846d28f-9252-4245-983c-c550050ba908\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 20 08:55:41.200150 master-0 kubenswrapper[18707]: I0320 08:55:41.200073 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0846d28f-9252-4245-983c-c550050ba908-kube-api-access\") pod \"installer-4-master-0\" (UID: \"0846d28f-9252-4245-983c-c550050ba908\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 20 08:55:41.200597 master-0 
kubenswrapper[18707]: I0320 08:55:41.200553 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0846d28f-9252-4245-983c-c550050ba908-var-lock\") pod \"installer-4-master-0\" (UID: \"0846d28f-9252-4245-983c-c550050ba908\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 20 08:55:41.302554 master-0 kubenswrapper[18707]: I0320 08:55:41.302455 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0846d28f-9252-4245-983c-c550050ba908-kube-api-access\") pod \"installer-4-master-0\" (UID: \"0846d28f-9252-4245-983c-c550050ba908\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 20 08:55:41.302837 master-0 kubenswrapper[18707]: I0320 08:55:41.302755 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0846d28f-9252-4245-983c-c550050ba908-var-lock\") pod \"installer-4-master-0\" (UID: \"0846d28f-9252-4245-983c-c550050ba908\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 20 08:55:41.302893 master-0 kubenswrapper[18707]: I0320 08:55:41.302842 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0846d28f-9252-4245-983c-c550050ba908-var-lock\") pod \"installer-4-master-0\" (UID: \"0846d28f-9252-4245-983c-c550050ba908\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 20 08:55:41.303109 master-0 kubenswrapper[18707]: I0320 08:55:41.303075 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0846d28f-9252-4245-983c-c550050ba908-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"0846d28f-9252-4245-983c-c550050ba908\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 
20 08:55:41.303339 master-0 kubenswrapper[18707]: I0320 08:55:41.303257 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0846d28f-9252-4245-983c-c550050ba908-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"0846d28f-9252-4245-983c-c550050ba908\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 20 08:55:41.319788 master-0 kubenswrapper[18707]: I0320 08:55:41.319711 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0846d28f-9252-4245-983c-c550050ba908-kube-api-access\") pod \"installer-4-master-0\" (UID: \"0846d28f-9252-4245-983c-c550050ba908\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 20 08:55:41.517656 master-0 kubenswrapper[18707]: I0320 08:55:41.517465 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 20 08:55:41.981886 master-0 kubenswrapper[18707]: I0320 08:55:41.981832 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 20 08:55:41.982694 master-0 kubenswrapper[18707]: W0320 08:55:41.982652 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0846d28f_9252_4245_983c_c550050ba908.slice/crio-8820b9345b8e68a7a3ce3bc3cdb83c253d5980f6260ed9a783f1c2c78e2afdab WatchSource:0}: Error finding container 8820b9345b8e68a7a3ce3bc3cdb83c253d5980f6260ed9a783f1c2c78e2afdab: Status 404 returned error can't find the container with id 8820b9345b8e68a7a3ce3bc3cdb83c253d5980f6260ed9a783f1c2c78e2afdab Mar 20 08:55:42.358146 master-0 kubenswrapper[18707]: I0320 08:55:42.358067 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" 
event={"ID":"0846d28f-9252-4245-983c-c550050ba908","Type":"ContainerStarted","Data":"8820b9345b8e68a7a3ce3bc3cdb83c253d5980f6260ed9a783f1c2c78e2afdab"} Mar 20 08:55:43.368608 master-0 kubenswrapper[18707]: I0320 08:55:43.368440 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"0846d28f-9252-4245-983c-c550050ba908","Type":"ContainerStarted","Data":"7fee174a4180167fb20f7d418c19729d334255ce2ad28368d26bbbf2ad567b06"} Mar 20 08:55:43.401905 master-0 kubenswrapper[18707]: I0320 08:55:43.401805 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=2.401783343 podStartE2EDuration="2.401783343s" podCreationTimestamp="2026-03-20 08:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:55:43.398800258 +0000 UTC m=+888.554980624" watchObservedRunningTime="2026-03-20 08:55:43.401783343 +0000 UTC m=+888.557963699" Mar 20 08:55:46.442176 master-0 kubenswrapper[18707]: I0320 08:55:46.441918 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ffb45cdb5-7r4lm"] Mar 20 08:55:46.442815 master-0 kubenswrapper[18707]: E0320 08:55:46.442701 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[console-serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-console/console-ffb45cdb5-7r4lm" podUID="2ec18453-e818-426d-b2d5-e8364dc2067b" Mar 20 08:55:46.491011 master-0 kubenswrapper[18707]: I0320 08:55:46.490944 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-d4699dbdf-6z95l"] Mar 20 08:55:46.492080 master-0 kubenswrapper[18707]: I0320 08:55:46.492051 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.506618 master-0 kubenswrapper[18707]: I0320 08:55:46.506565 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d4699dbdf-6z95l"]
Mar 20 08:55:46.611300 master-0 kubenswrapper[18707]: I0320 08:55:46.611247 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-oauth-serving-cert\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.611573 master-0 kubenswrapper[18707]: I0320 08:55:46.611555 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-config\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.611751 master-0 kubenswrapper[18707]: I0320 08:55:46.611683 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-oauth-config\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.611977 master-0 kubenswrapper[18707]: I0320 08:55:46.611963 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z968m\" (UniqueName: \"kubernetes.io/projected/2f81c6b4-2412-41e5-9a3f-c98fed48445a-kube-api-access-z968m\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.612201 master-0 kubenswrapper[18707]: I0320 08:55:46.612162 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-service-ca\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.612337 master-0 kubenswrapper[18707]: I0320 08:55:46.612319 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-serving-cert\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.612472 master-0 kubenswrapper[18707]: I0320 08:55:46.612458 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-trusted-ca-bundle\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.714473 master-0 kubenswrapper[18707]: I0320 08:55:46.714330 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-oauth-serving-cert\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.714473 master-0 kubenswrapper[18707]: I0320 08:55:46.714408 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-config\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.714473 master-0 kubenswrapper[18707]: I0320 08:55:46.714468 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-oauth-config\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.714833 master-0 kubenswrapper[18707]: I0320 08:55:46.714591 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z968m\" (UniqueName: \"kubernetes.io/projected/2f81c6b4-2412-41e5-9a3f-c98fed48445a-kube-api-access-z968m\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.714833 master-0 kubenswrapper[18707]: I0320 08:55:46.714648 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-service-ca\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.714833 master-0 kubenswrapper[18707]: I0320 08:55:46.714708 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-serving-cert\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.714833 master-0 kubenswrapper[18707]: I0320 08:55:46.714751 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-trusted-ca-bundle\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.715612 master-0 kubenswrapper[18707]: I0320 08:55:46.715562 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-oauth-serving-cert\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.715612 master-0 kubenswrapper[18707]: I0320 08:55:46.715597 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-config\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.716149 master-0 kubenswrapper[18707]: I0320 08:55:46.716112 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-trusted-ca-bundle\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.716287 master-0 kubenswrapper[18707]: I0320 08:55:46.716242 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-service-ca\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.719524 master-0 kubenswrapper[18707]: I0320 08:55:46.719474 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-serving-cert\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.720455 master-0 kubenswrapper[18707]: I0320 08:55:46.720426 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-oauth-config\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.736717 master-0 kubenswrapper[18707]: I0320 08:55:46.736656 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z968m\" (UniqueName: \"kubernetes.io/projected/2f81c6b4-2412-41e5-9a3f-c98fed48445a-kube-api-access-z968m\") pod \"console-d4699dbdf-6z95l\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:46.811002 master-0 kubenswrapper[18707]: I0320 08:55:46.810928 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-psbxb"
Mar 20 08:55:46.819292 master-0 kubenswrapper[18707]: I0320 08:55:46.819238 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:47.350013 master-0 kubenswrapper[18707]: I0320 08:55:47.347925 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d4699dbdf-6z95l"]
Mar 20 08:55:47.355012 master-0 kubenswrapper[18707]: W0320 08:55:47.354956 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f81c6b4_2412_41e5_9a3f_c98fed48445a.slice/crio-ae06d4739d879cf9dc6762237ea1a205a9db4460b820a0df3cdd21f6fc1b8d63 WatchSource:0}: Error finding container ae06d4739d879cf9dc6762237ea1a205a9db4460b820a0df3cdd21f6fc1b8d63: Status 404 returned error can't find the container with id ae06d4739d879cf9dc6762237ea1a205a9db4460b820a0df3cdd21f6fc1b8d63
Mar 20 08:55:47.358718 master-0 kubenswrapper[18707]: I0320 08:55:47.358661 18707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 08:55:47.413997 master-0 kubenswrapper[18707]: I0320 08:55:47.413931 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:55:47.414261 master-0 kubenswrapper[18707]: I0320 08:55:47.413995 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4699dbdf-6z95l" event={"ID":"2f81c6b4-2412-41e5-9a3f-c98fed48445a","Type":"ContainerStarted","Data":"ae06d4739d879cf9dc6762237ea1a205a9db4460b820a0df3cdd21f6fc1b8d63"}
Mar 20 08:55:47.424248 master-0 kubenswrapper[18707]: I0320 08:55:47.424177 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:55:47.531096 master-0 kubenswrapper[18707]: I0320 08:55:47.531031 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-console-config\") pod \"2ec18453-e818-426d-b2d5-e8364dc2067b\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") "
Mar 20 08:55:47.531096 master-0 kubenswrapper[18707]: I0320 08:55:47.531106 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-oauth-config\") pod \"2ec18453-e818-426d-b2d5-e8364dc2067b\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") "
Mar 20 08:55:47.531823 master-0 kubenswrapper[18707]: I0320 08:55:47.531197 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-service-ca\") pod \"2ec18453-e818-426d-b2d5-e8364dc2067b\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") "
Mar 20 08:55:47.531823 master-0 kubenswrapper[18707]: I0320 08:55:47.531226 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-oauth-serving-cert\") pod \"2ec18453-e818-426d-b2d5-e8364dc2067b\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") "
Mar 20 08:55:47.531823 master-0 kubenswrapper[18707]: I0320 08:55:47.531298 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwk2g\" (UniqueName: \"kubernetes.io/projected/2ec18453-e818-426d-b2d5-e8364dc2067b-kube-api-access-nwk2g\") pod \"2ec18453-e818-426d-b2d5-e8364dc2067b\" (UID: \"2ec18453-e818-426d-b2d5-e8364dc2067b\") "
Mar 20 08:55:47.531823 master-0 kubenswrapper[18707]: I0320 08:55:47.531807 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-service-ca" (OuterVolumeSpecName: "service-ca") pod "2ec18453-e818-426d-b2d5-e8364dc2067b" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:55:47.531988 master-0 kubenswrapper[18707]: I0320 08:55:47.531917 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-console-config" (OuterVolumeSpecName: "console-config") pod "2ec18453-e818-426d-b2d5-e8364dc2067b" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:55:47.532120 master-0 kubenswrapper[18707]: I0320 08:55:47.532047 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2ec18453-e818-426d-b2d5-e8364dc2067b" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:55:47.534325 master-0 kubenswrapper[18707]: I0320 08:55:47.534288 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2ec18453-e818-426d-b2d5-e8364dc2067b" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:55:47.534401 master-0 kubenswrapper[18707]: I0320 08:55:47.534334 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ec18453-e818-426d-b2d5-e8364dc2067b-kube-api-access-nwk2g" (OuterVolumeSpecName: "kube-api-access-nwk2g") pod "2ec18453-e818-426d-b2d5-e8364dc2067b" (UID: "2ec18453-e818-426d-b2d5-e8364dc2067b"). InnerVolumeSpecName "kube-api-access-nwk2g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:55:47.634099 master-0 kubenswrapper[18707]: I0320 08:55:47.633952 18707 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-console-config\") on node \"master-0\" DevicePath \"\""
Mar 20 08:55:47.634099 master-0 kubenswrapper[18707]: I0320 08:55:47.634005 18707 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 20 08:55:47.634099 master-0 kubenswrapper[18707]: I0320 08:55:47.634023 18707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 20 08:55:47.634099 master-0 kubenswrapper[18707]: I0320 08:55:47.634036 18707 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ec18453-e818-426d-b2d5-e8364dc2067b-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 20 08:55:47.634099 master-0 kubenswrapper[18707]: I0320 08:55:47.634052 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwk2g\" (UniqueName: \"kubernetes.io/projected/2ec18453-e818-426d-b2d5-e8364dc2067b-kube-api-access-nwk2g\") on node \"master-0\" DevicePath \"\""
Mar 20 08:55:48.422497 master-0 kubenswrapper[18707]: I0320 08:55:48.422324 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ffb45cdb5-7r4lm"
Mar 20 08:55:48.508606 master-0 kubenswrapper[18707]: I0320 08:55:48.508526 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ffb45cdb5-7r4lm"]
Mar 20 08:55:48.518878 master-0 kubenswrapper[18707]: I0320 08:55:48.518812 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-ffb45cdb5-7r4lm"]
Mar 20 08:55:48.554664 master-0 kubenswrapper[18707]: I0320 08:55:48.554581 18707 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec18453-e818-426d-b2d5-e8364dc2067b-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 20 08:55:49.106669 master-0 kubenswrapper[18707]: I0320 08:55:49.106630 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ec18453-e818-426d-b2d5-e8364dc2067b" path="/var/lib/kubelet/pods/2ec18453-e818-426d-b2d5-e8364dc2067b/volumes"
Mar 20 08:55:52.472266 master-0 kubenswrapper[18707]: I0320 08:55:52.472207 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4699dbdf-6z95l" event={"ID":"2f81c6b4-2412-41e5-9a3f-c98fed48445a","Type":"ContainerStarted","Data":"4649d5def556d152588b98a81a0614a6d47323d6036446b93a2ef94623528d29"}
Mar 20 08:55:52.495136 master-0 kubenswrapper[18707]: I0320 08:55:52.494936 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d4699dbdf-6z95l" podStartSLOduration=1.626432893 podStartE2EDuration="6.494920869s" podCreationTimestamp="2026-03-20 08:55:46 +0000 UTC" firstStartedPulling="2026-03-20 08:55:47.358549816 +0000 UTC m=+892.514730202" lastFinishedPulling="2026-03-20 08:55:52.227037822 +0000 UTC m=+897.383218178" observedRunningTime="2026-03-20 08:55:52.494379914 +0000 UTC m=+897.650560340" watchObservedRunningTime="2026-03-20 08:55:52.494920869 +0000 UTC m=+897.651101225"
Mar 20 08:55:55.971414 master-0 kubenswrapper[18707]: I0320 08:55:55.971328 18707 scope.go:117] "RemoveContainer" containerID="26d1f23f09ec46d0564e314771aedd57d50a0394449491bf05654764fec7468d"
Mar 20 08:55:55.994439 master-0 kubenswrapper[18707]: I0320 08:55:55.994389 18707 scope.go:117] "RemoveContainer" containerID="61bf9561fb6a2bfd80add2b9b814a82fa5954086996b7e4feb2d7aa26a526193"
Mar 20 08:55:56.819711 master-0 kubenswrapper[18707]: I0320 08:55:56.819610 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:56.819711 master-0 kubenswrapper[18707]: I0320 08:55:56.819707 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:56.829976 master-0 kubenswrapper[18707]: I0320 08:55:56.829389 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:57.542618 master-0 kubenswrapper[18707]: I0320 08:55:57.542545 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 08:55:57.689266 master-0 kubenswrapper[18707]: I0320 08:55:57.686221 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58857c5dc9-9wf6k"]
Mar 20 08:55:57.689266 master-0 kubenswrapper[18707]: E0320 08:55:57.687658 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[console-serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-console/console-58857c5dc9-9wf6k" podUID="4a7bfb28-46ef-431e-8f40-da7e723613d6"
Mar 20 08:55:58.543882 master-0 kubenswrapper[18707]: I0320 08:55:58.543821 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:55:58.576257 master-0 kubenswrapper[18707]: I0320 08:55:58.576093 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:55:58.627117 master-0 kubenswrapper[18707]: I0320 08:55:58.626996 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-trusted-ca-bundle\") pod \"4a7bfb28-46ef-431e-8f40-da7e723613d6\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") "
Mar 20 08:55:58.627370 master-0 kubenswrapper[18707]: I0320 08:55:58.627246 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-oauth-config\") pod \"4a7bfb28-46ef-431e-8f40-da7e723613d6\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") "
Mar 20 08:55:58.627370 master-0 kubenswrapper[18707]: I0320 08:55:58.627281 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-service-ca\") pod \"4a7bfb28-46ef-431e-8f40-da7e723613d6\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") "
Mar 20 08:55:58.627370 master-0 kubenswrapper[18707]: I0320 08:55:58.627314 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-oauth-serving-cert\") pod \"4a7bfb28-46ef-431e-8f40-da7e723613d6\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") "
Mar 20 08:55:58.627370 master-0 kubenswrapper[18707]: I0320 08:55:58.627354 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzkd2\" (UniqueName: \"kubernetes.io/projected/4a7bfb28-46ef-431e-8f40-da7e723613d6-kube-api-access-dzkd2\") pod \"4a7bfb28-46ef-431e-8f40-da7e723613d6\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") "
Mar 20 08:55:58.627514 master-0 kubenswrapper[18707]: I0320 08:55:58.627388 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-config\") pod \"4a7bfb28-46ef-431e-8f40-da7e723613d6\" (UID: \"4a7bfb28-46ef-431e-8f40-da7e723613d6\") "
Mar 20 08:55:58.630267 master-0 kubenswrapper[18707]: I0320 08:55:58.630089 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4a7bfb28-46ef-431e-8f40-da7e723613d6" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:55:58.630888 master-0 kubenswrapper[18707]: I0320 08:55:58.630816 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4a7bfb28-46ef-431e-8f40-da7e723613d6" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:55:58.630888 master-0 kubenswrapper[18707]: I0320 08:55:58.630864 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-config" (OuterVolumeSpecName: "console-config") pod "4a7bfb28-46ef-431e-8f40-da7e723613d6" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:55:58.631159 master-0 kubenswrapper[18707]: I0320 08:55:58.630882 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-service-ca" (OuterVolumeSpecName: "service-ca") pod "4a7bfb28-46ef-431e-8f40-da7e723613d6" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 08:55:58.633789 master-0 kubenswrapper[18707]: I0320 08:55:58.633718 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a7bfb28-46ef-431e-8f40-da7e723613d6-kube-api-access-dzkd2" (OuterVolumeSpecName: "kube-api-access-dzkd2") pod "4a7bfb28-46ef-431e-8f40-da7e723613d6" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6"). InnerVolumeSpecName "kube-api-access-dzkd2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 08:55:58.634228 master-0 kubenswrapper[18707]: I0320 08:55:58.634138 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4a7bfb28-46ef-431e-8f40-da7e723613d6" (UID: "4a7bfb28-46ef-431e-8f40-da7e723613d6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 08:55:58.730533 master-0 kubenswrapper[18707]: I0320 08:55:58.730469 18707 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 20 08:55:58.730533 master-0 kubenswrapper[18707]: I0320 08:55:58.730520 18707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 20 08:55:58.730533 master-0 kubenswrapper[18707]: I0320 08:55:58.730534 18707 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 20 08:55:58.730533 master-0 kubenswrapper[18707]: I0320 08:55:58.730545 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzkd2\" (UniqueName: \"kubernetes.io/projected/4a7bfb28-46ef-431e-8f40-da7e723613d6-kube-api-access-dzkd2\") on node \"master-0\" DevicePath \"\""
Mar 20 08:55:58.731177 master-0 kubenswrapper[18707]: I0320 08:55:58.730558 18707 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-config\") on node \"master-0\" DevicePath \"\""
Mar 20 08:55:58.731177 master-0 kubenswrapper[18707]: I0320 08:55:58.730571 18707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a7bfb28-46ef-431e-8f40-da7e723613d6-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 08:55:59.551751 master-0 kubenswrapper[18707]: I0320 08:55:59.551672 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58857c5dc9-9wf6k"
Mar 20 08:55:59.611310 master-0 kubenswrapper[18707]: I0320 08:55:59.596309 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58857c5dc9-9wf6k"]
Mar 20 08:55:59.611310 master-0 kubenswrapper[18707]: I0320 08:55:59.601374 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58857c5dc9-9wf6k"]
Mar 20 08:55:59.649079 master-0 kubenswrapper[18707]: I0320 08:55:59.648956 18707 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a7bfb28-46ef-431e-8f40-da7e723613d6-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 20 08:56:01.109416 master-0 kubenswrapper[18707]: I0320 08:56:01.109346 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a7bfb28-46ef-431e-8f40-da7e723613d6" path="/var/lib/kubelet/pods/4a7bfb28-46ef-431e-8f40-da7e723613d6/volumes"
Mar 20 08:56:15.188022 master-0 kubenswrapper[18707]: I0320 08:56:15.187941 18707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 20 08:56:15.188724 master-0 kubenswrapper[18707]: I0320 08:56:15.188454 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7953905b6830b623394f78c614eeb251" containerName="kube-controller-manager" containerID="cri-o://d633def5644d2939cf76b3ff1ca50d431eefd60fe6e3c53fdac7aa954b91c3d0" gracePeriod=30
Mar 20 08:56:15.188724 master-0 kubenswrapper[18707]: I0320 08:56:15.188541 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7953905b6830b623394f78c614eeb251" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://eaf8f3bc5689088b8fa94827b5652f8c3849d77cca5fef2e6de82d175214a2f2" gracePeriod=30
Mar 20 08:56:15.188724 master-0 kubenswrapper[18707]: I0320 08:56:15.188577 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7953905b6830b623394f78c614eeb251" containerName="cluster-policy-controller" containerID="cri-o://eee3c019520db05254889557f747480d6daa08c439db039b3084cb22d52227b7" gracePeriod=30
Mar 20 08:56:15.188884 master-0 kubenswrapper[18707]: I0320 08:56:15.188637 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7953905b6830b623394f78c614eeb251" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://4f6ae1640c74b96a0df0629b9b63fe65ebaa1ff15f95802689673b759615b737" gracePeriod=30
Mar 20 08:56:15.190517 master-0 kubenswrapper[18707]: I0320 08:56:15.190321 18707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 20 08:56:15.190849 master-0 kubenswrapper[18707]: E0320 08:56:15.190814 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7953905b6830b623394f78c614eeb251" containerName="kube-controller-manager"
Mar 20 08:56:15.190849 master-0 kubenswrapper[18707]: I0320 08:56:15.190838 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7953905b6830b623394f78c614eeb251" containerName="kube-controller-manager"
Mar 20 08:56:15.190970 master-0 kubenswrapper[18707]: E0320 08:56:15.190852 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7953905b6830b623394f78c614eeb251" containerName="kube-controller-manager-recovery-controller"
Mar 20 08:56:15.190970 master-0 kubenswrapper[18707]: I0320 08:56:15.190864 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7953905b6830b623394f78c614eeb251" containerName="kube-controller-manager-recovery-controller"
Mar 20 08:56:15.190970 master-0 kubenswrapper[18707]: E0320 08:56:15.190898 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7953905b6830b623394f78c614eeb251" containerName="kube-controller-manager-cert-syncer"
Mar 20 08:56:15.190970 master-0 kubenswrapper[18707]: I0320 08:56:15.190912 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7953905b6830b623394f78c614eeb251" containerName="kube-controller-manager-cert-syncer"
Mar 20 08:56:15.190970 master-0 kubenswrapper[18707]: E0320 08:56:15.190926 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7953905b6830b623394f78c614eeb251" containerName="cluster-policy-controller"
Mar 20 08:56:15.190970 master-0 kubenswrapper[18707]: I0320 08:56:15.190939 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7953905b6830b623394f78c614eeb251" containerName="cluster-policy-controller"
Mar 20 08:56:15.191235 master-0 kubenswrapper[18707]: I0320 08:56:15.191135 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7953905b6830b623394f78c614eeb251" containerName="kube-controller-manager"
Mar 20 08:56:15.191235 master-0 kubenswrapper[18707]: I0320 08:56:15.191178 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7953905b6830b623394f78c614eeb251" containerName="kube-controller-manager-recovery-controller"
Mar 20 08:56:15.191235 master-0 kubenswrapper[18707]: I0320 08:56:15.191223 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7953905b6830b623394f78c614eeb251" containerName="cluster-policy-controller"
Mar 20 08:56:15.191932 master-0 kubenswrapper[18707]: I0320 08:56:15.191244 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7953905b6830b623394f78c614eeb251" containerName="kube-controller-manager-cert-syncer"
Mar 20 08:56:15.356286 master-0 kubenswrapper[18707]: I0320 08:56:15.356171 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0e7a3622f1a5180efe08fda88825b245-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0e7a3622f1a5180efe08fda88825b245\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:56:15.356619 master-0 kubenswrapper[18707]: I0320 08:56:15.356354 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0e7a3622f1a5180efe08fda88825b245-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0e7a3622f1a5180efe08fda88825b245\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:56:15.450447 master-0 kubenswrapper[18707]: I0320 08:56:15.450268 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_7953905b6830b623394f78c614eeb251/kube-controller-manager-cert-syncer/0.log"
Mar 20 08:56:15.451262 master-0 kubenswrapper[18707]: I0320 08:56:15.451138 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:56:15.454940 master-0 kubenswrapper[18707]: I0320 08:56:15.454874 18707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="7953905b6830b623394f78c614eeb251" podUID="0e7a3622f1a5180efe08fda88825b245"
Mar 20 08:56:15.458446 master-0 kubenswrapper[18707]: I0320 08:56:15.458160 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0e7a3622f1a5180efe08fda88825b245-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0e7a3622f1a5180efe08fda88825b245\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:56:15.458446 master-0 kubenswrapper[18707]: I0320 08:56:15.458303 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0e7a3622f1a5180efe08fda88825b245-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0e7a3622f1a5180efe08fda88825b245\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:56:15.458446 master-0 kubenswrapper[18707]: I0320 08:56:15.458359 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0e7a3622f1a5180efe08fda88825b245-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0e7a3622f1a5180efe08fda88825b245\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:56:15.458733 master-0 kubenswrapper[18707]: I0320 08:56:15.458451 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0e7a3622f1a5180efe08fda88825b245-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0e7a3622f1a5180efe08fda88825b245\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 20 08:56:15.559517 master-0 kubenswrapper[18707]: I0320 08:56:15.559285 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7953905b6830b623394f78c614eeb251-cert-dir\") pod \"7953905b6830b623394f78c614eeb251\" (UID: \"7953905b6830b623394f78c614eeb251\") "
Mar 20 08:56:15.559517 master-0 kubenswrapper[18707]: I0320 08:56:15.559344 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7953905b6830b623394f78c614eeb251-resource-dir\") pod \"7953905b6830b623394f78c614eeb251\" (UID: \"7953905b6830b623394f78c614eeb251\") "
Mar 20 08:56:15.559517 master-0 kubenswrapper[18707]: I0320 08:56:15.559435 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7953905b6830b623394f78c614eeb251-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "7953905b6830b623394f78c614eeb251" (UID: "7953905b6830b623394f78c614eeb251"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:56:15.559517 master-0 kubenswrapper[18707]: I0320 08:56:15.559448 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7953905b6830b623394f78c614eeb251-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "7953905b6830b623394f78c614eeb251" (UID: "7953905b6830b623394f78c614eeb251"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 08:56:15.560022 master-0 kubenswrapper[18707]: I0320 08:56:15.559816 18707 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7953905b6830b623394f78c614eeb251-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:56:15.560022 master-0 kubenswrapper[18707]: I0320 08:56:15.559834 18707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7953905b6830b623394f78c614eeb251-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 20 08:56:16.241908 master-0 kubenswrapper[18707]: I0320 08:56:16.241832 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_7953905b6830b623394f78c614eeb251/kube-controller-manager-cert-syncer/0.log"
Mar 20 08:56:16.244154 master-0 kubenswrapper[18707]: I0320 08:56:16.243438 18707 generic.go:334] "Generic (PLEG): container finished" podID="7953905b6830b623394f78c614eeb251" containerID="4f6ae1640c74b96a0df0629b9b63fe65ebaa1ff15f95802689673b759615b737" exitCode=0
Mar 20 08:56:16.244154 master-0 kubenswrapper[18707]: I0320 08:56:16.243497 18707 generic.go:334] "Generic (PLEG): container finished" podID="7953905b6830b623394f78c614eeb251" containerID="eaf8f3bc5689088b8fa94827b5652f8c3849d77cca5fef2e6de82d175214a2f2" exitCode=2
Mar 20 08:56:16.244154 master-0 kubenswrapper[18707]: I0320 08:56:16.243520 18707 generic.go:334] "Generic (PLEG): container finished" podID="7953905b6830b623394f78c614eeb251" containerID="eee3c019520db05254889557f747480d6daa08c439db039b3084cb22d52227b7" exitCode=0
Mar 20 08:56:16.244154 master-0 kubenswrapper[18707]: I0320 08:56:16.243544 18707 generic.go:334] "Generic (PLEG): container finished" podID="7953905b6830b623394f78c614eeb251" containerID="d633def5644d2939cf76b3ff1ca50d431eefd60fe6e3c53fdac7aa954b91c3d0" exitCode=0
Mar 20 08:56:16.244154 master-0
kubenswrapper[18707]: I0320 08:56:16.243598 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:56:16.244154 master-0 kubenswrapper[18707]: I0320 08:56:16.243656 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47a0bc110bb8e180c0d8aec527c63a230ff8e1ff6f1bcadeae851148a7655c28" Mar 20 08:56:16.247309 master-0 kubenswrapper[18707]: I0320 08:56:16.246826 18707 generic.go:334] "Generic (PLEG): container finished" podID="0846d28f-9252-4245-983c-c550050ba908" containerID="7fee174a4180167fb20f7d418c19729d334255ce2ad28368d26bbbf2ad567b06" exitCode=0 Mar 20 08:56:16.247309 master-0 kubenswrapper[18707]: I0320 08:56:16.246927 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"0846d28f-9252-4245-983c-c550050ba908","Type":"ContainerDied","Data":"7fee174a4180167fb20f7d418c19729d334255ce2ad28368d26bbbf2ad567b06"} Mar 20 08:56:16.250162 master-0 kubenswrapper[18707]: I0320 08:56:16.250080 18707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="7953905b6830b623394f78c614eeb251" podUID="0e7a3622f1a5180efe08fda88825b245" Mar 20 08:56:16.297342 master-0 kubenswrapper[18707]: I0320 08:56:16.297181 18707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="7953905b6830b623394f78c614eeb251" podUID="0e7a3622f1a5180efe08fda88825b245" Mar 20 08:56:17.104698 master-0 kubenswrapper[18707]: I0320 08:56:17.104622 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7953905b6830b623394f78c614eeb251" path="/var/lib/kubelet/pods/7953905b6830b623394f78c614eeb251/volumes" Mar 20 08:56:17.576387 master-0 
kubenswrapper[18707]: I0320 08:56:17.576336 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 20 08:56:17.698610 master-0 kubenswrapper[18707]: I0320 08:56:17.698477 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0846d28f-9252-4245-983c-c550050ba908-var-lock\") pod \"0846d28f-9252-4245-983c-c550050ba908\" (UID: \"0846d28f-9252-4245-983c-c550050ba908\") " Mar 20 08:56:17.698610 master-0 kubenswrapper[18707]: I0320 08:56:17.698586 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0846d28f-9252-4245-983c-c550050ba908-kube-api-access\") pod \"0846d28f-9252-4245-983c-c550050ba908\" (UID: \"0846d28f-9252-4245-983c-c550050ba908\") " Mar 20 08:56:17.698874 master-0 kubenswrapper[18707]: I0320 08:56:17.698606 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0846d28f-9252-4245-983c-c550050ba908-var-lock" (OuterVolumeSpecName: "var-lock") pod "0846d28f-9252-4245-983c-c550050ba908" (UID: "0846d28f-9252-4245-983c-c550050ba908"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:56:17.698874 master-0 kubenswrapper[18707]: I0320 08:56:17.698685 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0846d28f-9252-4245-983c-c550050ba908-kubelet-dir\") pod \"0846d28f-9252-4245-983c-c550050ba908\" (UID: \"0846d28f-9252-4245-983c-c550050ba908\") " Mar 20 08:56:17.698983 master-0 kubenswrapper[18707]: I0320 08:56:17.698864 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0846d28f-9252-4245-983c-c550050ba908-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0846d28f-9252-4245-983c-c550050ba908" (UID: "0846d28f-9252-4245-983c-c550050ba908"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:56:17.699826 master-0 kubenswrapper[18707]: I0320 08:56:17.699081 18707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0846d28f-9252-4245-983c-c550050ba908-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 20 08:56:17.699826 master-0 kubenswrapper[18707]: I0320 08:56:17.699122 18707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0846d28f-9252-4245-983c-c550050ba908-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 20 08:56:17.701858 master-0 kubenswrapper[18707]: I0320 08:56:17.701823 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0846d28f-9252-4245-983c-c550050ba908-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0846d28f-9252-4245-983c-c550050ba908" (UID: "0846d28f-9252-4245-983c-c550050ba908"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:56:17.800726 master-0 kubenswrapper[18707]: I0320 08:56:17.800679 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0846d28f-9252-4245-983c-c550050ba908-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 20 08:56:18.263791 master-0 kubenswrapper[18707]: I0320 08:56:18.263725 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"0846d28f-9252-4245-983c-c550050ba908","Type":"ContainerDied","Data":"8820b9345b8e68a7a3ce3bc3cdb83c253d5980f6260ed9a783f1c2c78e2afdab"} Mar 20 08:56:18.263791 master-0 kubenswrapper[18707]: I0320 08:56:18.263779 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8820b9345b8e68a7a3ce3bc3cdb83c253d5980f6260ed9a783f1c2c78e2afdab" Mar 20 08:56:18.264222 master-0 kubenswrapper[18707]: I0320 08:56:18.263794 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 20 08:56:29.094517 master-0 kubenswrapper[18707]: I0320 08:56:29.094432 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:56:29.175563 master-0 kubenswrapper[18707]: I0320 08:56:29.175488 18707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="ea46fad8-f097-40d7-8a06-d43d54fa0535" Mar 20 08:56:29.175563 master-0 kubenswrapper[18707]: I0320 08:56:29.175550 18707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="ea46fad8-f097-40d7-8a06-d43d54fa0535" Mar 20 08:56:29.210560 master-0 kubenswrapper[18707]: I0320 08:56:29.210502 18707 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:56:29.211003 master-0 kubenswrapper[18707]: I0320 08:56:29.210705 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 20 08:56:29.219580 master-0 kubenswrapper[18707]: I0320 08:56:29.219504 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 20 08:56:29.227817 master-0 kubenswrapper[18707]: I0320 08:56:29.227774 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:56:29.233938 master-0 kubenswrapper[18707]: I0320 08:56:29.233869 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 20 08:56:29.257600 master-0 kubenswrapper[18707]: W0320 08:56:29.257530 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e7a3622f1a5180efe08fda88825b245.slice/crio-78f0a885cd6e16aa7e71093c341c4fe9612e7a3380c46c84e0a9b823efb7304c WatchSource:0}: Error finding container 78f0a885cd6e16aa7e71093c341c4fe9612e7a3380c46c84e0a9b823efb7304c: Status 404 returned error can't find the container with id 78f0a885cd6e16aa7e71093c341c4fe9612e7a3380c46c84e0a9b823efb7304c Mar 20 08:56:29.369468 master-0 kubenswrapper[18707]: I0320 08:56:29.369329 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0e7a3622f1a5180efe08fda88825b245","Type":"ContainerStarted","Data":"78f0a885cd6e16aa7e71093c341c4fe9612e7a3380c46c84e0a9b823efb7304c"} Mar 20 08:56:30.381230 master-0 kubenswrapper[18707]: I0320 08:56:30.381068 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0e7a3622f1a5180efe08fda88825b245","Type":"ContainerStarted","Data":"594af09242a1e72de3485ad22c2b1371768b903c9d046d2eaf635b47d1596b20"} Mar 20 08:56:30.381230 master-0 kubenswrapper[18707]: I0320 08:56:30.381138 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0e7a3622f1a5180efe08fda88825b245","Type":"ContainerStarted","Data":"ffec567ba12b37334dad769c1fcd171e40cf7c8e978e65c62f94bb80904e6915"} Mar 20 08:56:30.381230 master-0 kubenswrapper[18707]: I0320 08:56:30.381151 18707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0e7a3622f1a5180efe08fda88825b245","Type":"ContainerStarted","Data":"b04cd1e974336e0f8a388d190b2e73a531ccf2e364edccba577c161f7ca6be7f"} Mar 20 08:56:30.381230 master-0 kubenswrapper[18707]: I0320 08:56:30.381213 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0e7a3622f1a5180efe08fda88825b245","Type":"ContainerStarted","Data":"c4be5b1afaa795d923771de5c6a9f6ff1e511f892a95ee852af0f37c42ad2e9a"} Mar 20 08:56:30.417824 master-0 kubenswrapper[18707]: I0320 08:56:30.417709 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=1.4176766889999999 podStartE2EDuration="1.417676689s" podCreationTimestamp="2026-03-20 08:56:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:56:30.411465571 +0000 UTC m=+935.567645917" watchObservedRunningTime="2026-03-20 08:56:30.417676689 +0000 UTC m=+935.573857075" Mar 20 08:56:39.229168 master-0 kubenswrapper[18707]: I0320 08:56:39.229080 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:56:39.230752 master-0 kubenswrapper[18707]: I0320 08:56:39.229315 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:56:39.230752 master-0 kubenswrapper[18707]: I0320 08:56:39.229689 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:56:39.230752 master-0 kubenswrapper[18707]: I0320 08:56:39.229724 18707 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:56:39.237660 master-0 kubenswrapper[18707]: I0320 08:56:39.237594 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:56:39.238238 master-0 kubenswrapper[18707]: I0320 08:56:39.238169 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:56:39.477057 master-0 kubenswrapper[18707]: I0320 08:56:39.476999 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:56:39.480585 master-0 kubenswrapper[18707]: I0320 08:56:39.480483 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 08:57:01.564666 master-0 kubenswrapper[18707]: I0320 08:57:01.564551 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-59477995f9-mk4kq"] Mar 20 08:57:01.565395 master-0 kubenswrapper[18707]: E0320 08:57:01.565376 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0846d28f-9252-4245-983c-c550050ba908" containerName="installer" Mar 20 08:57:01.565461 master-0 kubenswrapper[18707]: I0320 08:57:01.565401 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0846d28f-9252-4245-983c-c550050ba908" containerName="installer" Mar 20 08:57:01.565799 master-0 kubenswrapper[18707]: I0320 08:57:01.565761 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0846d28f-9252-4245-983c-c550050ba908" containerName="installer" Mar 20 08:57:01.566624 master-0 kubenswrapper[18707]: I0320 08:57:01.566583 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:01.573010 master-0 kubenswrapper[18707]: I0320 08:57:01.569633 18707 reflector.go:368] Caches populated for *v1.Secret from object-"sushy-emulator"/"os-client-config" Mar 20 08:57:01.573010 master-0 kubenswrapper[18707]: I0320 08:57:01.569859 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"kube-root-ca.crt" Mar 20 08:57:01.573010 master-0 kubenswrapper[18707]: I0320 08:57:01.571439 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"sushy-emulator-config" Mar 20 08:57:01.574066 master-0 kubenswrapper[18707]: I0320 08:57:01.573787 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"openshift-service-ca.crt" Mar 20 08:57:01.612033 master-0 kubenswrapper[18707]: I0320 08:57:01.610495 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-59477995f9-mk4kq"] Mar 20 08:57:01.712724 master-0 kubenswrapper[18707]: I0320 08:57:01.712636 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/f984a7e4-591c-40d2-8d60-390bb84559bb-os-client-config\") pod \"sushy-emulator-59477995f9-mk4kq\" (UID: \"f984a7e4-591c-40d2-8d60-390bb84559bb\") " pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:01.713683 master-0 kubenswrapper[18707]: I0320 08:57:01.713656 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8hw4\" (UniqueName: \"kubernetes.io/projected/f984a7e4-591c-40d2-8d60-390bb84559bb-kube-api-access-f8hw4\") pod \"sushy-emulator-59477995f9-mk4kq\" (UID: \"f984a7e4-591c-40d2-8d60-390bb84559bb\") " pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:01.713997 master-0 kubenswrapper[18707]: I0320 08:57:01.713940 18707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/f984a7e4-591c-40d2-8d60-390bb84559bb-sushy-emulator-config\") pod \"sushy-emulator-59477995f9-mk4kq\" (UID: \"f984a7e4-591c-40d2-8d60-390bb84559bb\") " pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:01.816149 master-0 kubenswrapper[18707]: I0320 08:57:01.815933 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8hw4\" (UniqueName: \"kubernetes.io/projected/f984a7e4-591c-40d2-8d60-390bb84559bb-kube-api-access-f8hw4\") pod \"sushy-emulator-59477995f9-mk4kq\" (UID: \"f984a7e4-591c-40d2-8d60-390bb84559bb\") " pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:01.816149 master-0 kubenswrapper[18707]: I0320 08:57:01.816036 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/f984a7e4-591c-40d2-8d60-390bb84559bb-sushy-emulator-config\") pod \"sushy-emulator-59477995f9-mk4kq\" (UID: \"f984a7e4-591c-40d2-8d60-390bb84559bb\") " pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:01.816149 master-0 kubenswrapper[18707]: I0320 08:57:01.816083 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/f984a7e4-591c-40d2-8d60-390bb84559bb-os-client-config\") pod \"sushy-emulator-59477995f9-mk4kq\" (UID: \"f984a7e4-591c-40d2-8d60-390bb84559bb\") " pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:01.817156 master-0 kubenswrapper[18707]: I0320 08:57:01.817113 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/f984a7e4-591c-40d2-8d60-390bb84559bb-sushy-emulator-config\") pod \"sushy-emulator-59477995f9-mk4kq\" (UID: \"f984a7e4-591c-40d2-8d60-390bb84559bb\") " 
pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:01.822344 master-0 kubenswrapper[18707]: I0320 08:57:01.822296 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/f984a7e4-591c-40d2-8d60-390bb84559bb-os-client-config\") pod \"sushy-emulator-59477995f9-mk4kq\" (UID: \"f984a7e4-591c-40d2-8d60-390bb84559bb\") " pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:01.833922 master-0 kubenswrapper[18707]: I0320 08:57:01.833891 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8hw4\" (UniqueName: \"kubernetes.io/projected/f984a7e4-591c-40d2-8d60-390bb84559bb-kube-api-access-f8hw4\") pod \"sushy-emulator-59477995f9-mk4kq\" (UID: \"f984a7e4-591c-40d2-8d60-390bb84559bb\") " pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:01.911548 master-0 kubenswrapper[18707]: I0320 08:57:01.911458 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:02.434680 master-0 kubenswrapper[18707]: W0320 08:57:02.434627 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf984a7e4_591c_40d2_8d60_390bb84559bb.slice/crio-9bc562e5696196c93b07edc8d06b499c8a0df8ef1ab51c8a0a3f9a0188ab3511 WatchSource:0}: Error finding container 9bc562e5696196c93b07edc8d06b499c8a0df8ef1ab51c8a0a3f9a0188ab3511: Status 404 returned error can't find the container with id 9bc562e5696196c93b07edc8d06b499c8a0df8ef1ab51c8a0a3f9a0188ab3511 Mar 20 08:57:02.435847 master-0 kubenswrapper[18707]: I0320 08:57:02.435251 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-59477995f9-mk4kq"] Mar 20 08:57:02.672994 master-0 kubenswrapper[18707]: I0320 08:57:02.672899 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" event={"ID":"f984a7e4-591c-40d2-8d60-390bb84559bb","Type":"ContainerStarted","Data":"9bc562e5696196c93b07edc8d06b499c8a0df8ef1ab51c8a0a3f9a0188ab3511"} Mar 20 08:57:11.511503 master-0 kubenswrapper[18707]: I0320 08:57:11.511424 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" event={"ID":"f984a7e4-591c-40d2-8d60-390bb84559bb","Type":"ContainerStarted","Data":"e576d9a42da586ea4d8cf7240030dada730d7af4d59cfeb7ed7f73da585062df"} Mar 20 08:57:11.544622 master-0 kubenswrapper[18707]: I0320 08:57:11.544512 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" podStartSLOduration=2.05463544 podStartE2EDuration="10.544493611s" podCreationTimestamp="2026-03-20 08:57:01 +0000 UTC" firstStartedPulling="2026-03-20 08:57:02.438303551 +0000 UTC m=+967.594483917" lastFinishedPulling="2026-03-20 08:57:10.928161732 +0000 UTC m=+976.084342088" 
observedRunningTime="2026-03-20 08:57:11.539057666 +0000 UTC m=+976.695238022" watchObservedRunningTime="2026-03-20 08:57:11.544493611 +0000 UTC m=+976.700673967" Mar 20 08:57:11.914479 master-0 kubenswrapper[18707]: I0320 08:57:11.914382 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:11.914821 master-0 kubenswrapper[18707]: I0320 08:57:11.914684 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:12.183209 master-0 kubenswrapper[18707]: I0320 08:57:12.182944 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:12.527632 master-0 kubenswrapper[18707]: I0320 08:57:12.527400 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-59477995f9-mk4kq" Mar 20 08:57:31.547718 master-0 kubenswrapper[18707]: I0320 08:57:31.543177 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-poller-557b677b49-vslxq"] Mar 20 08:57:31.547718 master-0 kubenswrapper[18707]: I0320 08:57:31.545151 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-557b677b49-vslxq" Mar 20 08:57:31.653736 master-0 kubenswrapper[18707]: I0320 08:57:31.650386 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-557b677b49-vslxq"] Mar 20 08:57:31.691976 master-0 kubenswrapper[18707]: I0320 08:57:31.691507 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cd97\" (UniqueName: \"kubernetes.io/projected/995783f4-f91b-4354-95e0-5454cd01b4de-kube-api-access-9cd97\") pod \"nova-console-poller-557b677b49-vslxq\" (UID: \"995783f4-f91b-4354-95e0-5454cd01b4de\") " pod="sushy-emulator/nova-console-poller-557b677b49-vslxq" Mar 20 08:57:31.691976 master-0 kubenswrapper[18707]: I0320 08:57:31.691726 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/995783f4-f91b-4354-95e0-5454cd01b4de-os-client-config\") pod \"nova-console-poller-557b677b49-vslxq\" (UID: \"995783f4-f91b-4354-95e0-5454cd01b4de\") " pod="sushy-emulator/nova-console-poller-557b677b49-vslxq" Mar 20 08:57:31.793230 master-0 kubenswrapper[18707]: I0320 08:57:31.793142 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/995783f4-f91b-4354-95e0-5454cd01b4de-os-client-config\") pod \"nova-console-poller-557b677b49-vslxq\" (UID: \"995783f4-f91b-4354-95e0-5454cd01b4de\") " pod="sushy-emulator/nova-console-poller-557b677b49-vslxq" Mar 20 08:57:31.793492 master-0 kubenswrapper[18707]: I0320 08:57:31.793278 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cd97\" (UniqueName: \"kubernetes.io/projected/995783f4-f91b-4354-95e0-5454cd01b4de-kube-api-access-9cd97\") pod \"nova-console-poller-557b677b49-vslxq\" (UID: \"995783f4-f91b-4354-95e0-5454cd01b4de\") " 
pod="sushy-emulator/nova-console-poller-557b677b49-vslxq" Mar 20 08:57:31.798762 master-0 kubenswrapper[18707]: I0320 08:57:31.798615 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/995783f4-f91b-4354-95e0-5454cd01b4de-os-client-config\") pod \"nova-console-poller-557b677b49-vslxq\" (UID: \"995783f4-f91b-4354-95e0-5454cd01b4de\") " pod="sushy-emulator/nova-console-poller-557b677b49-vslxq" Mar 20 08:57:31.812564 master-0 kubenswrapper[18707]: I0320 08:57:31.812503 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cd97\" (UniqueName: \"kubernetes.io/projected/995783f4-f91b-4354-95e0-5454cd01b4de-kube-api-access-9cd97\") pod \"nova-console-poller-557b677b49-vslxq\" (UID: \"995783f4-f91b-4354-95e0-5454cd01b4de\") " pod="sushy-emulator/nova-console-poller-557b677b49-vslxq" Mar 20 08:57:31.935303 master-0 kubenswrapper[18707]: I0320 08:57:31.935162 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-557b677b49-vslxq"
Mar 20 08:57:32.381737 master-0 kubenswrapper[18707]: I0320 08:57:32.381679 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-557b677b49-vslxq"]
Mar 20 08:57:32.387701 master-0 kubenswrapper[18707]: W0320 08:57:32.387660 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod995783f4_f91b_4354_95e0_5454cd01b4de.slice/crio-ecbd6294dd3b600175359196d62af8715eb19783c410180aaeabb36af31d7afb WatchSource:0}: Error finding container ecbd6294dd3b600175359196d62af8715eb19783c410180aaeabb36af31d7afb: Status 404 returned error can't find the container with id ecbd6294dd3b600175359196d62af8715eb19783c410180aaeabb36af31d7afb
Mar 20 08:57:32.722003 master-0 kubenswrapper[18707]: I0320 08:57:32.721807 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-557b677b49-vslxq" event={"ID":"995783f4-f91b-4354-95e0-5454cd01b4de","Type":"ContainerStarted","Data":"ecbd6294dd3b600175359196d62af8715eb19783c410180aaeabb36af31d7afb"}
Mar 20 08:57:38.784316 master-0 kubenswrapper[18707]: I0320 08:57:38.784260 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-557b677b49-vslxq" event={"ID":"995783f4-f91b-4354-95e0-5454cd01b4de","Type":"ContainerStarted","Data":"071ab23707f0e615b901c62feb26354450c85cb8907fa7e8759d065aeacdd4bd"}
Mar 20 08:57:38.784316 master-0 kubenswrapper[18707]: I0320 08:57:38.784311 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-557b677b49-vslxq" event={"ID":"995783f4-f91b-4354-95e0-5454cd01b4de","Type":"ContainerStarted","Data":"247fd18dd2f2fe31e8995a8f363c315c6df854e72671b71543a7b1a58c5b33a0"}
Mar 20 08:57:38.811947 master-0 kubenswrapper[18707]: I0320 08:57:38.811829 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/nova-console-poller-557b677b49-vslxq" podStartSLOduration=1.5843125420000002 podStartE2EDuration="7.811804243s" podCreationTimestamp="2026-03-20 08:57:31 +0000 UTC" firstStartedPulling="2026-03-20 08:57:32.390178655 +0000 UTC m=+997.546359011" lastFinishedPulling="2026-03-20 08:57:38.617670346 +0000 UTC m=+1003.773850712" observedRunningTime="2026-03-20 08:57:38.808157758 +0000 UTC m=+1003.964338114" watchObservedRunningTime="2026-03-20 08:57:38.811804243 +0000 UTC m=+1003.967984599"
Mar 20 08:57:56.076991 master-0 kubenswrapper[18707]: I0320 08:57:56.076898 18707 scope.go:117] "RemoveContainer" containerID="eaf8f3bc5689088b8fa94827b5652f8c3849d77cca5fef2e6de82d175214a2f2"
Mar 20 08:57:56.101555 master-0 kubenswrapper[18707]: I0320 08:57:56.101457 18707 scope.go:117] "RemoveContainer" containerID="d633def5644d2939cf76b3ff1ca50d431eefd60fe6e3c53fdac7aa954b91c3d0"
Mar 20 08:57:56.127685 master-0 kubenswrapper[18707]: I0320 08:57:56.127569 18707 scope.go:117] "RemoveContainer" containerID="eee3c019520db05254889557f747480d6daa08c439db039b3084cb22d52227b7"
Mar 20 08:57:56.151450 master-0 kubenswrapper[18707]: I0320 08:57:56.151372 18707 scope.go:117] "RemoveContainer" containerID="4f6ae1640c74b96a0df0629b9b63fe65ebaa1ff15f95802689673b759615b737"
Mar 20 08:58:03.781999 master-0 kubenswrapper[18707]: I0320 08:58:03.781895 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"]
Mar 20 08:58:03.783912 master-0 kubenswrapper[18707]: I0320 08:58:03.783857 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"
Mar 20 08:58:03.802926 master-0 kubenswrapper[18707]: I0320 08:58:03.802857 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"]
Mar 20 08:58:03.821428 master-0 kubenswrapper[18707]: I0320 08:58:03.821337 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/a3eb0a3c-0e6e-4296-9011-c8392a188903-os-client-config\") pod \"nova-console-recorder-6766d57b69-2wgj4\" (UID: \"a3eb0a3c-0e6e-4296-9011-c8392a188903\") " pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"
Mar 20 08:58:03.922776 master-0 kubenswrapper[18707]: I0320 08:58:03.922691 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b5cg\" (UniqueName: \"kubernetes.io/projected/a3eb0a3c-0e6e-4296-9011-c8392a188903-kube-api-access-2b5cg\") pod \"nova-console-recorder-6766d57b69-2wgj4\" (UID: \"a3eb0a3c-0e6e-4296-9011-c8392a188903\") " pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"
Mar 20 08:58:03.922776 master-0 kubenswrapper[18707]: I0320 08:58:03.922762 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/a3eb0a3c-0e6e-4296-9011-c8392a188903-nova-console-recordings-pv\") pod \"nova-console-recorder-6766d57b69-2wgj4\" (UID: \"a3eb0a3c-0e6e-4296-9011-c8392a188903\") " pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"
Mar 20 08:58:03.923069 master-0 kubenswrapper[18707]: I0320 08:58:03.923032 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/a3eb0a3c-0e6e-4296-9011-c8392a188903-os-client-config\") pod \"nova-console-recorder-6766d57b69-2wgj4\" (UID: \"a3eb0a3c-0e6e-4296-9011-c8392a188903\") " pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"
Mar 20 08:58:03.928277 master-0 kubenswrapper[18707]: I0320 08:58:03.928220 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/a3eb0a3c-0e6e-4296-9011-c8392a188903-os-client-config\") pod \"nova-console-recorder-6766d57b69-2wgj4\" (UID: \"a3eb0a3c-0e6e-4296-9011-c8392a188903\") " pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"
Mar 20 08:58:04.029272 master-0 kubenswrapper[18707]: I0320 08:58:04.029169 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b5cg\" (UniqueName: \"kubernetes.io/projected/a3eb0a3c-0e6e-4296-9011-c8392a188903-kube-api-access-2b5cg\") pod \"nova-console-recorder-6766d57b69-2wgj4\" (UID: \"a3eb0a3c-0e6e-4296-9011-c8392a188903\") " pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"
Mar 20 08:58:04.029465 master-0 kubenswrapper[18707]: I0320 08:58:04.029356 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/a3eb0a3c-0e6e-4296-9011-c8392a188903-nova-console-recordings-pv\") pod \"nova-console-recorder-6766d57b69-2wgj4\" (UID: \"a3eb0a3c-0e6e-4296-9011-c8392a188903\") " pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"
Mar 20 08:58:04.044942 master-0 kubenswrapper[18707]: I0320 08:58:04.044878 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b5cg\" (UniqueName: \"kubernetes.io/projected/a3eb0a3c-0e6e-4296-9011-c8392a188903-kube-api-access-2b5cg\") pod \"nova-console-recorder-6766d57b69-2wgj4\" (UID: \"a3eb0a3c-0e6e-4296-9011-c8392a188903\") " pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"
Mar 20 08:58:04.684516 master-0 kubenswrapper[18707]: I0320 08:58:04.684379 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/a3eb0a3c-0e6e-4296-9011-c8392a188903-nova-console-recordings-pv\") pod \"nova-console-recorder-6766d57b69-2wgj4\" (UID: \"a3eb0a3c-0e6e-4296-9011-c8392a188903\") " pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"
Mar 20 08:58:04.711588 master-0 kubenswrapper[18707]: I0320 08:58:04.711522 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"
Mar 20 08:58:05.172067 master-0 kubenswrapper[18707]: I0320 08:58:05.172002 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-6766d57b69-2wgj4"]
Mar 20 08:58:05.176428 master-0 kubenswrapper[18707]: W0320 08:58:05.176379 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3eb0a3c_0e6e_4296_9011_c8392a188903.slice/crio-536b2346e5e77e5378ecef638c59ea1981ce4052267ed6b5f359301b257ae31f WatchSource:0}: Error finding container 536b2346e5e77e5378ecef638c59ea1981ce4052267ed6b5f359301b257ae31f: Status 404 returned error can't find the container with id 536b2346e5e77e5378ecef638c59ea1981ce4052267ed6b5f359301b257ae31f
Mar 20 08:58:06.060727 master-0 kubenswrapper[18707]: I0320 08:58:06.060644 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4" event={"ID":"a3eb0a3c-0e6e-4296-9011-c8392a188903","Type":"ContainerStarted","Data":"536b2346e5e77e5378ecef638c59ea1981ce4052267ed6b5f359301b257ae31f"}
Mar 20 08:58:14.134851 master-0 kubenswrapper[18707]: I0320 08:58:14.133676 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4" event={"ID":"a3eb0a3c-0e6e-4296-9011-c8392a188903","Type":"ContainerStarted","Data":"9e04802e43f00e084d6d4f14555babf36e2ec665b922413d3aa7a7e29891e704"}
Mar 20 08:58:15.147575 master-0 kubenswrapper[18707]: I0320 08:58:15.147346 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4" event={"ID":"a3eb0a3c-0e6e-4296-9011-c8392a188903","Type":"ContainerStarted","Data":"2d9a1840fa009ed55652cfa8fa3c8d07917a3128e6170e51c52a9f2076bf0ea2"}
Mar 20 08:58:15.171591 master-0 kubenswrapper[18707]: I0320 08:58:15.171494 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/nova-console-recorder-6766d57b69-2wgj4" podStartSLOduration=3.1975957839999998 podStartE2EDuration="12.171473487s" podCreationTimestamp="2026-03-20 08:58:03 +0000 UTC" firstStartedPulling="2026-03-20 08:58:05.178890699 +0000 UTC m=+1030.335071045" lastFinishedPulling="2026-03-20 08:58:14.152768352 +0000 UTC m=+1039.308948748" observedRunningTime="2026-03-20 08:58:15.169855311 +0000 UTC m=+1040.326035667" watchObservedRunningTime="2026-03-20 08:58:15.171473487 +0000 UTC m=+1040.327653843"
Mar 20 08:58:47.008686 master-0 kubenswrapper[18707]: I0320 08:58:47.007167 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/metal3-c957566cd-lgms8"]
Mar 20 08:58:47.011122 master-0 kubenswrapper[18707]: I0320 08:58:47.011090 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.013578 master-0 kubenswrapper[18707]: I0320 08:58:47.013521 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"metal3-ironic-tls"
Mar 20 08:58:47.013815 master-0 kubenswrapper[18707]: I0320 08:58:47.013777 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"baremetal-operator-webhook-server-cert"
Mar 20 08:58:47.017954 master-0 kubenswrapper[18707]: I0320 08:58:47.017797 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"metal3-ironic-password"
Mar 20 08:58:47.020674 master-0 kubenswrapper[18707]: I0320 08:58:47.020619 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cbo-trusted-ca"
Mar 20 08:58:47.047430 master-0 kubenswrapper[18707]: I0320 08:58:47.047359 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-shared\" (UniqueName: \"kubernetes.io/empty-dir/66940526-3f1e-4610-9d67-acea8496a04d-metal3-shared\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.047430 master-0 kubenswrapper[18707]: I0320 08:58:47.047419 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66940526-3f1e-4610-9d67-acea8496a04d-trusted-ca\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.047842 master-0 kubenswrapper[18707]: I0320 08:58:47.047468 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-ironic-basic-auth\" (UniqueName: \"kubernetes.io/secret/66940526-3f1e-4610-9d67-acea8496a04d-metal3-ironic-basic-auth\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.047842 master-0 kubenswrapper[18707]: I0320 08:58:47.047505 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-vmedia-tls\" (UniqueName: \"kubernetes.io/secret/66940526-3f1e-4610-9d67-acea8496a04d-metal3-vmedia-tls\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.047842 master-0 kubenswrapper[18707]: I0320 08:58:47.047548 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-shared-image-cache\" (UniqueName: \"kubernetes.io/host-path/66940526-3f1e-4610-9d67-acea8496a04d-metal3-shared-image-cache\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.047842 master-0 kubenswrapper[18707]: I0320 08:58:47.047630 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7pgd\" (UniqueName: \"kubernetes.io/projected/66940526-3f1e-4610-9d67-acea8496a04d-kube-api-access-m7pgd\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.047842 master-0 kubenswrapper[18707]: I0320 08:58:47.047694 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/66940526-3f1e-4610-9d67-acea8496a04d-metal3-ironic-tls\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.149288 master-0 kubenswrapper[18707]: I0320 08:58:47.149128 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-shared\" (UniqueName: \"kubernetes.io/empty-dir/66940526-3f1e-4610-9d67-acea8496a04d-metal3-shared\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.149600 master-0 kubenswrapper[18707]: I0320 08:58:47.149372 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66940526-3f1e-4610-9d67-acea8496a04d-trusted-ca\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.149600 master-0 kubenswrapper[18707]: I0320 08:58:47.149432 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-ironic-basic-auth\" (UniqueName: \"kubernetes.io/secret/66940526-3f1e-4610-9d67-acea8496a04d-metal3-ironic-basic-auth\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.149600 master-0 kubenswrapper[18707]: I0320 08:58:47.149476 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-vmedia-tls\" (UniqueName: \"kubernetes.io/secret/66940526-3f1e-4610-9d67-acea8496a04d-metal3-vmedia-tls\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.149600 master-0 kubenswrapper[18707]: I0320 08:58:47.149531 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-shared-image-cache\" (UniqueName: \"kubernetes.io/host-path/66940526-3f1e-4610-9d67-acea8496a04d-metal3-shared-image-cache\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.149898 master-0 kubenswrapper[18707]: I0320 08:58:47.149617 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-shared\" (UniqueName: \"kubernetes.io/empty-dir/66940526-3f1e-4610-9d67-acea8496a04d-metal3-shared\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.150178 master-0 kubenswrapper[18707]: I0320 08:58:47.150136 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7pgd\" (UniqueName: \"kubernetes.io/projected/66940526-3f1e-4610-9d67-acea8496a04d-kube-api-access-m7pgd\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.150530 master-0 kubenswrapper[18707]: I0320 08:58:47.150358 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-shared-image-cache\" (UniqueName: \"kubernetes.io/host-path/66940526-3f1e-4610-9d67-acea8496a04d-metal3-shared-image-cache\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.150775 master-0 kubenswrapper[18707]: I0320 08:58:47.150742 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66940526-3f1e-4610-9d67-acea8496a04d-trusted-ca\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.150965 master-0 kubenswrapper[18707]: I0320 08:58:47.150932 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/66940526-3f1e-4610-9d67-acea8496a04d-metal3-ironic-tls\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.153294 master-0 kubenswrapper[18707]: I0320 08:58:47.153248 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-vmedia-tls\" (UniqueName: \"kubernetes.io/secret/66940526-3f1e-4610-9d67-acea8496a04d-metal3-vmedia-tls\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.153463 master-0 kubenswrapper[18707]: I0320 08:58:47.153407 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/66940526-3f1e-4610-9d67-acea8496a04d-metal3-ironic-tls\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.154215 master-0 kubenswrapper[18707]: I0320 08:58:47.154123 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-ironic-basic-auth\" (UniqueName: \"kubernetes.io/secret/66940526-3f1e-4610-9d67-acea8496a04d-metal3-ironic-basic-auth\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.181072 master-0 kubenswrapper[18707]: I0320 08:58:47.180999 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7pgd\" (UniqueName: \"kubernetes.io/projected/66940526-3f1e-4610-9d67-acea8496a04d-kube-api-access-m7pgd\") pod \"metal3-c957566cd-lgms8\" (UID: \"66940526-3f1e-4610-9d67-acea8496a04d\") " pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.345794 master-0 kubenswrapper[18707]: I0320 08:58:47.345722 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/metal3-c957566cd-lgms8"
Mar 20 08:58:47.406143 master-0 kubenswrapper[18707]: W0320 08:58:47.406095 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66940526_3f1e_4610_9d67_acea8496a04d.slice/crio-cd154a8ca5e31d6da7c68790581fcbc559ca14a4044e6708b8147d9057fc9601 WatchSource:0}: Error finding container cd154a8ca5e31d6da7c68790581fcbc559ca14a4044e6708b8147d9057fc9601: Status 404 returned error can't find the container with id cd154a8ca5e31d6da7c68790581fcbc559ca14a4044e6708b8147d9057fc9601
Mar 20 08:58:47.409533 master-0 kubenswrapper[18707]: I0320 08:58:47.409460 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"]
Mar 20 08:58:47.410709 master-0 kubenswrapper[18707]: I0320 08:58:47.410686 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.416384 master-0 kubenswrapper[18707]: I0320 08:58:47.416335 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"]
Mar 20 08:58:47.446389 master-0 kubenswrapper[18707]: I0320 08:58:47.445598 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-c957566cd-lgms8" event={"ID":"66940526-3f1e-4610-9d67-acea8496a04d","Type":"ContainerStarted","Data":"cd154a8ca5e31d6da7c68790581fcbc559ca14a4044e6708b8147d9057fc9601"}
Mar 20 08:58:47.455545 master-0 kubenswrapper[18707]: I0320 08:58:47.455487 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96xfp\" (UniqueName: \"kubernetes.io/projected/833281c3-2691-4d1b-852c-b02ab69e5f9f-kube-api-access-96xfp\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.455545 master-0 kubenswrapper[18707]: I0320 08:58:47.455541 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-ironic-basic-auth\" (UniqueName: \"kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-metal3-ironic-basic-auth\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.455831 master-0 kubenswrapper[18707]: I0320 08:58:47.455571 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-cert\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.455831 master-0 kubenswrapper[18707]: I0320 08:58:47.455765 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-metal3-ironic-tls\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.455967 master-0 kubenswrapper[18707]: I0320 08:58:47.455881 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/833281c3-2691-4d1b-852c-b02ab69e5f9f-trusted-ca\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.558024 master-0 kubenswrapper[18707]: I0320 08:58:47.557917 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96xfp\" (UniqueName: \"kubernetes.io/projected/833281c3-2691-4d1b-852c-b02ab69e5f9f-kube-api-access-96xfp\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.558024 master-0 kubenswrapper[18707]: I0320 08:58:47.558031 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-ironic-basic-auth\" (UniqueName: \"kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-metal3-ironic-basic-auth\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.558456 master-0 kubenswrapper[18707]: I0320 08:58:47.558232 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-cert\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.558456 master-0 kubenswrapper[18707]: I0320 08:58:47.558331 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-metal3-ironic-tls\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.558559 master-0 kubenswrapper[18707]: I0320 08:58:47.558453 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/833281c3-2691-4d1b-852c-b02ab69e5f9f-trusted-ca\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.558610 master-0 kubenswrapper[18707]: E0320 08:58:47.558572 18707 secret.go:189] Couldn't get secret openshift-machine-api/baremetal-operator-webhook-server-cert: secret "baremetal-operator-webhook-server-cert" not found
Mar 20 08:58:47.558689 master-0 kubenswrapper[18707]: E0320 08:58:47.558650 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-cert podName:833281c3-2691-4d1b-852c-b02ab69e5f9f nodeName:}" failed. No retries permitted until 2026-03-20 08:58:48.058627545 +0000 UTC m=+1073.214807911 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-cert") pod "metal3-baremetal-operator-78474bdc48-ndj7c" (UID: "833281c3-2691-4d1b-852c-b02ab69e5f9f") : secret "baremetal-operator-webhook-server-cert" not found
Mar 20 08:58:47.561265 master-0 kubenswrapper[18707]: I0320 08:58:47.561214 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/833281c3-2691-4d1b-852c-b02ab69e5f9f-trusted-ca\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.566375 master-0 kubenswrapper[18707]: I0320 08:58:47.566327 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-metal3-ironic-tls\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.567411 master-0 kubenswrapper[18707]: I0320 08:58:47.567363 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-ironic-basic-auth\" (UniqueName: \"kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-metal3-ironic-basic-auth\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:47.580541 master-0 kubenswrapper[18707]: I0320 08:58:47.580472 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96xfp\" (UniqueName: \"kubernetes.io/projected/833281c3-2691-4d1b-852c-b02ab69e5f9f-kube-api-access-96xfp\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:48.070896 master-0 kubenswrapper[18707]: I0320 08:58:48.070806 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-cert\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:48.071807 master-0 kubenswrapper[18707]: E0320 08:58:48.070957 18707 secret.go:189] Couldn't get secret openshift-machine-api/baremetal-operator-webhook-server-cert: secret "baremetal-operator-webhook-server-cert" not found
Mar 20 08:58:48.071807 master-0 kubenswrapper[18707]: E0320 08:58:48.071013 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-cert podName:833281c3-2691-4d1b-852c-b02ab69e5f9f nodeName:}" failed. No retries permitted until 2026-03-20 08:58:49.070997737 +0000 UTC m=+1074.227178093 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-cert") pod "metal3-baremetal-operator-78474bdc48-ndj7c" (UID: "833281c3-2691-4d1b-852c-b02ab69e5f9f") : secret "baremetal-operator-webhook-server-cert" not found
Mar 20 08:58:49.088432 master-0 kubenswrapper[18707]: I0320 08:58:49.088309 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-cert\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:49.093970 master-0 kubenswrapper[18707]: I0320 08:58:49.093905 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/833281c3-2691-4d1b-852c-b02ab69e5f9f-cert\") pod \"metal3-baremetal-operator-78474bdc48-ndj7c\" (UID: \"833281c3-2691-4d1b-852c-b02ab69e5f9f\") " pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:49.269198 master-0 kubenswrapper[18707]: I0320 08:58:49.269088 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"
Mar 20 08:58:49.760606 master-0 kubenswrapper[18707]: I0320 08:58:49.760533 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c"]
Mar 20 08:58:49.764355 master-0 kubenswrapper[18707]: W0320 08:58:49.764103 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod833281c3_2691_4d1b_852c_b02ab69e5f9f.slice/crio-8449ed95d6329cc85f58a7045ba5929527ae86c5cad3c700d4a91bd3bc4d12ca WatchSource:0}: Error finding container 8449ed95d6329cc85f58a7045ba5929527ae86c5cad3c700d4a91bd3bc4d12ca: Status 404 returned error can't find the container with id 8449ed95d6329cc85f58a7045ba5929527ae86c5cad3c700d4a91bd3bc4d12ca
Mar 20 08:58:50.117889 master-0 kubenswrapper[18707]: I0320 08:58:50.117831 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"]
Mar 20 08:58:50.120586 master-0 kubenswrapper[18707]: I0320 08:58:50.120276 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.122805 master-0 kubenswrapper[18707]: I0320 08:58:50.122718 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"pull-secret"
Mar 20 08:58:50.130768 master-0 kubenswrapper[18707]: I0320 08:58:50.130255 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"]
Mar 20 08:58:50.217195 master-0 kubenswrapper[18707]: I0320 08:58:50.217109 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-shared-image-cache\" (UniqueName: \"kubernetes.io/host-path/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-metal3-shared-image-cache\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.217195 master-0 kubenswrapper[18707]: I0320 08:58:50.217168 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ironic-agent-pull-secret\" (UniqueName: \"kubernetes.io/secret/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-ironic-agent-pull-secret\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.217483 master-0 kubenswrapper[18707]: I0320 08:58:50.217226 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-trusted-ca\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.217518 master-0 kubenswrapper[18707]: I0320 08:58:50.217464 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbd6w\" (UniqueName: \"kubernetes.io/projected/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-kube-api-access-jbd6w\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.217593 master-0 kubenswrapper[18707]: I0320 08:58:50.217563 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"user-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-user-ca-bundle\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.217641 master-0 kubenswrapper[18707]: I0320 08:58:50.217598 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-image-customization-volume\" (UniqueName: \"kubernetes.io/host-path/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-metal3-image-customization-volume\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.319640 master-0 kubenswrapper[18707]: I0320 08:58:50.319561 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-shared-image-cache\" (UniqueName: \"kubernetes.io/host-path/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-metal3-shared-image-cache\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.319640 master-0 kubenswrapper[18707]: I0320 08:58:50.319622 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ironic-agent-pull-secret\" (UniqueName: \"kubernetes.io/secret/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-ironic-agent-pull-secret\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.319991 master-0 kubenswrapper[18707]: I0320 08:58:50.319824 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-shared-image-cache\" (UniqueName: \"kubernetes.io/host-path/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-metal3-shared-image-cache\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.319991 master-0 kubenswrapper[18707]: I0320 08:58:50.319823 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-trusted-ca\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.320861 master-0 kubenswrapper[18707]: I0320 08:58:50.320342 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbd6w\" (UniqueName: \"kubernetes.io/projected/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-kube-api-access-jbd6w\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.320861 master-0 kubenswrapper[18707]: I0320 08:58:50.320453 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"user-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-user-ca-bundle\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.320861 master-0 kubenswrapper[18707]: I0320 08:58:50.320521 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-image-customization-volume\" (UniqueName: \"kubernetes.io/host-path/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-metal3-image-customization-volume\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.320861 master-0 kubenswrapper[18707]: I0320 08:58:50.320687 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-image-customization-volume\" (UniqueName: \"kubernetes.io/host-path/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-metal3-image-customization-volume\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.321095 master-0 kubenswrapper[18707]: I0320 08:58:50.320838 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"user-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-user-ca-bundle\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.322107 master-0 kubenswrapper[18707]: I0320 08:58:50.322070 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-trusted-ca\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"
Mar 20 08:58:50.336011 master-0 kubenswrapper[18707]: I0320
08:58:50.335941 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ironic-agent-pull-secret\" (UniqueName: \"kubernetes.io/secret/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-ironic-agent-pull-secret\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq" Mar 20 08:58:50.342346 master-0 kubenswrapper[18707]: I0320 08:58:50.342298 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbd6w\" (UniqueName: \"kubernetes.io/projected/a78acf22-ffe5-4b13-9ec6-a4b212930d6a-kube-api-access-jbd6w\") pod \"metal3-image-customization-7dd5d8c865-5ppkq\" (UID: \"a78acf22-ffe5-4b13-9ec6-a4b212930d6a\") " pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq" Mar 20 08:58:50.449084 master-0 kubenswrapper[18707]: I0320 08:58:50.448945 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq" Mar 20 08:58:50.483549 master-0 kubenswrapper[18707]: I0320 08:58:50.483485 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c" event={"ID":"833281c3-2691-4d1b-852c-b02ab69e5f9f","Type":"ContainerStarted","Data":"8449ed95d6329cc85f58a7045ba5929527ae86c5cad3c700d4a91bd3bc4d12ca"} Mar 20 08:58:50.983693 master-0 kubenswrapper[18707]: I0320 08:58:50.982823 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq"] Mar 20 08:58:51.114740 master-0 kubenswrapper[18707]: I0320 08:58:51.114656 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/ironic-proxy-28zbq"] Mar 20 08:58:51.116827 master-0 kubenswrapper[18707]: I0320 08:58:51.116798 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/ironic-proxy-28zbq" Mar 20 08:58:51.134818 master-0 kubenswrapper[18707]: I0320 08:58:51.134750 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx47l\" (UniqueName: \"kubernetes.io/projected/a94e327e-399f-403f-988c-9399f601171a-kube-api-access-kx47l\") pod \"ironic-proxy-28zbq\" (UID: \"a94e327e-399f-403f-988c-9399f601171a\") " pod="openshift-machine-api/ironic-proxy-28zbq" Mar 20 08:58:51.135701 master-0 kubenswrapper[18707]: I0320 08:58:51.134963 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a94e327e-399f-403f-988c-9399f601171a-trusted-ca\") pod \"ironic-proxy-28zbq\" (UID: \"a94e327e-399f-403f-988c-9399f601171a\") " pod="openshift-machine-api/ironic-proxy-28zbq" Mar 20 08:58:51.135701 master-0 kubenswrapper[18707]: I0320 08:58:51.134997 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/a94e327e-399f-403f-988c-9399f601171a-metal3-ironic-tls\") pod \"ironic-proxy-28zbq\" (UID: \"a94e327e-399f-403f-988c-9399f601171a\") " pod="openshift-machine-api/ironic-proxy-28zbq" Mar 20 08:58:51.237936 master-0 kubenswrapper[18707]: I0320 08:58:51.237511 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx47l\" (UniqueName: \"kubernetes.io/projected/a94e327e-399f-403f-988c-9399f601171a-kube-api-access-kx47l\") pod \"ironic-proxy-28zbq\" (UID: \"a94e327e-399f-403f-988c-9399f601171a\") " pod="openshift-machine-api/ironic-proxy-28zbq" Mar 20 08:58:51.237936 master-0 kubenswrapper[18707]: I0320 08:58:51.237732 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metal3-ironic-tls\" (UniqueName: 
\"kubernetes.io/secret/a94e327e-399f-403f-988c-9399f601171a-metal3-ironic-tls\") pod \"ironic-proxy-28zbq\" (UID: \"a94e327e-399f-403f-988c-9399f601171a\") " pod="openshift-machine-api/ironic-proxy-28zbq" Mar 20 08:58:51.237936 master-0 kubenswrapper[18707]: I0320 08:58:51.237756 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a94e327e-399f-403f-988c-9399f601171a-trusted-ca\") pod \"ironic-proxy-28zbq\" (UID: \"a94e327e-399f-403f-988c-9399f601171a\") " pod="openshift-machine-api/ironic-proxy-28zbq" Mar 20 08:58:51.239220 master-0 kubenswrapper[18707]: I0320 08:58:51.239159 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a94e327e-399f-403f-988c-9399f601171a-trusted-ca\") pod \"ironic-proxy-28zbq\" (UID: \"a94e327e-399f-403f-988c-9399f601171a\") " pod="openshift-machine-api/ironic-proxy-28zbq" Mar 20 08:58:51.244127 master-0 kubenswrapper[18707]: I0320 08:58:51.244064 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metal3-ironic-tls\" (UniqueName: \"kubernetes.io/secret/a94e327e-399f-403f-988c-9399f601171a-metal3-ironic-tls\") pod \"ironic-proxy-28zbq\" (UID: \"a94e327e-399f-403f-988c-9399f601171a\") " pod="openshift-machine-api/ironic-proxy-28zbq" Mar 20 08:58:51.258050 master-0 kubenswrapper[18707]: I0320 08:58:51.257934 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx47l\" (UniqueName: \"kubernetes.io/projected/a94e327e-399f-403f-988c-9399f601171a-kube-api-access-kx47l\") pod \"ironic-proxy-28zbq\" (UID: \"a94e327e-399f-403f-988c-9399f601171a\") " pod="openshift-machine-api/ironic-proxy-28zbq" Mar 20 08:58:51.451508 master-0 kubenswrapper[18707]: I0320 08:58:51.450931 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/ironic-proxy-28zbq" Mar 20 08:58:51.500402 master-0 kubenswrapper[18707]: I0320 08:58:51.499862 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq" event={"ID":"a78acf22-ffe5-4b13-9ec6-a4b212930d6a","Type":"ContainerStarted","Data":"418bf701a83765186b56963055f653cacf4185fcde349ca5acc524ee0a0c40da"} Mar 20 08:58:52.515805 master-0 kubenswrapper[18707]: I0320 08:58:52.515744 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/ironic-proxy-28zbq" event={"ID":"a94e327e-399f-403f-988c-9399f601171a","Type":"ContainerStarted","Data":"4ed2e9920baa2aed5798ab269402d9a795400f01f32d8e4b7c288acdf5a49db8"} Mar 20 08:58:53.536296 master-0 kubenswrapper[18707]: I0320 08:58:53.536181 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c" event={"ID":"833281c3-2691-4d1b-852c-b02ab69e5f9f","Type":"ContainerStarted","Data":"46d14571073011e493b10101177f3f8c7cda204db7876fcedecfce087026187b"} Mar 20 08:58:53.562158 master-0 kubenswrapper[18707]: I0320 08:58:53.562055 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/metal3-baremetal-operator-78474bdc48-ndj7c" podStartSLOduration=3.464828042 podStartE2EDuration="6.562032658s" podCreationTimestamp="2026-03-20 08:58:47 +0000 UTC" firstStartedPulling="2026-03-20 08:58:49.769315007 +0000 UTC m=+1074.925495363" lastFinishedPulling="2026-03-20 08:58:52.866519623 +0000 UTC m=+1078.022699979" observedRunningTime="2026-03-20 08:58:53.556575502 +0000 UTC m=+1078.712755878" watchObservedRunningTime="2026-03-20 08:58:53.562032658 +0000 UTC m=+1078.718213024" Mar 20 08:59:02.642492 master-0 kubenswrapper[18707]: I0320 08:59:02.642424 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/ironic-proxy-28zbq" 
event={"ID":"a94e327e-399f-403f-988c-9399f601171a","Type":"ContainerStarted","Data":"d7fdc125f5323bb4c3f0962205d1b3dad7bcca2b5a935dab07c8391e964d0e7e"} Mar 20 08:59:02.665591 master-0 kubenswrapper[18707]: I0320 08:59:02.665492 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/ironic-proxy-28zbq" podStartSLOduration=1.5743203540000001 podStartE2EDuration="11.665475149s" podCreationTimestamp="2026-03-20 08:58:51 +0000 UTC" firstStartedPulling="2026-03-20 08:58:51.488521416 +0000 UTC m=+1076.644701772" lastFinishedPulling="2026-03-20 08:59:01.579676211 +0000 UTC m=+1086.735856567" observedRunningTime="2026-03-20 08:59:02.660921709 +0000 UTC m=+1087.817102065" watchObservedRunningTime="2026-03-20 08:59:02.665475149 +0000 UTC m=+1087.821655505" Mar 20 08:59:22.810210 master-0 kubenswrapper[18707]: I0320 08:59:22.810126 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-image-customization-7dd5d8c865-5ppkq_a78acf22-ffe5-4b13-9ec6-a4b212930d6a/machine-os-images/0.log" Mar 20 08:59:22.810896 master-0 kubenswrapper[18707]: I0320 08:59:22.810330 18707 generic.go:334] "Generic (PLEG): container finished" podID="a78acf22-ffe5-4b13-9ec6-a4b212930d6a" containerID="590459a66daa605f16c1b0820e1a8d7c1f729dc6005e7adfb44486d338cf7535" exitCode=1 Mar 20 08:59:22.810896 master-0 kubenswrapper[18707]: I0320 08:59:22.810525 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq" event={"ID":"a78acf22-ffe5-4b13-9ec6-a4b212930d6a","Type":"ContainerDied","Data":"590459a66daa605f16c1b0820e1a8d7c1f729dc6005e7adfb44486d338cf7535"} Mar 20 08:59:22.814998 master-0 kubenswrapper[18707]: I0320 08:59:22.814934 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-c957566cd-lgms8" 
event={"ID":"66940526-3f1e-4610-9d67-acea8496a04d","Type":"ContainerStarted","Data":"8816ad105b5bad68dd32058052608cbc7b6133fab0c5c7ede99fe40a809fbe04"} Mar 20 08:59:23.837126 master-0 kubenswrapper[18707]: I0320 08:59:23.837041 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-image-customization-7dd5d8c865-5ppkq_a78acf22-ffe5-4b13-9ec6-a4b212930d6a/machine-os-images/0.log" Mar 20 08:59:23.839725 master-0 kubenswrapper[18707]: I0320 08:59:23.839625 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq" event={"ID":"a78acf22-ffe5-4b13-9ec6-a4b212930d6a","Type":"ContainerStarted","Data":"af2636d936cb81a7935b3dc84fb211c29269a2d465c409a8a6820d9154d60206"} Mar 20 08:59:24.847160 master-0 kubenswrapper[18707]: I0320 08:59:24.847075 18707 generic.go:334] "Generic (PLEG): container finished" podID="66940526-3f1e-4610-9d67-acea8496a04d" containerID="8816ad105b5bad68dd32058052608cbc7b6133fab0c5c7ede99fe40a809fbe04" exitCode=0 Mar 20 08:59:24.848245 master-0 kubenswrapper[18707]: I0320 08:59:24.848174 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-c957566cd-lgms8" event={"ID":"66940526-3f1e-4610-9d67-acea8496a04d","Type":"ContainerDied","Data":"8816ad105b5bad68dd32058052608cbc7b6133fab0c5c7ede99fe40a809fbe04"} Mar 20 08:59:26.867445 master-0 kubenswrapper[18707]: I0320 08:59:26.867400 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-image-customization-7dd5d8c865-5ppkq_a78acf22-ffe5-4b13-9ec6-a4b212930d6a/machine-os-images/1.log" Mar 20 08:59:26.867973 master-0 kubenswrapper[18707]: I0320 08:59:26.867943 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-image-customization-7dd5d8c865-5ppkq_a78acf22-ffe5-4b13-9ec6-a4b212930d6a/machine-os-images/0.log" Mar 20 08:59:26.868017 master-0 kubenswrapper[18707]: I0320 08:59:26.867985 18707 
generic.go:334] "Generic (PLEG): container finished" podID="a78acf22-ffe5-4b13-9ec6-a4b212930d6a" containerID="af2636d936cb81a7935b3dc84fb211c29269a2d465c409a8a6820d9154d60206" exitCode=1 Mar 20 08:59:26.868068 master-0 kubenswrapper[18707]: I0320 08:59:26.868045 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq" event={"ID":"a78acf22-ffe5-4b13-9ec6-a4b212930d6a","Type":"ContainerDied","Data":"af2636d936cb81a7935b3dc84fb211c29269a2d465c409a8a6820d9154d60206"} Mar 20 08:59:26.868103 master-0 kubenswrapper[18707]: I0320 08:59:26.868090 18707 scope.go:117] "RemoveContainer" containerID="590459a66daa605f16c1b0820e1a8d7c1f729dc6005e7adfb44486d338cf7535" Mar 20 08:59:26.870081 master-0 kubenswrapper[18707]: I0320 08:59:26.870036 18707 scope.go:117] "RemoveContainer" containerID="590459a66daa605f16c1b0820e1a8d7c1f729dc6005e7adfb44486d338cf7535" Mar 20 08:59:26.870756 master-0 kubenswrapper[18707]: I0320 08:59:26.870685 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-c957566cd-lgms8" event={"ID":"66940526-3f1e-4610-9d67-acea8496a04d","Type":"ContainerStarted","Data":"36a83cfb2eb1a3b43bd690cd5ad3fa88b09c580692252fdfc6f585aaa30c2860"} Mar 20 08:59:27.556908 master-0 kubenswrapper[18707]: E0320 08:59:27.555221 18707 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_machine-os-images_metal3-image-customization-7dd5d8c865-5ppkq_openshift-machine-api_a78acf22-ffe5-4b13-9ec6-a4b212930d6a_0 in pod sandbox 418bf701a83765186b56963055f653cacf4185fcde349ca5acc524ee0a0c40da from index: no such id: '590459a66daa605f16c1b0820e1a8d7c1f729dc6005e7adfb44486d338cf7535'" containerID="590459a66daa605f16c1b0820e1a8d7c1f729dc6005e7adfb44486d338cf7535" Mar 20 08:59:27.556908 master-0 kubenswrapper[18707]: E0320 08:59:27.555347 18707 kuberuntime_container.go:896] "Unhandled Error" err="failed to remove pod 
init container \"machine-os-images\": rpc error: code = Unknown desc = failed to delete container k8s_machine-os-images_metal3-image-customization-7dd5d8c865-5ppkq_openshift-machine-api_a78acf22-ffe5-4b13-9ec6-a4b212930d6a_0 in pod sandbox 418bf701a83765186b56963055f653cacf4185fcde349ca5acc524ee0a0c40da from index: no such id: '590459a66daa605f16c1b0820e1a8d7c1f729dc6005e7adfb44486d338cf7535'; Skipping pod \"metal3-image-customization-7dd5d8c865-5ppkq_openshift-machine-api(a78acf22-ffe5-4b13-9ec6-a4b212930d6a)\"" logger="UnhandledError" Mar 20 08:59:27.557278 master-0 kubenswrapper[18707]: E0320 08:59:27.557158 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-os-images\" with CrashLoopBackOff: \"back-off 10s restarting failed container=machine-os-images pod=metal3-image-customization-7dd5d8c865-5ppkq_openshift-machine-api(a78acf22-ffe5-4b13-9ec6-a4b212930d6a)\"" pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq" podUID="a78acf22-ffe5-4b13-9ec6-a4b212930d6a" Mar 20 08:59:27.890043 master-0 kubenswrapper[18707]: I0320 08:59:27.889870 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-c957566cd-lgms8" event={"ID":"66940526-3f1e-4610-9d67-acea8496a04d","Type":"ContainerStarted","Data":"2c155afab473e3acdaac2c07dc119cd65b3d38ad1aa3084f2f3aeff954d5269d"} Mar 20 08:59:27.892041 master-0 kubenswrapper[18707]: I0320 08:59:27.891991 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-image-customization-7dd5d8c865-5ppkq_a78acf22-ffe5-4b13-9ec6-a4b212930d6a/machine-os-images/1.log" Mar 20 08:59:29.924682 master-0 kubenswrapper[18707]: I0320 08:59:29.924560 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-c957566cd-lgms8" event={"ID":"66940526-3f1e-4610-9d67-acea8496a04d","Type":"ContainerStarted","Data":"cccd0ad038ec8de56997d4b0db2fb804c4190cdc588a49a030c102b8f6f0757f"} Mar 20 
08:59:30.238392 master-0 kubenswrapper[18707]: I0320 08:59:30.238169 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/metal3-c957566cd-lgms8" podStartSLOduration=9.985558303 podStartE2EDuration="44.238134614s" podCreationTimestamp="2026-03-20 08:58:46 +0000 UTC" firstStartedPulling="2026-03-20 08:58:47.409532855 +0000 UTC m=+1072.565713251" lastFinishedPulling="2026-03-20 08:59:21.662109195 +0000 UTC m=+1106.818289562" observedRunningTime="2026-03-20 08:59:30.229665492 +0000 UTC m=+1115.385845858" watchObservedRunningTime="2026-03-20 08:59:30.238134614 +0000 UTC m=+1115.394315000" Mar 20 08:59:38.335935 master-0 kubenswrapper[18707]: I0320 08:59:38.335884 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg"] Mar 20 08:59:38.339271 master-0 kubenswrapper[18707]: I0320 08:59:38.339232 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" Mar 20 08:59:38.348753 master-0 kubenswrapper[18707]: I0320 08:59:38.348630 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg"] Mar 20 08:59:38.364075 master-0 kubenswrapper[18707]: I0320 08:59:38.364000 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb2b48e9-577d-453b-8213-38b4835d6cff-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg\" (UID: \"cb2b48e9-577d-453b-8213-38b4835d6cff\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" Mar 20 08:59:38.364075 master-0 kubenswrapper[18707]: I0320 08:59:38.364066 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lj9nq\" (UniqueName: \"kubernetes.io/projected/cb2b48e9-577d-453b-8213-38b4835d6cff-kube-api-access-lj9nq\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg\" (UID: \"cb2b48e9-577d-453b-8213-38b4835d6cff\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" Mar 20 08:59:38.364333 master-0 kubenswrapper[18707]: I0320 08:59:38.364115 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb2b48e9-577d-453b-8213-38b4835d6cff-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg\" (UID: \"cb2b48e9-577d-453b-8213-38b4835d6cff\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" Mar 20 08:59:38.466147 master-0 kubenswrapper[18707]: I0320 08:59:38.466065 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb2b48e9-577d-453b-8213-38b4835d6cff-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg\" (UID: \"cb2b48e9-577d-453b-8213-38b4835d6cff\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" Mar 20 08:59:38.466478 master-0 kubenswrapper[18707]: I0320 08:59:38.466291 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb2b48e9-577d-453b-8213-38b4835d6cff-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg\" (UID: \"cb2b48e9-577d-453b-8213-38b4835d6cff\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" Mar 20 08:59:38.466478 master-0 kubenswrapper[18707]: I0320 08:59:38.466338 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj9nq\" (UniqueName: 
\"kubernetes.io/projected/cb2b48e9-577d-453b-8213-38b4835d6cff-kube-api-access-lj9nq\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg\" (UID: \"cb2b48e9-577d-453b-8213-38b4835d6cff\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" Mar 20 08:59:38.466896 master-0 kubenswrapper[18707]: I0320 08:59:38.466849 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb2b48e9-577d-453b-8213-38b4835d6cff-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg\" (UID: \"cb2b48e9-577d-453b-8213-38b4835d6cff\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" Mar 20 08:59:38.467094 master-0 kubenswrapper[18707]: I0320 08:59:38.467066 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb2b48e9-577d-453b-8213-38b4835d6cff-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg\" (UID: \"cb2b48e9-577d-453b-8213-38b4835d6cff\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" Mar 20 08:59:38.484304 master-0 kubenswrapper[18707]: I0320 08:59:38.483963 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj9nq\" (UniqueName: \"kubernetes.io/projected/cb2b48e9-577d-453b-8213-38b4835d6cff-kube-api-access-lj9nq\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg\" (UID: \"cb2b48e9-577d-453b-8213-38b4835d6cff\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" Mar 20 08:59:38.672431 master-0 kubenswrapper[18707]: I0320 08:59:38.672239 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" Mar 20 08:59:39.001739 master-0 kubenswrapper[18707]: I0320 08:59:39.001551 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_metal3-image-customization-7dd5d8c865-5ppkq_a78acf22-ffe5-4b13-9ec6-a4b212930d6a/machine-os-images/1.log" Mar 20 08:59:39.001739 master-0 kubenswrapper[18707]: I0320 08:59:39.001637 18707 generic.go:334] "Generic (PLEG): container finished" podID="a78acf22-ffe5-4b13-9ec6-a4b212930d6a" containerID="ec26c34b7287b8d2f79420e74d8932d2d33d757c53cc49366a982ce2f281d552" exitCode=0 Mar 20 08:59:39.001739 master-0 kubenswrapper[18707]: I0320 08:59:39.001682 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq" event={"ID":"a78acf22-ffe5-4b13-9ec6-a4b212930d6a","Type":"ContainerDied","Data":"ec26c34b7287b8d2f79420e74d8932d2d33d757c53cc49366a982ce2f281d552"} Mar 20 08:59:39.001739 master-0 kubenswrapper[18707]: I0320 08:59:39.001739 18707 scope.go:117] "RemoveContainer" containerID="af2636d936cb81a7935b3dc84fb211c29269a2d465c409a8a6820d9154d60206" Mar 20 08:59:39.003691 master-0 kubenswrapper[18707]: I0320 08:59:39.003629 18707 scope.go:117] "RemoveContainer" containerID="af2636d936cb81a7935b3dc84fb211c29269a2d465c409a8a6820d9154d60206" Mar 20 08:59:39.027243 master-0 kubenswrapper[18707]: E0320 08:59:39.027197 18707 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_machine-os-images_metal3-image-customization-7dd5d8c865-5ppkq_openshift-machine-api_a78acf22-ffe5-4b13-9ec6-a4b212930d6a_1 in pod sandbox 418bf701a83765186b56963055f653cacf4185fcde349ca5acc524ee0a0c40da from index: no such id: 'af2636d936cb81a7935b3dc84fb211c29269a2d465c409a8a6820d9154d60206'" containerID="af2636d936cb81a7935b3dc84fb211c29269a2d465c409a8a6820d9154d60206" Mar 20 08:59:39.027352 
master-0 kubenswrapper[18707]: E0320 08:59:39.027273 18707 kuberuntime_container.go:896] "Unhandled Error" err="failed to remove pod init container \"machine-os-images\": rpc error: code = Unknown desc = failed to delete container k8s_machine-os-images_metal3-image-customization-7dd5d8c865-5ppkq_openshift-machine-api_a78acf22-ffe5-4b13-9ec6-a4b212930d6a_1 in pod sandbox 418bf701a83765186b56963055f653cacf4185fcde349ca5acc524ee0a0c40da from index: no such id: 'af2636d936cb81a7935b3dc84fb211c29269a2d465c409a8a6820d9154d60206'; Skipping pod \"metal3-image-customization-7dd5d8c865-5ppkq_openshift-machine-api(a78acf22-ffe5-4b13-9ec6-a4b212930d6a)\"" logger="UnhandledError" Mar 20 08:59:39.110813 master-0 kubenswrapper[18707]: I0320 08:59:39.110730 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg"] Mar 20 08:59:39.110991 master-0 kubenswrapper[18707]: W0320 08:59:39.110926 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb2b48e9_577d_453b_8213_38b4835d6cff.slice/crio-0e8f4413f1ab08808fc50cee6a3f3f12ceb3b14a2b7e8298b619f29e668f1ca0 WatchSource:0}: Error finding container 0e8f4413f1ab08808fc50cee6a3f3f12ceb3b14a2b7e8298b619f29e668f1ca0: Status 404 returned error can't find the container with id 0e8f4413f1ab08808fc50cee6a3f3f12ceb3b14a2b7e8298b619f29e668f1ca0 Mar 20 08:59:40.018905 master-0 kubenswrapper[18707]: I0320 08:59:40.013922 18707 generic.go:334] "Generic (PLEG): container finished" podID="cb2b48e9-577d-453b-8213-38b4835d6cff" containerID="81f90b4f7bf8c0b6564c2108e0f07003ed809bdc5193960e0b2f7622b684728b" exitCode=0 Mar 20 08:59:40.018905 master-0 kubenswrapper[18707]: I0320 08:59:40.014033 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" 
event={"ID":"cb2b48e9-577d-453b-8213-38b4835d6cff","Type":"ContainerDied","Data":"81f90b4f7bf8c0b6564c2108e0f07003ed809bdc5193960e0b2f7622b684728b"} Mar 20 08:59:40.018905 master-0 kubenswrapper[18707]: I0320 08:59:40.014105 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" event={"ID":"cb2b48e9-577d-453b-8213-38b4835d6cff","Type":"ContainerStarted","Data":"0e8f4413f1ab08808fc50cee6a3f3f12ceb3b14a2b7e8298b619f29e668f1ca0"} Mar 20 08:59:42.049367 master-0 kubenswrapper[18707]: I0320 08:59:42.048594 18707 generic.go:334] "Generic (PLEG): container finished" podID="cb2b48e9-577d-453b-8213-38b4835d6cff" containerID="326fa8098a43af4d9775fb6ee6a466a06f162e8b88314bbfd0a356c7e8099425" exitCode=0 Mar 20 08:59:42.049367 master-0 kubenswrapper[18707]: I0320 08:59:42.048685 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" event={"ID":"cb2b48e9-577d-453b-8213-38b4835d6cff","Type":"ContainerDied","Data":"326fa8098a43af4d9775fb6ee6a466a06f162e8b88314bbfd0a356c7e8099425"} Mar 20 08:59:43.062805 master-0 kubenswrapper[18707]: I0320 08:59:43.062750 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" event={"ID":"cb2b48e9-577d-453b-8213-38b4835d6cff","Type":"ContainerStarted","Data":"f9a1f22209e0d5570354469d10a9a640ac098fc7ef19f3e22dbbe477e62d4504"} Mar 20 08:59:43.088372 master-0 kubenswrapper[18707]: I0320 08:59:43.088265 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" podStartSLOduration=3.949556849 podStartE2EDuration="5.088236088s" podCreationTimestamp="2026-03-20 08:59:38 +0000 UTC" firstStartedPulling="2026-03-20 08:59:40.018275021 +0000 UTC m=+1125.174455387" 
lastFinishedPulling="2026-03-20 08:59:41.15695426 +0000 UTC m=+1126.313134626" observedRunningTime="2026-03-20 08:59:43.079463577 +0000 UTC m=+1128.235643953" watchObservedRunningTime="2026-03-20 08:59:43.088236088 +0000 UTC m=+1128.244416444" Mar 20 08:59:44.077922 master-0 kubenswrapper[18707]: I0320 08:59:44.077382 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq" event={"ID":"a78acf22-ffe5-4b13-9ec6-a4b212930d6a","Type":"ContainerStarted","Data":"055f55516659102ae6912a7f2b285d4cf811f3d74a6f2982b6d28321afed28cb"} Mar 20 08:59:44.083221 master-0 kubenswrapper[18707]: I0320 08:59:44.083129 18707 generic.go:334] "Generic (PLEG): container finished" podID="cb2b48e9-577d-453b-8213-38b4835d6cff" containerID="f9a1f22209e0d5570354469d10a9a640ac098fc7ef19f3e22dbbe477e62d4504" exitCode=0 Mar 20 08:59:44.083414 master-0 kubenswrapper[18707]: I0320 08:59:44.083230 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" event={"ID":"cb2b48e9-577d-453b-8213-38b4835d6cff","Type":"ContainerDied","Data":"f9a1f22209e0d5570354469d10a9a640ac098fc7ef19f3e22dbbe477e62d4504"} Mar 20 08:59:44.124228 master-0 kubenswrapper[18707]: I0320 08:59:44.124015 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/metal3-image-customization-7dd5d8c865-5ppkq" podStartSLOduration=2.31382411 podStartE2EDuration="54.123979806s" podCreationTimestamp="2026-03-20 08:58:50 +0000 UTC" firstStartedPulling="2026-03-20 08:58:51.004506875 +0000 UTC m=+1076.160687241" lastFinishedPulling="2026-03-20 08:59:42.814662581 +0000 UTC m=+1127.970842937" observedRunningTime="2026-03-20 08:59:44.112539559 +0000 UTC m=+1129.268719955" watchObservedRunningTime="2026-03-20 08:59:44.123979806 +0000 UTC m=+1129.280160192" Mar 20 08:59:45.538809 master-0 kubenswrapper[18707]: I0320 08:59:45.538692 18707 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" Mar 20 08:59:45.712901 master-0 kubenswrapper[18707]: I0320 08:59:45.712816 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj9nq\" (UniqueName: \"kubernetes.io/projected/cb2b48e9-577d-453b-8213-38b4835d6cff-kube-api-access-lj9nq\") pod \"cb2b48e9-577d-453b-8213-38b4835d6cff\" (UID: \"cb2b48e9-577d-453b-8213-38b4835d6cff\") " Mar 20 08:59:45.713452 master-0 kubenswrapper[18707]: I0320 08:59:45.713216 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb2b48e9-577d-453b-8213-38b4835d6cff-util\") pod \"cb2b48e9-577d-453b-8213-38b4835d6cff\" (UID: \"cb2b48e9-577d-453b-8213-38b4835d6cff\") " Mar 20 08:59:45.713452 master-0 kubenswrapper[18707]: I0320 08:59:45.713415 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb2b48e9-577d-453b-8213-38b4835d6cff-bundle\") pod \"cb2b48e9-577d-453b-8213-38b4835d6cff\" (UID: \"cb2b48e9-577d-453b-8213-38b4835d6cff\") " Mar 20 08:59:45.715654 master-0 kubenswrapper[18707]: I0320 08:59:45.715516 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb2b48e9-577d-453b-8213-38b4835d6cff-bundle" (OuterVolumeSpecName: "bundle") pod "cb2b48e9-577d-453b-8213-38b4835d6cff" (UID: "cb2b48e9-577d-453b-8213-38b4835d6cff"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:59:45.718839 master-0 kubenswrapper[18707]: I0320 08:59:45.718764 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb2b48e9-577d-453b-8213-38b4835d6cff-kube-api-access-lj9nq" (OuterVolumeSpecName: "kube-api-access-lj9nq") pod "cb2b48e9-577d-453b-8213-38b4835d6cff" (UID: "cb2b48e9-577d-453b-8213-38b4835d6cff"). InnerVolumeSpecName "kube-api-access-lj9nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:59:45.741801 master-0 kubenswrapper[18707]: I0320 08:59:45.741696 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb2b48e9-577d-453b-8213-38b4835d6cff-util" (OuterVolumeSpecName: "util") pod "cb2b48e9-577d-453b-8213-38b4835d6cff" (UID: "cb2b48e9-577d-453b-8213-38b4835d6cff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:59:45.817434 master-0 kubenswrapper[18707]: I0320 08:59:45.816493 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj9nq\" (UniqueName: \"kubernetes.io/projected/cb2b48e9-577d-453b-8213-38b4835d6cff-kube-api-access-lj9nq\") on node \"master-0\" DevicePath \"\"" Mar 20 08:59:45.817434 master-0 kubenswrapper[18707]: I0320 08:59:45.816583 18707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb2b48e9-577d-453b-8213-38b4835d6cff-util\") on node \"master-0\" DevicePath \"\"" Mar 20 08:59:45.817434 master-0 kubenswrapper[18707]: I0320 08:59:45.816616 18707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb2b48e9-577d-453b-8213-38b4835d6cff-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 08:59:46.105388 master-0 kubenswrapper[18707]: I0320 08:59:46.105288 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" event={"ID":"cb2b48e9-577d-453b-8213-38b4835d6cff","Type":"ContainerDied","Data":"0e8f4413f1ab08808fc50cee6a3f3f12ceb3b14a2b7e8298b619f29e668f1ca0"} Mar 20 08:59:46.105388 master-0 kubenswrapper[18707]: I0320 08:59:46.105359 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e8f4413f1ab08808fc50cee6a3f3f12ceb3b14a2b7e8298b619f29e668f1ca0" Mar 20 08:59:46.105798 master-0 kubenswrapper[18707]: I0320 08:59:46.105416 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gkzrg" Mar 20 08:59:51.217783 master-0 kubenswrapper[18707]: I0320 08:59:51.217722 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-dd6b59969-zzdqx"] Mar 20 08:59:51.218520 master-0 kubenswrapper[18707]: E0320 08:59:51.218061 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb2b48e9-577d-453b-8213-38b4835d6cff" containerName="util" Mar 20 08:59:51.218520 master-0 kubenswrapper[18707]: I0320 08:59:51.218074 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb2b48e9-577d-453b-8213-38b4835d6cff" containerName="util" Mar 20 08:59:51.218520 master-0 kubenswrapper[18707]: E0320 08:59:51.218091 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb2b48e9-577d-453b-8213-38b4835d6cff" containerName="pull" Mar 20 08:59:51.218520 master-0 kubenswrapper[18707]: I0320 08:59:51.218097 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb2b48e9-577d-453b-8213-38b4835d6cff" containerName="pull" Mar 20 08:59:51.218520 master-0 kubenswrapper[18707]: E0320 08:59:51.218129 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb2b48e9-577d-453b-8213-38b4835d6cff" containerName="extract" Mar 20 08:59:51.218520 master-0 kubenswrapper[18707]: I0320 
08:59:51.218136 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb2b48e9-577d-453b-8213-38b4835d6cff" containerName="extract" Mar 20 08:59:51.218520 master-0 kubenswrapper[18707]: I0320 08:59:51.218348 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb2b48e9-577d-453b-8213-38b4835d6cff" containerName="extract" Mar 20 08:59:51.218989 master-0 kubenswrapper[18707]: I0320 08:59:51.218963 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.223252 master-0 kubenswrapper[18707]: I0320 08:59:51.220831 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert" Mar 20 08:59:51.223252 master-0 kubenswrapper[18707]: I0320 08:59:51.221127 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert" Mar 20 08:59:51.223252 master-0 kubenswrapper[18707]: I0320 08:59:51.221400 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt" Mar 20 08:59:51.223252 master-0 kubenswrapper[18707]: I0320 08:59:51.221534 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt" Mar 20 08:59:51.225657 master-0 kubenswrapper[18707]: I0320 08:59:51.225613 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert" Mar 20 08:59:51.236168 master-0 kubenswrapper[18707]: I0320 08:59:51.236096 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-dd6b59969-zzdqx"] Mar 20 08:59:51.325694 master-0 kubenswrapper[18707]: I0320 08:59:51.325618 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e78c8599-18be-4718-95ab-6386511ee275-metrics-cert\") pod 
\"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.325944 master-0 kubenswrapper[18707]: I0320 08:59:51.325762 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e78c8599-18be-4718-95ab-6386511ee275-apiservice-cert\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.325944 master-0 kubenswrapper[18707]: I0320 08:59:51.325795 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e78c8599-18be-4718-95ab-6386511ee275-webhook-cert\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.325944 master-0 kubenswrapper[18707]: I0320 08:59:51.325861 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e78c8599-18be-4718-95ab-6386511ee275-socket-dir\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.326242 master-0 kubenswrapper[18707]: I0320 08:59:51.326175 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbwtt\" (UniqueName: \"kubernetes.io/projected/e78c8599-18be-4718-95ab-6386511ee275-kube-api-access-cbwtt\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.427665 master-0 kubenswrapper[18707]: I0320 08:59:51.427607 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e78c8599-18be-4718-95ab-6386511ee275-socket-dir\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.427941 master-0 kubenswrapper[18707]: I0320 08:59:51.427926 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbwtt\" (UniqueName: \"kubernetes.io/projected/e78c8599-18be-4718-95ab-6386511ee275-kube-api-access-cbwtt\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.428038 master-0 kubenswrapper[18707]: I0320 08:59:51.428024 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e78c8599-18be-4718-95ab-6386511ee275-metrics-cert\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.428204 master-0 kubenswrapper[18707]: I0320 08:59:51.428176 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e78c8599-18be-4718-95ab-6386511ee275-apiservice-cert\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.428302 master-0 kubenswrapper[18707]: I0320 08:59:51.428256 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e78c8599-18be-4718-95ab-6386511ee275-socket-dir\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.428351 master-0 kubenswrapper[18707]: I0320 
08:59:51.428271 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e78c8599-18be-4718-95ab-6386511ee275-webhook-cert\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.431614 master-0 kubenswrapper[18707]: I0320 08:59:51.431572 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e78c8599-18be-4718-95ab-6386511ee275-metrics-cert\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.432907 master-0 kubenswrapper[18707]: I0320 08:59:51.432868 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e78c8599-18be-4718-95ab-6386511ee275-webhook-cert\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.433439 master-0 kubenswrapper[18707]: I0320 08:59:51.433363 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e78c8599-18be-4718-95ab-6386511ee275-apiservice-cert\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.448304 master-0 kubenswrapper[18707]: I0320 08:59:51.448229 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbwtt\" (UniqueName: \"kubernetes.io/projected/e78c8599-18be-4718-95ab-6386511ee275-kube-api-access-cbwtt\") pod \"lvms-operator-dd6b59969-zzdqx\" (UID: \"e78c8599-18be-4718-95ab-6386511ee275\") " pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.555105 master-0 
kubenswrapper[18707]: I0320 08:59:51.555049 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:51.995088 master-0 kubenswrapper[18707]: I0320 08:59:51.995027 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-dd6b59969-zzdqx"] Mar 20 08:59:52.002267 master-0 kubenswrapper[18707]: W0320 08:59:52.001765 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode78c8599_18be_4718_95ab_6386511ee275.slice/crio-c0f35354b1a2b62795174f4a437f860dbbd73a4311f0df6fa53ab756b25509f5 WatchSource:0}: Error finding container c0f35354b1a2b62795174f4a437f860dbbd73a4311f0df6fa53ab756b25509f5: Status 404 returned error can't find the container with id c0f35354b1a2b62795174f4a437f860dbbd73a4311f0df6fa53ab756b25509f5 Mar 20 08:59:52.163709 master-0 kubenswrapper[18707]: I0320 08:59:52.163625 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" event={"ID":"e78c8599-18be-4718-95ab-6386511ee275","Type":"ContainerStarted","Data":"c0f35354b1a2b62795174f4a437f860dbbd73a4311f0df6fa53ab756b25509f5"} Mar 20 08:59:57.207019 master-0 kubenswrapper[18707]: I0320 08:59:57.206926 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" event={"ID":"e78c8599-18be-4718-95ab-6386511ee275","Type":"ContainerStarted","Data":"6e64317efd7b63366a60af322c85cf4eeccb4a14c3af51011fcc94b484209afe"} Mar 20 08:59:57.207945 master-0 kubenswrapper[18707]: I0320 08:59:57.207438 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 20 08:59:57.211764 master-0 kubenswrapper[18707]: I0320 08:59:57.211713 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" Mar 
20 08:59:57.237075 master-0 kubenswrapper[18707]: I0320 08:59:57.236980 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-dd6b59969-zzdqx" podStartSLOduration=1.500975326 podStartE2EDuration="6.236956221s" podCreationTimestamp="2026-03-20 08:59:51 +0000 UTC" firstStartedPulling="2026-03-20 08:59:52.005712354 +0000 UTC m=+1137.161892700" lastFinishedPulling="2026-03-20 08:59:56.741693239 +0000 UTC m=+1141.897873595" observedRunningTime="2026-03-20 08:59:57.228338115 +0000 UTC m=+1142.384518511" watchObservedRunningTime="2026-03-20 08:59:57.236956221 +0000 UTC m=+1142.393136577" Mar 20 09:00:00.735561 master-0 kubenswrapper[18707]: I0320 09:00:00.735472 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf"] Mar 20 09:00:00.738252 master-0 kubenswrapper[18707]: I0320 09:00:00.738209 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" Mar 20 09:00:00.752603 master-0 kubenswrapper[18707]: I0320 09:00:00.752512 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf"] Mar 20 09:00:00.899837 master-0 kubenswrapper[18707]: I0320 09:00:00.899750 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b062733-8620-41cd-b96f-96083e9837d0-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf\" (UID: \"9b062733-8620-41cd-b96f-96083e9837d0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" Mar 20 09:00:00.900141 master-0 kubenswrapper[18707]: I0320 09:00:00.899898 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b062733-8620-41cd-b96f-96083e9837d0-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf\" (UID: \"9b062733-8620-41cd-b96f-96083e9837d0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" Mar 20 09:00:00.900730 master-0 kubenswrapper[18707]: I0320 09:00:00.900667 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-925bh\" (UniqueName: \"kubernetes.io/projected/9b062733-8620-41cd-b96f-96083e9837d0-kube-api-access-925bh\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf\" (UID: \"9b062733-8620-41cd-b96f-96083e9837d0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" Mar 20 09:00:01.003349 master-0 kubenswrapper[18707]: I0320 09:00:01.003141 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b062733-8620-41cd-b96f-96083e9837d0-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf\" (UID: \"9b062733-8620-41cd-b96f-96083e9837d0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" Mar 20 09:00:01.003629 master-0 kubenswrapper[18707]: I0320 09:00:01.003312 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b062733-8620-41cd-b96f-96083e9837d0-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf\" (UID: \"9b062733-8620-41cd-b96f-96083e9837d0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" Mar 20 09:00:01.003629 master-0 kubenswrapper[18707]: I0320 09:00:01.003574 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-925bh\" (UniqueName: 
\"kubernetes.io/projected/9b062733-8620-41cd-b96f-96083e9837d0-kube-api-access-925bh\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf\" (UID: \"9b062733-8620-41cd-b96f-96083e9837d0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" Mar 20 09:00:01.004827 master-0 kubenswrapper[18707]: I0320 09:00:01.004766 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b062733-8620-41cd-b96f-96083e9837d0-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf\" (UID: \"9b062733-8620-41cd-b96f-96083e9837d0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" Mar 20 09:00:01.005075 master-0 kubenswrapper[18707]: I0320 09:00:01.005009 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b062733-8620-41cd-b96f-96083e9837d0-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf\" (UID: \"9b062733-8620-41cd-b96f-96083e9837d0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" Mar 20 09:00:01.035144 master-0 kubenswrapper[18707]: I0320 09:00:01.035071 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-925bh\" (UniqueName: \"kubernetes.io/projected/9b062733-8620-41cd-b96f-96083e9837d0-kube-api-access-925bh\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf\" (UID: \"9b062733-8620-41cd-b96f-96083e9837d0\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" Mar 20 09:00:01.065274 master-0 kubenswrapper[18707]: I0320 09:00:01.065147 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" Mar 20 09:00:01.582036 master-0 kubenswrapper[18707]: W0320 09:00:01.581883 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b062733_8620_41cd_b96f_96083e9837d0.slice/crio-0b541f580c67b1e9f59800ccab19c39e1b06afd84f41be221409972b5f12c1b3 WatchSource:0}: Error finding container 0b541f580c67b1e9f59800ccab19c39e1b06afd84f41be221409972b5f12c1b3: Status 404 returned error can't find the container with id 0b541f580c67b1e9f59800ccab19c39e1b06afd84f41be221409972b5f12c1b3 Mar 20 09:00:01.585479 master-0 kubenswrapper[18707]: I0320 09:00:01.585432 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf"] Mar 20 09:00:02.254875 master-0 kubenswrapper[18707]: I0320 09:00:02.254747 18707 generic.go:334] "Generic (PLEG): container finished" podID="9b062733-8620-41cd-b96f-96083e9837d0" containerID="e68ac62a0b94766b928b3d0b29ea3875b5539c1223764784ed1d7ec22e51bed3" exitCode=0 Mar 20 09:00:02.254875 master-0 kubenswrapper[18707]: I0320 09:00:02.254815 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" event={"ID":"9b062733-8620-41cd-b96f-96083e9837d0","Type":"ContainerDied","Data":"e68ac62a0b94766b928b3d0b29ea3875b5539c1223764784ed1d7ec22e51bed3"} Mar 20 09:00:02.255631 master-0 kubenswrapper[18707]: I0320 09:00:02.254887 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" event={"ID":"9b062733-8620-41cd-b96f-96083e9837d0","Type":"ContainerStarted","Data":"0b541f580c67b1e9f59800ccab19c39e1b06afd84f41be221409972b5f12c1b3"} Mar 20 09:00:03.328910 master-0 kubenswrapper[18707]: I0320 09:00:03.328810 18707 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9"] Mar 20 09:00:03.332640 master-0 kubenswrapper[18707]: I0320 09:00:03.332589 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" Mar 20 09:00:03.347485 master-0 kubenswrapper[18707]: I0320 09:00:03.347389 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9"] Mar 20 09:00:03.351073 master-0 kubenswrapper[18707]: I0320 09:00:03.351006 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9\" (UID: \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" Mar 20 09:00:03.351177 master-0 kubenswrapper[18707]: I0320 09:00:03.351103 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7xcb\" (UniqueName: \"kubernetes.io/projected/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-kube-api-access-w7xcb\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9\" (UID: \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" Mar 20 09:00:03.351554 master-0 kubenswrapper[18707]: I0320 09:00:03.351499 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9\" (UID: 
\"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" Mar 20 09:00:03.453011 master-0 kubenswrapper[18707]: I0320 09:00:03.452935 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9\" (UID: \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" Mar 20 09:00:03.453273 master-0 kubenswrapper[18707]: I0320 09:00:03.453150 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7xcb\" (UniqueName: \"kubernetes.io/projected/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-kube-api-access-w7xcb\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9\" (UID: \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" Mar 20 09:00:03.453367 master-0 kubenswrapper[18707]: I0320 09:00:03.453331 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9\" (UID: \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" Mar 20 09:00:03.453923 master-0 kubenswrapper[18707]: I0320 09:00:03.453855 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9\" (UID: \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\") " 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" Mar 20 09:00:03.453989 master-0 kubenswrapper[18707]: I0320 09:00:03.453878 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9\" (UID: \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" Mar 20 09:00:03.478119 master-0 kubenswrapper[18707]: I0320 09:00:03.478061 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7xcb\" (UniqueName: \"kubernetes.io/projected/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-kube-api-access-w7xcb\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9\" (UID: \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" Mar 20 09:00:03.668693 master-0 kubenswrapper[18707]: I0320 09:00:03.668557 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" Mar 20 09:00:03.769918 master-0 kubenswrapper[18707]: I0320 09:00:03.766223 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm"] Mar 20 09:00:03.771880 master-0 kubenswrapper[18707]: I0320 09:00:03.771848 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" Mar 20 09:00:03.793430 master-0 kubenswrapper[18707]: I0320 09:00:03.793358 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm"] Mar 20 09:00:03.859023 master-0 kubenswrapper[18707]: I0320 09:00:03.858969 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxcld\" (UniqueName: \"kubernetes.io/projected/8882dc2b-511a-464d-8a19-76286dcc6feb-kube-api-access-cxcld\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm\" (UID: \"8882dc2b-511a-464d-8a19-76286dcc6feb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" Mar 20 09:00:03.859352 master-0 kubenswrapper[18707]: I0320 09:00:03.859290 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8882dc2b-511a-464d-8a19-76286dcc6feb-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm\" (UID: \"8882dc2b-511a-464d-8a19-76286dcc6feb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" Mar 20 09:00:03.859918 master-0 kubenswrapper[18707]: I0320 09:00:03.859867 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8882dc2b-511a-464d-8a19-76286dcc6feb-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm\" (UID: \"8882dc2b-511a-464d-8a19-76286dcc6feb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" Mar 20 09:00:03.960719 master-0 kubenswrapper[18707]: I0320 09:00:03.960555 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/8882dc2b-511a-464d-8a19-76286dcc6feb-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm\" (UID: \"8882dc2b-511a-464d-8a19-76286dcc6feb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" Mar 20 09:00:03.960719 master-0 kubenswrapper[18707]: I0320 09:00:03.960631 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxcld\" (UniqueName: \"kubernetes.io/projected/8882dc2b-511a-464d-8a19-76286dcc6feb-kube-api-access-cxcld\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm\" (UID: \"8882dc2b-511a-464d-8a19-76286dcc6feb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" Mar 20 09:00:03.961305 master-0 kubenswrapper[18707]: I0320 09:00:03.961243 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8882dc2b-511a-464d-8a19-76286dcc6feb-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm\" (UID: \"8882dc2b-511a-464d-8a19-76286dcc6feb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" Mar 20 09:00:03.962077 master-0 kubenswrapper[18707]: I0320 09:00:03.962017 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8882dc2b-511a-464d-8a19-76286dcc6feb-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm\" (UID: \"8882dc2b-511a-464d-8a19-76286dcc6feb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" Mar 20 09:00:03.963221 master-0 kubenswrapper[18707]: I0320 09:00:03.963162 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8882dc2b-511a-464d-8a19-76286dcc6feb-util\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm\" (UID: \"8882dc2b-511a-464d-8a19-76286dcc6feb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" Mar 20 09:00:03.980963 master-0 kubenswrapper[18707]: I0320 09:00:03.980915 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxcld\" (UniqueName: \"kubernetes.io/projected/8882dc2b-511a-464d-8a19-76286dcc6feb-kube-api-access-cxcld\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm\" (UID: \"8882dc2b-511a-464d-8a19-76286dcc6feb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" Mar 20 09:00:04.104541 master-0 kubenswrapper[18707]: I0320 09:00:04.104352 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" Mar 20 09:00:04.177915 master-0 kubenswrapper[18707]: I0320 09:00:04.177859 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9"] Mar 20 09:00:04.178011 master-0 kubenswrapper[18707]: W0320 09:00:04.177951 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b58d9e2_20c6_4741_a558_25f3ce37fd2f.slice/crio-3602eab013e0997d941a33da9623aae7bbcb2dcdee0bfcecbe70fa5434006357 WatchSource:0}: Error finding container 3602eab013e0997d941a33da9623aae7bbcb2dcdee0bfcecbe70fa5434006357: Status 404 returned error can't find the container with id 3602eab013e0997d941a33da9623aae7bbcb2dcdee0bfcecbe70fa5434006357 Mar 20 09:00:04.274026 master-0 kubenswrapper[18707]: I0320 09:00:04.273955 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" 
event={"ID":"9b58d9e2-20c6-4741-a558-25f3ce37fd2f","Type":"ContainerStarted","Data":"3602eab013e0997d941a33da9623aae7bbcb2dcdee0bfcecbe70fa5434006357"} Mar 20 09:00:04.589979 master-0 kubenswrapper[18707]: I0320 09:00:04.589894 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm"] Mar 20 09:00:04.601913 master-0 kubenswrapper[18707]: W0320 09:00:04.601636 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8882dc2b_511a_464d_8a19_76286dcc6feb.slice/crio-0a4cfb5b0edf93026d586b55090d6291720ba3057c65683a0b519379b19f298e WatchSource:0}: Error finding container 0a4cfb5b0edf93026d586b55090d6291720ba3057c65683a0b519379b19f298e: Status 404 returned error can't find the container with id 0a4cfb5b0edf93026d586b55090d6291720ba3057c65683a0b519379b19f298e Mar 20 09:00:05.284916 master-0 kubenswrapper[18707]: I0320 09:00:05.284379 18707 generic.go:334] "Generic (PLEG): container finished" podID="8882dc2b-511a-464d-8a19-76286dcc6feb" containerID="a093061bd7328a7658ae4a5666898335e59a7a56ef598912110766d60d447bbb" exitCode=0 Mar 20 09:00:05.284916 master-0 kubenswrapper[18707]: I0320 09:00:05.284454 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" event={"ID":"8882dc2b-511a-464d-8a19-76286dcc6feb","Type":"ContainerDied","Data":"a093061bd7328a7658ae4a5666898335e59a7a56ef598912110766d60d447bbb"} Mar 20 09:00:05.284916 master-0 kubenswrapper[18707]: I0320 09:00:05.284542 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" event={"ID":"8882dc2b-511a-464d-8a19-76286dcc6feb","Type":"ContainerStarted","Data":"0a4cfb5b0edf93026d586b55090d6291720ba3057c65683a0b519379b19f298e"} Mar 20 09:00:05.287403 master-0 
kubenswrapper[18707]: I0320 09:00:05.287097 18707 generic.go:334] "Generic (PLEG): container finished" podID="9b58d9e2-20c6-4741-a558-25f3ce37fd2f" containerID="5f97db8bbf5d73bd7c8e3f659738a5f60a2076d8862246a595c0360ac87733eb" exitCode=0 Mar 20 09:00:05.287403 master-0 kubenswrapper[18707]: I0320 09:00:05.287143 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" event={"ID":"9b58d9e2-20c6-4741-a558-25f3ce37fd2f","Type":"ContainerDied","Data":"5f97db8bbf5d73bd7c8e3f659738a5f60a2076d8862246a595c0360ac87733eb"} Mar 20 09:00:08.327109 master-0 kubenswrapper[18707]: I0320 09:00:08.327060 18707 generic.go:334] "Generic (PLEG): container finished" podID="9b062733-8620-41cd-b96f-96083e9837d0" containerID="ff190395d2b29cbf655b83c099b84972d6058256a1ccac5526a4af6cb1ae9d53" exitCode=0 Mar 20 09:00:08.328628 master-0 kubenswrapper[18707]: I0320 09:00:08.328141 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" event={"ID":"9b062733-8620-41cd-b96f-96083e9837d0","Type":"ContainerDied","Data":"ff190395d2b29cbf655b83c099b84972d6058256a1ccac5526a4af6cb1ae9d53"} Mar 20 09:00:08.944329 master-0 kubenswrapper[18707]: I0320 09:00:08.944179 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh"] Mar 20 09:00:08.945862 master-0 kubenswrapper[18707]: I0320 09:00:08.945829 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" Mar 20 09:00:08.951433 master-0 kubenswrapper[18707]: I0320 09:00:08.951393 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh"] Mar 20 09:00:08.973902 master-0 kubenswrapper[18707]: I0320 09:00:08.973809 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p52hx\" (UniqueName: \"kubernetes.io/projected/29f2b99e-0e7f-42be-887c-2579a6068b70-kube-api-access-p52hx\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh\" (UID: \"29f2b99e-0e7f-42be-887c-2579a6068b70\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" Mar 20 09:00:08.974160 master-0 kubenswrapper[18707]: I0320 09:00:08.973928 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29f2b99e-0e7f-42be-887c-2579a6068b70-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh\" (UID: \"29f2b99e-0e7f-42be-887c-2579a6068b70\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" Mar 20 09:00:08.974160 master-0 kubenswrapper[18707]: I0320 09:00:08.973975 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29f2b99e-0e7f-42be-887c-2579a6068b70-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh\" (UID: \"29f2b99e-0e7f-42be-887c-2579a6068b70\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" Mar 20 09:00:09.082205 master-0 kubenswrapper[18707]: I0320 09:00:09.075399 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p52hx\" (UniqueName: \"kubernetes.io/projected/29f2b99e-0e7f-42be-887c-2579a6068b70-kube-api-access-p52hx\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh\" (UID: \"29f2b99e-0e7f-42be-887c-2579a6068b70\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" Mar 20 09:00:09.082205 master-0 kubenswrapper[18707]: I0320 09:00:09.075523 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29f2b99e-0e7f-42be-887c-2579a6068b70-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh\" (UID: \"29f2b99e-0e7f-42be-887c-2579a6068b70\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" Mar 20 09:00:09.082205 master-0 kubenswrapper[18707]: I0320 09:00:09.075580 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29f2b99e-0e7f-42be-887c-2579a6068b70-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh\" (UID: \"29f2b99e-0e7f-42be-887c-2579a6068b70\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" Mar 20 09:00:09.082205 master-0 kubenswrapper[18707]: I0320 09:00:09.076025 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29f2b99e-0e7f-42be-887c-2579a6068b70-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh\" (UID: \"29f2b99e-0e7f-42be-887c-2579a6068b70\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" Mar 20 09:00:09.082205 master-0 kubenswrapper[18707]: I0320 09:00:09.076202 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29f2b99e-0e7f-42be-887c-2579a6068b70-bundle\") pod 
\"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh\" (UID: \"29f2b99e-0e7f-42be-887c-2579a6068b70\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" Mar 20 09:00:09.091716 master-0 kubenswrapper[18707]: I0320 09:00:09.091671 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p52hx\" (UniqueName: \"kubernetes.io/projected/29f2b99e-0e7f-42be-887c-2579a6068b70-kube-api-access-p52hx\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh\" (UID: \"29f2b99e-0e7f-42be-887c-2579a6068b70\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" Mar 20 09:00:09.282054 master-0 kubenswrapper[18707]: I0320 09:00:09.281988 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" Mar 20 09:00:09.344560 master-0 kubenswrapper[18707]: I0320 09:00:09.344488 18707 generic.go:334] "Generic (PLEG): container finished" podID="9b062733-8620-41cd-b96f-96083e9837d0" containerID="79785a449310805ef1cb3e6108a07170743d281dabbcac51275477e08bbe3481" exitCode=0 Mar 20 09:00:09.345168 master-0 kubenswrapper[18707]: I0320 09:00:09.344553 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" event={"ID":"9b062733-8620-41cd-b96f-96083e9837d0","Type":"ContainerDied","Data":"79785a449310805ef1cb3e6108a07170743d281dabbcac51275477e08bbe3481"} Mar 20 09:00:09.352081 master-0 kubenswrapper[18707]: I0320 09:00:09.351175 18707 generic.go:334] "Generic (PLEG): container finished" podID="8882dc2b-511a-464d-8a19-76286dcc6feb" containerID="934ee65706bd84c36c297b8f12ae26c28eab18d0c1d8a880fdc7dd07159881ff" exitCode=0 Mar 20 09:00:09.352081 master-0 kubenswrapper[18707]: I0320 09:00:09.351234 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" event={"ID":"8882dc2b-511a-464d-8a19-76286dcc6feb","Type":"ContainerDied","Data":"934ee65706bd84c36c297b8f12ae26c28eab18d0c1d8a880fdc7dd07159881ff"} Mar 20 09:00:09.704966 master-0 kubenswrapper[18707]: I0320 09:00:09.704901 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh"] Mar 20 09:00:09.713088 master-0 kubenswrapper[18707]: W0320 09:00:09.713029 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29f2b99e_0e7f_42be_887c_2579a6068b70.slice/crio-32178e8cabedb44145a78fcb5312bb8b94e000a40738cb3a21d4b7744651fc42 WatchSource:0}: Error finding container 32178e8cabedb44145a78fcb5312bb8b94e000a40738cb3a21d4b7744651fc42: Status 404 returned error can't find the container with id 32178e8cabedb44145a78fcb5312bb8b94e000a40738cb3a21d4b7744651fc42 Mar 20 09:00:10.360574 master-0 kubenswrapper[18707]: I0320 09:00:10.360506 18707 generic.go:334] "Generic (PLEG): container finished" podID="8882dc2b-511a-464d-8a19-76286dcc6feb" containerID="de0b0e2d77a1be6516e7c1f0aa16932412652236b4a81cb0d087ae224202f5cf" exitCode=0 Mar 20 09:00:10.360574 master-0 kubenswrapper[18707]: I0320 09:00:10.360585 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" event={"ID":"8882dc2b-511a-464d-8a19-76286dcc6feb","Type":"ContainerDied","Data":"de0b0e2d77a1be6516e7c1f0aa16932412652236b4a81cb0d087ae224202f5cf"} Mar 20 09:00:10.361972 master-0 kubenswrapper[18707]: I0320 09:00:10.361928 18707 generic.go:334] "Generic (PLEG): container finished" podID="29f2b99e-0e7f-42be-887c-2579a6068b70" containerID="9110a12030515677c202b34d3ce3ed1ae9284ce22e0d3c87be9b6aae02d0810d" exitCode=0 Mar 20 09:00:10.363496 master-0 kubenswrapper[18707]: I0320 
09:00:10.362151 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" event={"ID":"29f2b99e-0e7f-42be-887c-2579a6068b70","Type":"ContainerDied","Data":"9110a12030515677c202b34d3ce3ed1ae9284ce22e0d3c87be9b6aae02d0810d"} Mar 20 09:00:10.363496 master-0 kubenswrapper[18707]: I0320 09:00:10.362276 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" event={"ID":"29f2b99e-0e7f-42be-887c-2579a6068b70","Type":"ContainerStarted","Data":"32178e8cabedb44145a78fcb5312bb8b94e000a40738cb3a21d4b7744651fc42"} Mar 20 09:00:10.757492 master-0 kubenswrapper[18707]: I0320 09:00:10.757447 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" Mar 20 09:00:10.830569 master-0 kubenswrapper[18707]: I0320 09:00:10.830458 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b062733-8620-41cd-b96f-96083e9837d0-util\") pod \"9b062733-8620-41cd-b96f-96083e9837d0\" (UID: \"9b062733-8620-41cd-b96f-96083e9837d0\") " Mar 20 09:00:10.830838 master-0 kubenswrapper[18707]: I0320 09:00:10.830610 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b062733-8620-41cd-b96f-96083e9837d0-bundle\") pod \"9b062733-8620-41cd-b96f-96083e9837d0\" (UID: \"9b062733-8620-41cd-b96f-96083e9837d0\") " Mar 20 09:00:10.830838 master-0 kubenswrapper[18707]: I0320 09:00:10.830681 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-925bh\" (UniqueName: \"kubernetes.io/projected/9b062733-8620-41cd-b96f-96083e9837d0-kube-api-access-925bh\") pod \"9b062733-8620-41cd-b96f-96083e9837d0\" (UID: 
\"9b062733-8620-41cd-b96f-96083e9837d0\") " Mar 20 09:00:10.831878 master-0 kubenswrapper[18707]: I0320 09:00:10.831821 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b062733-8620-41cd-b96f-96083e9837d0-bundle" (OuterVolumeSpecName: "bundle") pod "9b062733-8620-41cd-b96f-96083e9837d0" (UID: "9b062733-8620-41cd-b96f-96083e9837d0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:00:10.834255 master-0 kubenswrapper[18707]: I0320 09:00:10.834124 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b062733-8620-41cd-b96f-96083e9837d0-kube-api-access-925bh" (OuterVolumeSpecName: "kube-api-access-925bh") pod "9b062733-8620-41cd-b96f-96083e9837d0" (UID: "9b062733-8620-41cd-b96f-96083e9837d0"). InnerVolumeSpecName "kube-api-access-925bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:00:10.842742 master-0 kubenswrapper[18707]: I0320 09:00:10.842671 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b062733-8620-41cd-b96f-96083e9837d0-util" (OuterVolumeSpecName: "util") pod "9b062733-8620-41cd-b96f-96083e9837d0" (UID: "9b062733-8620-41cd-b96f-96083e9837d0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:00:10.933435 master-0 kubenswrapper[18707]: I0320 09:00:10.933364 18707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b062733-8620-41cd-b96f-96083e9837d0-util\") on node \"master-0\" DevicePath \"\"" Mar 20 09:00:10.933435 master-0 kubenswrapper[18707]: I0320 09:00:10.933414 18707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b062733-8620-41cd-b96f-96083e9837d0-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:00:10.933435 master-0 kubenswrapper[18707]: I0320 09:00:10.933428 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-925bh\" (UniqueName: \"kubernetes.io/projected/9b062733-8620-41cd-b96f-96083e9837d0-kube-api-access-925bh\") on node \"master-0\" DevicePath \"\"" Mar 20 09:00:11.374444 master-0 kubenswrapper[18707]: I0320 09:00:11.374356 18707 generic.go:334] "Generic (PLEG): container finished" podID="9b58d9e2-20c6-4741-a558-25f3ce37fd2f" containerID="28270efa91c842b08ede71bfdf72c3d8efc2113eea0eaf1aa22c8bd95290ee6c" exitCode=0 Mar 20 09:00:11.375071 master-0 kubenswrapper[18707]: I0320 09:00:11.374492 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" event={"ID":"9b58d9e2-20c6-4741-a558-25f3ce37fd2f","Type":"ContainerDied","Data":"28270efa91c842b08ede71bfdf72c3d8efc2113eea0eaf1aa22c8bd95290ee6c"} Mar 20 09:00:11.377776 master-0 kubenswrapper[18707]: I0320 09:00:11.377737 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" Mar 20 09:00:11.377871 master-0 kubenswrapper[18707]: I0320 09:00:11.377772 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kp5jf" event={"ID":"9b062733-8620-41cd-b96f-96083e9837d0","Type":"ContainerDied","Data":"0b541f580c67b1e9f59800ccab19c39e1b06afd84f41be221409972b5f12c1b3"} Mar 20 09:00:11.377871 master-0 kubenswrapper[18707]: I0320 09:00:11.377843 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b541f580c67b1e9f59800ccab19c39e1b06afd84f41be221409972b5f12c1b3" Mar 20 09:00:11.674668 master-0 kubenswrapper[18707]: I0320 09:00:11.674412 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" Mar 20 09:00:11.749774 master-0 kubenswrapper[18707]: I0320 09:00:11.749686 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxcld\" (UniqueName: \"kubernetes.io/projected/8882dc2b-511a-464d-8a19-76286dcc6feb-kube-api-access-cxcld\") pod \"8882dc2b-511a-464d-8a19-76286dcc6feb\" (UID: \"8882dc2b-511a-464d-8a19-76286dcc6feb\") " Mar 20 09:00:11.750063 master-0 kubenswrapper[18707]: I0320 09:00:11.749867 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8882dc2b-511a-464d-8a19-76286dcc6feb-util\") pod \"8882dc2b-511a-464d-8a19-76286dcc6feb\" (UID: \"8882dc2b-511a-464d-8a19-76286dcc6feb\") " Mar 20 09:00:11.750063 master-0 kubenswrapper[18707]: I0320 09:00:11.749916 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8882dc2b-511a-464d-8a19-76286dcc6feb-bundle\") pod 
\"8882dc2b-511a-464d-8a19-76286dcc6feb\" (UID: \"8882dc2b-511a-464d-8a19-76286dcc6feb\") " Mar 20 09:00:11.750889 master-0 kubenswrapper[18707]: I0320 09:00:11.750830 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8882dc2b-511a-464d-8a19-76286dcc6feb-bundle" (OuterVolumeSpecName: "bundle") pod "8882dc2b-511a-464d-8a19-76286dcc6feb" (UID: "8882dc2b-511a-464d-8a19-76286dcc6feb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:00:11.751515 master-0 kubenswrapper[18707]: I0320 09:00:11.751472 18707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8882dc2b-511a-464d-8a19-76286dcc6feb-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:00:11.752946 master-0 kubenswrapper[18707]: I0320 09:00:11.752885 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8882dc2b-511a-464d-8a19-76286dcc6feb-kube-api-access-cxcld" (OuterVolumeSpecName: "kube-api-access-cxcld") pod "8882dc2b-511a-464d-8a19-76286dcc6feb" (UID: "8882dc2b-511a-464d-8a19-76286dcc6feb"). InnerVolumeSpecName "kube-api-access-cxcld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:00:11.760963 master-0 kubenswrapper[18707]: I0320 09:00:11.760908 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8882dc2b-511a-464d-8a19-76286dcc6feb-util" (OuterVolumeSpecName: "util") pod "8882dc2b-511a-464d-8a19-76286dcc6feb" (UID: "8882dc2b-511a-464d-8a19-76286dcc6feb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:00:11.853063 master-0 kubenswrapper[18707]: I0320 09:00:11.852974 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxcld\" (UniqueName: \"kubernetes.io/projected/8882dc2b-511a-464d-8a19-76286dcc6feb-kube-api-access-cxcld\") on node \"master-0\" DevicePath \"\"" Mar 20 09:00:11.853063 master-0 kubenswrapper[18707]: I0320 09:00:11.853053 18707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8882dc2b-511a-464d-8a19-76286dcc6feb-util\") on node \"master-0\" DevicePath \"\"" Mar 20 09:00:12.387818 master-0 kubenswrapper[18707]: I0320 09:00:12.387749 18707 generic.go:334] "Generic (PLEG): container finished" podID="9b58d9e2-20c6-4741-a558-25f3ce37fd2f" containerID="1b2df55b9d1425e18435e878f7c3007ccd572aa9fc44d0b6617047f3910e04a1" exitCode=0 Mar 20 09:00:12.388347 master-0 kubenswrapper[18707]: I0320 09:00:12.387828 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" event={"ID":"9b58d9e2-20c6-4741-a558-25f3ce37fd2f","Type":"ContainerDied","Data":"1b2df55b9d1425e18435e878f7c3007ccd572aa9fc44d0b6617047f3910e04a1"} Mar 20 09:00:12.390841 master-0 kubenswrapper[18707]: I0320 09:00:12.390787 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" event={"ID":"8882dc2b-511a-464d-8a19-76286dcc6feb","Type":"ContainerDied","Data":"0a4cfb5b0edf93026d586b55090d6291720ba3057c65683a0b519379b19f298e"} Mar 20 09:00:12.390938 master-0 kubenswrapper[18707]: I0320 09:00:12.390845 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a4cfb5b0edf93026d586b55090d6291720ba3057c65683a0b519379b19f298e" Mar 20 09:00:12.390938 master-0 kubenswrapper[18707]: I0320 09:00:12.390888 18707 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874g29cm" Mar 20 09:00:13.829328 master-0 kubenswrapper[18707]: I0320 09:00:13.828787 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" Mar 20 09:00:13.900919 master-0 kubenswrapper[18707]: I0320 09:00:13.900823 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7xcb\" (UniqueName: \"kubernetes.io/projected/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-kube-api-access-w7xcb\") pod \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\" (UID: \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\") " Mar 20 09:00:13.901153 master-0 kubenswrapper[18707]: I0320 09:00:13.901015 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-bundle\") pod \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\" (UID: \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\") " Mar 20 09:00:13.901153 master-0 kubenswrapper[18707]: I0320 09:00:13.901083 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-util\") pod \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\" (UID: \"9b58d9e2-20c6-4741-a558-25f3ce37fd2f\") " Mar 20 09:00:13.902548 master-0 kubenswrapper[18707]: I0320 09:00:13.902465 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-bundle" (OuterVolumeSpecName: "bundle") pod "9b58d9e2-20c6-4741-a558-25f3ce37fd2f" (UID: "9b58d9e2-20c6-4741-a558-25f3ce37fd2f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:00:13.903903 master-0 kubenswrapper[18707]: I0320 09:00:13.903809 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-kube-api-access-w7xcb" (OuterVolumeSpecName: "kube-api-access-w7xcb") pod "9b58d9e2-20c6-4741-a558-25f3ce37fd2f" (UID: "9b58d9e2-20c6-4741-a558-25f3ce37fd2f"). InnerVolumeSpecName "kube-api-access-w7xcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:00:13.908626 master-0 kubenswrapper[18707]: I0320 09:00:13.908393 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-util" (OuterVolumeSpecName: "util") pod "9b58d9e2-20c6-4741-a558-25f3ce37fd2f" (UID: "9b58d9e2-20c6-4741-a558-25f3ce37fd2f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:00:14.004912 master-0 kubenswrapper[18707]: I0320 09:00:14.004822 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7xcb\" (UniqueName: \"kubernetes.io/projected/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-kube-api-access-w7xcb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:00:14.004912 master-0 kubenswrapper[18707]: I0320 09:00:14.004892 18707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:00:14.004912 master-0 kubenswrapper[18707]: I0320 09:00:14.004911 18707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b58d9e2-20c6-4741-a558-25f3ce37fd2f-util\") on node \"master-0\" DevicePath \"\"" Mar 20 09:00:14.411821 master-0 kubenswrapper[18707]: I0320 09:00:14.411733 18707 generic.go:334] "Generic (PLEG): container finished" podID="29f2b99e-0e7f-42be-887c-2579a6068b70" 
containerID="f4b4ae95ad00272455e270f2c45a7d4c833a6dc7f03bb6caf04fac4b32c7d3dd" exitCode=0 Mar 20 09:00:14.411821 master-0 kubenswrapper[18707]: I0320 09:00:14.411835 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" event={"ID":"29f2b99e-0e7f-42be-887c-2579a6068b70","Type":"ContainerDied","Data":"f4b4ae95ad00272455e270f2c45a7d4c833a6dc7f03bb6caf04fac4b32c7d3dd"} Mar 20 09:00:14.419127 master-0 kubenswrapper[18707]: I0320 09:00:14.419080 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" event={"ID":"9b58d9e2-20c6-4741-a558-25f3ce37fd2f","Type":"ContainerDied","Data":"3602eab013e0997d941a33da9623aae7bbcb2dcdee0bfcecbe70fa5434006357"} Mar 20 09:00:14.419247 master-0 kubenswrapper[18707]: I0320 09:00:14.419136 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3602eab013e0997d941a33da9623aae7bbcb2dcdee0bfcecbe70fa5434006357" Mar 20 09:00:14.419331 master-0 kubenswrapper[18707]: I0320 09:00:14.419253 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1v54q9" Mar 20 09:00:14.965901 master-0 kubenswrapper[18707]: I0320 09:00:14.965842 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9"] Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: E0320 09:00:14.966215 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b062733-8620-41cd-b96f-96083e9837d0" containerName="pull" Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: I0320 09:00:14.966232 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b062733-8620-41cd-b96f-96083e9837d0" containerName="pull" Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: E0320 09:00:14.966250 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b58d9e2-20c6-4741-a558-25f3ce37fd2f" containerName="extract" Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: I0320 09:00:14.966258 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b58d9e2-20c6-4741-a558-25f3ce37fd2f" containerName="extract" Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: E0320 09:00:14.966278 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b58d9e2-20c6-4741-a558-25f3ce37fd2f" containerName="util" Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: I0320 09:00:14.966288 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b58d9e2-20c6-4741-a558-25f3ce37fd2f" containerName="util" Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: E0320 09:00:14.966299 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b58d9e2-20c6-4741-a558-25f3ce37fd2f" containerName="pull" Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: I0320 09:00:14.966307 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b58d9e2-20c6-4741-a558-25f3ce37fd2f" containerName="pull" Mar 20 09:00:14.966543 master-0 
kubenswrapper[18707]: E0320 09:00:14.966324 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8882dc2b-511a-464d-8a19-76286dcc6feb" containerName="extract"
Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: I0320 09:00:14.966333 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8882dc2b-511a-464d-8a19-76286dcc6feb" containerName="extract"
Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: E0320 09:00:14.966356 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b062733-8620-41cd-b96f-96083e9837d0" containerName="util"
Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: I0320 09:00:14.966364 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b062733-8620-41cd-b96f-96083e9837d0" containerName="util"
Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: E0320 09:00:14.966385 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8882dc2b-511a-464d-8a19-76286dcc6feb" containerName="util"
Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: I0320 09:00:14.966395 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8882dc2b-511a-464d-8a19-76286dcc6feb" containerName="util"
Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: E0320 09:00:14.966418 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8882dc2b-511a-464d-8a19-76286dcc6feb" containerName="pull"
Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: I0320 09:00:14.966426 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8882dc2b-511a-464d-8a19-76286dcc6feb" containerName="pull"
Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: E0320 09:00:14.966454 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b062733-8620-41cd-b96f-96083e9837d0" containerName="extract"
Mar 20 09:00:14.966543 master-0 kubenswrapper[18707]: I0320 09:00:14.966463 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b062733-8620-41cd-b96f-96083e9837d0" containerName="extract"
Mar 20 09:00:14.967312 master-0 kubenswrapper[18707]: I0320 09:00:14.966646 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b062733-8620-41cd-b96f-96083e9837d0" containerName="extract"
Mar 20 09:00:14.967312 master-0 kubenswrapper[18707]: I0320 09:00:14.966665 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b58d9e2-20c6-4741-a558-25f3ce37fd2f" containerName="extract"
Mar 20 09:00:14.967312 master-0 kubenswrapper[18707]: I0320 09:00:14.966696 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8882dc2b-511a-464d-8a19-76286dcc6feb" containerName="extract"
Mar 20 09:00:14.967420 master-0 kubenswrapper[18707]: I0320 09:00:14.967396 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9"
Mar 20 09:00:14.969330 master-0 kubenswrapper[18707]: I0320 09:00:14.969215 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Mar 20 09:00:14.969922 master-0 kubenswrapper[18707]: I0320 09:00:14.969885 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Mar 20 09:00:14.986341 master-0 kubenswrapper[18707]: I0320 09:00:14.986291 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9"]
Mar 20 09:00:15.028218 master-0 kubenswrapper[18707]: I0320 09:00:15.025595 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-9kvp9\" (UID: \"4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9"
Mar 20 09:00:15.028218 master-0 kubenswrapper[18707]: I0320 09:00:15.025696 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w95pr\" (UniqueName: \"kubernetes.io/projected/4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51-kube-api-access-w95pr\") pod \"cert-manager-operator-controller-manager-66c8bdd694-9kvp9\" (UID: \"4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9"
Mar 20 09:00:15.126730 master-0 kubenswrapper[18707]: I0320 09:00:15.126679 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w95pr\" (UniqueName: \"kubernetes.io/projected/4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51-kube-api-access-w95pr\") pod \"cert-manager-operator-controller-manager-66c8bdd694-9kvp9\" (UID: \"4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9"
Mar 20 09:00:15.127324 master-0 kubenswrapper[18707]: I0320 09:00:15.127301 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-9kvp9\" (UID: \"4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9"
Mar 20 09:00:15.127702 master-0 kubenswrapper[18707]: I0320 09:00:15.127682 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-9kvp9\" (UID: \"4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9"
Mar 20 09:00:15.148029 master-0 kubenswrapper[18707]: I0320 09:00:15.147932 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w95pr\" (UniqueName: \"kubernetes.io/projected/4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51-kube-api-access-w95pr\") pod \"cert-manager-operator-controller-manager-66c8bdd694-9kvp9\" (UID: \"4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9"
Mar 20 09:00:15.289406 master-0 kubenswrapper[18707]: I0320 09:00:15.289330 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9"
Mar 20 09:00:15.437110 master-0 kubenswrapper[18707]: I0320 09:00:15.437027 18707 generic.go:334] "Generic (PLEG): container finished" podID="29f2b99e-0e7f-42be-887c-2579a6068b70" containerID="8a12de6fc5b3ce5f6ba0ecc46ad8b7f12a32e4c49cc1d0303f91d7c4bdeed0de" exitCode=0
Mar 20 09:00:15.437110 master-0 kubenswrapper[18707]: I0320 09:00:15.437080 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" event={"ID":"29f2b99e-0e7f-42be-887c-2579a6068b70","Type":"ContainerDied","Data":"8a12de6fc5b3ce5f6ba0ecc46ad8b7f12a32e4c49cc1d0303f91d7c4bdeed0de"}
Mar 20 09:00:15.718843 master-0 kubenswrapper[18707]: I0320 09:00:15.718788 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9"]
Mar 20 09:00:16.447602 master-0 kubenswrapper[18707]: I0320 09:00:16.447541 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9" event={"ID":"4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51","Type":"ContainerStarted","Data":"5a05fabab9d3e00132df12e4f1229ae73737c597e8aba9fd8aedb6717e9426e6"}
Mar 20 09:00:16.895170 master-0 kubenswrapper[18707]: I0320 09:00:16.895134 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh"
Mar 20 09:00:16.968115 master-0 kubenswrapper[18707]: I0320 09:00:16.968034 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29f2b99e-0e7f-42be-887c-2579a6068b70-bundle\") pod \"29f2b99e-0e7f-42be-887c-2579a6068b70\" (UID: \"29f2b99e-0e7f-42be-887c-2579a6068b70\") "
Mar 20 09:00:16.968387 master-0 kubenswrapper[18707]: I0320 09:00:16.968295 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p52hx\" (UniqueName: \"kubernetes.io/projected/29f2b99e-0e7f-42be-887c-2579a6068b70-kube-api-access-p52hx\") pod \"29f2b99e-0e7f-42be-887c-2579a6068b70\" (UID: \"29f2b99e-0e7f-42be-887c-2579a6068b70\") "
Mar 20 09:00:16.968387 master-0 kubenswrapper[18707]: I0320 09:00:16.968372 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29f2b99e-0e7f-42be-887c-2579a6068b70-util\") pod \"29f2b99e-0e7f-42be-887c-2579a6068b70\" (UID: \"29f2b99e-0e7f-42be-887c-2579a6068b70\") "
Mar 20 09:00:16.977581 master-0 kubenswrapper[18707]: I0320 09:00:16.970403 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29f2b99e-0e7f-42be-887c-2579a6068b70-bundle" (OuterVolumeSpecName: "bundle") pod "29f2b99e-0e7f-42be-887c-2579a6068b70" (UID: "29f2b99e-0e7f-42be-887c-2579a6068b70"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:00:16.978329 master-0 kubenswrapper[18707]: I0320 09:00:16.978281 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f2b99e-0e7f-42be-887c-2579a6068b70-kube-api-access-p52hx" (OuterVolumeSpecName: "kube-api-access-p52hx") pod "29f2b99e-0e7f-42be-887c-2579a6068b70" (UID: "29f2b99e-0e7f-42be-887c-2579a6068b70"). InnerVolumeSpecName "kube-api-access-p52hx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:00:16.987456 master-0 kubenswrapper[18707]: I0320 09:00:16.985380 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29f2b99e-0e7f-42be-887c-2579a6068b70-util" (OuterVolumeSpecName: "util") pod "29f2b99e-0e7f-42be-887c-2579a6068b70" (UID: "29f2b99e-0e7f-42be-887c-2579a6068b70"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:00:17.070391 master-0 kubenswrapper[18707]: I0320 09:00:17.070348 18707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29f2b99e-0e7f-42be-887c-2579a6068b70-util\") on node \"master-0\" DevicePath \"\""
Mar 20 09:00:17.070656 master-0 kubenswrapper[18707]: I0320 09:00:17.070641 18707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29f2b99e-0e7f-42be-887c-2579a6068b70-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:00:17.070727 master-0 kubenswrapper[18707]: I0320 09:00:17.070715 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p52hx\" (UniqueName: \"kubernetes.io/projected/29f2b99e-0e7f-42be-887c-2579a6068b70-kube-api-access-p52hx\") on node \"master-0\" DevicePath \"\""
Mar 20 09:00:17.461513 master-0 kubenswrapper[18707]: I0320 09:00:17.461342 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh" event={"ID":"29f2b99e-0e7f-42be-887c-2579a6068b70","Type":"ContainerDied","Data":"32178e8cabedb44145a78fcb5312bb8b94e000a40738cb3a21d4b7744651fc42"}
Mar 20 09:00:17.461513 master-0 kubenswrapper[18707]: I0320 09:00:17.461394 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32178e8cabedb44145a78fcb5312bb8b94e000a40738cb3a21d4b7744651fc42"
Mar 20 09:00:17.461513 master-0 kubenswrapper[18707]: I0320 09:00:17.461420 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726rptjh"
Mar 20 09:00:19.482665 master-0 kubenswrapper[18707]: I0320 09:00:19.482609 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9" event={"ID":"4fc3bfad-6c92-4cd2-be22-c1d3a9c99b51","Type":"ContainerStarted","Data":"8c24cfa20a9614d474879d821f493ba5a73f6d019a7e01b6c5c3a4ffc2913467"}
Mar 20 09:00:19.507478 master-0 kubenswrapper[18707]: I0320 09:00:19.507381 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-9kvp9" podStartSLOduration=2.280614493 podStartE2EDuration="5.50735323s" podCreationTimestamp="2026-03-20 09:00:14 +0000 UTC" firstStartedPulling="2026-03-20 09:00:15.725645684 +0000 UTC m=+1160.881826040" lastFinishedPulling="2026-03-20 09:00:18.952384421 +0000 UTC m=+1164.108564777" observedRunningTime="2026-03-20 09:00:19.502149551 +0000 UTC m=+1164.658329907" watchObservedRunningTime="2026-03-20 09:00:19.50735323 +0000 UTC m=+1164.663533586"
Mar 20 09:00:23.331769 master-0 kubenswrapper[18707]: I0320 09:00:23.331691 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kkc9m"]
Mar 20 09:00:23.332460 master-0 kubenswrapper[18707]: E0320 09:00:23.332117 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f2b99e-0e7f-42be-887c-2579a6068b70" containerName="extract"
Mar 20 09:00:23.332460 master-0 kubenswrapper[18707]: I0320 09:00:23.332135 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f2b99e-0e7f-42be-887c-2579a6068b70" containerName="extract"
Mar 20 09:00:23.332460 master-0 kubenswrapper[18707]: E0320 09:00:23.332204 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f2b99e-0e7f-42be-887c-2579a6068b70" containerName="util"
Mar 20 09:00:23.332460 master-0 kubenswrapper[18707]: I0320 09:00:23.332213 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f2b99e-0e7f-42be-887c-2579a6068b70" containerName="util"
Mar 20 09:00:23.332460 master-0 kubenswrapper[18707]: E0320 09:00:23.332242 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f2b99e-0e7f-42be-887c-2579a6068b70" containerName="pull"
Mar 20 09:00:23.332460 master-0 kubenswrapper[18707]: I0320 09:00:23.332251 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f2b99e-0e7f-42be-887c-2579a6068b70" containerName="pull"
Mar 20 09:00:23.332460 master-0 kubenswrapper[18707]: I0320 09:00:23.332439 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f2b99e-0e7f-42be-887c-2579a6068b70" containerName="extract"
Mar 20 09:00:23.333031 master-0 kubenswrapper[18707]: I0320 09:00:23.332999 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-kkc9m"
Mar 20 09:00:23.335300 master-0 kubenswrapper[18707]: I0320 09:00:23.335269 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 20 09:00:23.335737 master-0 kubenswrapper[18707]: I0320 09:00:23.335691 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 20 09:00:23.343507 master-0 kubenswrapper[18707]: I0320 09:00:23.343317 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kkc9m"]
Mar 20 09:00:23.387327 master-0 kubenswrapper[18707]: I0320 09:00:23.386251 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kkc9m\" (UID: \"e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0\") " pod="cert-manager/cert-manager-webhook-6888856db4-kkc9m"
Mar 20 09:00:23.387327 master-0 kubenswrapper[18707]: I0320 09:00:23.386362 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrs6z\" (UniqueName: \"kubernetes.io/projected/e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0-kube-api-access-vrs6z\") pod \"cert-manager-webhook-6888856db4-kkc9m\" (UID: \"e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0\") " pod="cert-manager/cert-manager-webhook-6888856db4-kkc9m"
Mar 20 09:00:23.487857 master-0 kubenswrapper[18707]: I0320 09:00:23.487798 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrs6z\" (UniqueName: \"kubernetes.io/projected/e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0-kube-api-access-vrs6z\") pod \"cert-manager-webhook-6888856db4-kkc9m\" (UID: \"e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0\") " pod="cert-manager/cert-manager-webhook-6888856db4-kkc9m"
Mar 20 09:00:23.488130 master-0 kubenswrapper[18707]: I0320 09:00:23.487930 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kkc9m\" (UID: \"e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0\") " pod="cert-manager/cert-manager-webhook-6888856db4-kkc9m"
Mar 20 09:00:23.504640 master-0 kubenswrapper[18707]: I0320 09:00:23.504597 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrs6z\" (UniqueName: \"kubernetes.io/projected/e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0-kube-api-access-vrs6z\") pod \"cert-manager-webhook-6888856db4-kkc9m\" (UID: \"e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0\") " pod="cert-manager/cert-manager-webhook-6888856db4-kkc9m"
Mar 20 09:00:23.505055 master-0 kubenswrapper[18707]: I0320 09:00:23.505036 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kkc9m\" (UID: \"e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0\") " pod="cert-manager/cert-manager-webhook-6888856db4-kkc9m"
Mar 20 09:00:23.648169 master-0 kubenswrapper[18707]: I0320 09:00:23.648015 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-kkc9m"
Mar 20 09:00:24.031119 master-0 kubenswrapper[18707]: I0320 09:00:24.030171 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kkc9m"]
Mar 20 09:00:24.525784 master-0 kubenswrapper[18707]: I0320 09:00:24.525690 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-kkc9m" event={"ID":"e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0","Type":"ContainerStarted","Data":"046d76a32502b1323c9d855b3bb6f5888e852f024ea06333984307ce36713832"}
Mar 20 09:00:25.707805 master-0 kubenswrapper[18707]: I0320 09:00:25.699102 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-mtx97"]
Mar 20 09:00:25.707805 master-0 kubenswrapper[18707]: I0320 09:00:25.702085 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-mtx97"
Mar 20 09:00:25.719291 master-0 kubenswrapper[18707]: I0320 09:00:25.719228 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-mtx97"]
Mar 20 09:00:25.745241 master-0 kubenswrapper[18707]: I0320 09:00:25.741490 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtjmc\" (UniqueName: \"kubernetes.io/projected/75207b95-0950-4bc9-b345-698cf862d583-kube-api-access-gtjmc\") pod \"cert-manager-cainjector-5545bd876-mtx97\" (UID: \"75207b95-0950-4bc9-b345-698cf862d583\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtx97"
Mar 20 09:00:25.745241 master-0 kubenswrapper[18707]: I0320 09:00:25.741564 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75207b95-0950-4bc9-b345-698cf862d583-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-mtx97\" (UID: \"75207b95-0950-4bc9-b345-698cf862d583\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtx97"
Mar 20 09:00:25.843370 master-0 kubenswrapper[18707]: I0320 09:00:25.843315 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtjmc\" (UniqueName: \"kubernetes.io/projected/75207b95-0950-4bc9-b345-698cf862d583-kube-api-access-gtjmc\") pod \"cert-manager-cainjector-5545bd876-mtx97\" (UID: \"75207b95-0950-4bc9-b345-698cf862d583\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtx97"
Mar 20 09:00:25.843693 master-0 kubenswrapper[18707]: I0320 09:00:25.843666 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75207b95-0950-4bc9-b345-698cf862d583-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-mtx97\" (UID: \"75207b95-0950-4bc9-b345-698cf862d583\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtx97"
Mar 20 09:00:25.866924 master-0 kubenswrapper[18707]: I0320 09:00:25.865941 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtjmc\" (UniqueName: \"kubernetes.io/projected/75207b95-0950-4bc9-b345-698cf862d583-kube-api-access-gtjmc\") pod \"cert-manager-cainjector-5545bd876-mtx97\" (UID: \"75207b95-0950-4bc9-b345-698cf862d583\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtx97"
Mar 20 09:00:25.874270 master-0 kubenswrapper[18707]: I0320 09:00:25.874233 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75207b95-0950-4bc9-b345-698cf862d583-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-mtx97\" (UID: \"75207b95-0950-4bc9-b345-698cf862d583\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtx97"
Mar 20 09:00:26.058217 master-0 kubenswrapper[18707]: I0320 09:00:26.055696 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-mtx97"
Mar 20 09:00:26.315286 master-0 kubenswrapper[18707]: I0320 09:00:26.311273 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-mtx97"]
Mar 20 09:00:26.325540 master-0 kubenswrapper[18707]: W0320 09:00:26.325490 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75207b95_0950_4bc9_b345_698cf862d583.slice/crio-6d47de50aac5e49f755449d0ac2315afe94a406fd61d4cd0ee17ec6e23dd83d2 WatchSource:0}: Error finding container 6d47de50aac5e49f755449d0ac2315afe94a406fd61d4cd0ee17ec6e23dd83d2: Status 404 returned error can't find the container with id 6d47de50aac5e49f755449d0ac2315afe94a406fd61d4cd0ee17ec6e23dd83d2
Mar 20 09:00:26.544020 master-0 kubenswrapper[18707]: I0320 09:00:26.543946 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-mtx97" event={"ID":"75207b95-0950-4bc9-b345-698cf862d583","Type":"ContainerStarted","Data":"6d47de50aac5e49f755449d0ac2315afe94a406fd61d4cd0ee17ec6e23dd83d2"}
Mar 20 09:00:27.121258 master-0 kubenswrapper[18707]: I0320 09:00:27.118862 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-q2cgp"]
Mar 20 09:00:27.121258 master-0 kubenswrapper[18707]: I0320 09:00:27.120755 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-q2cgp"
Mar 20 09:00:27.125566 master-0 kubenswrapper[18707]: I0320 09:00:27.123211 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 20 09:00:27.125566 master-0 kubenswrapper[18707]: I0320 09:00:27.123375 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-q2cgp"]
Mar 20 09:00:27.133117 master-0 kubenswrapper[18707]: I0320 09:00:27.127825 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 20 09:00:27.164539 master-0 kubenswrapper[18707]: I0320 09:00:27.164471 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9drj7\" (UniqueName: \"kubernetes.io/projected/f55e2401-1fc9-4558-8ae1-5f539ee0a9cd-kube-api-access-9drj7\") pod \"nmstate-operator-796d4cfff4-q2cgp\" (UID: \"f55e2401-1fc9-4558-8ae1-5f539ee0a9cd\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-q2cgp"
Mar 20 09:00:27.266680 master-0 kubenswrapper[18707]: I0320 09:00:27.266633 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9drj7\" (UniqueName: \"kubernetes.io/projected/f55e2401-1fc9-4558-8ae1-5f539ee0a9cd-kube-api-access-9drj7\") pod \"nmstate-operator-796d4cfff4-q2cgp\" (UID: \"f55e2401-1fc9-4558-8ae1-5f539ee0a9cd\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-q2cgp"
Mar 20 09:00:27.284552 master-0 kubenswrapper[18707]: I0320 09:00:27.284485 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9drj7\" (UniqueName: \"kubernetes.io/projected/f55e2401-1fc9-4558-8ae1-5f539ee0a9cd-kube-api-access-9drj7\") pod \"nmstate-operator-796d4cfff4-q2cgp\" (UID: \"f55e2401-1fc9-4558-8ae1-5f539ee0a9cd\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-q2cgp"
Mar 20 09:00:27.450674 master-0 kubenswrapper[18707]: I0320 09:00:27.450136 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-q2cgp"
Mar 20 09:00:28.099602 master-0 kubenswrapper[18707]: I0320 09:00:28.098695 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-q2cgp"]
Mar 20 09:00:30.200915 master-0 kubenswrapper[18707]: W0320 09:00:30.200639 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf55e2401_1fc9_4558_8ae1_5f539ee0a9cd.slice/crio-381aa23985e3db10dd6249d14b2d8f1057d49b0d4b52d1a704f85ec432081e2f WatchSource:0}: Error finding container 381aa23985e3db10dd6249d14b2d8f1057d49b0d4b52d1a704f85ec432081e2f: Status 404 returned error can't find the container with id 381aa23985e3db10dd6249d14b2d8f1057d49b0d4b52d1a704f85ec432081e2f
Mar 20 09:00:30.577455 master-0 kubenswrapper[18707]: I0320 09:00:30.577389 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-q2cgp" event={"ID":"f55e2401-1fc9-4558-8ae1-5f539ee0a9cd","Type":"ContainerStarted","Data":"381aa23985e3db10dd6249d14b2d8f1057d49b0d4b52d1a704f85ec432081e2f"}
Mar 20 09:00:30.578913 master-0 kubenswrapper[18707]: I0320 09:00:30.578853 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-mtx97" event={"ID":"75207b95-0950-4bc9-b345-698cf862d583","Type":"ContainerStarted","Data":"e045209cbf29fb74794b8d5e9ab99d51605d76161a2d042809dcc88f83901c4b"}
Mar 20 09:00:30.580136 master-0 kubenswrapper[18707]: I0320 09:00:30.580084 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-kkc9m" event={"ID":"e0b5f634-6dd3-4d5f-bbe5-0ec6269892c0","Type":"ContainerStarted","Data":"e6758bae9525e89489db03177538b1d6649cbb9a9191f86854346bc93c219fa8"}
Mar 20 09:00:30.580297 master-0 kubenswrapper[18707]: I0320 09:00:30.580270 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-kkc9m"
Mar 20 09:00:30.598235 master-0 kubenswrapper[18707]: I0320 09:00:30.598099 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-mtx97" podStartSLOduration=1.612926079 podStartE2EDuration="5.598068988s" podCreationTimestamp="2026-03-20 09:00:25 +0000 UTC" firstStartedPulling="2026-03-20 09:00:26.327785911 +0000 UTC m=+1171.483966267" lastFinishedPulling="2026-03-20 09:00:30.31292882 +0000 UTC m=+1175.469109176" observedRunningTime="2026-03-20 09:00:30.594587088 +0000 UTC m=+1175.750767444" watchObservedRunningTime="2026-03-20 09:00:30.598068988 +0000 UTC m=+1175.754249344"
Mar 20 09:00:30.666581 master-0 kubenswrapper[18707]: I0320 09:00:30.666466 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-kkc9m" podStartSLOduration=1.417437961 podStartE2EDuration="7.665084863s" podCreationTimestamp="2026-03-20 09:00:23 +0000 UTC" firstStartedPulling="2026-03-20 09:00:24.03947484 +0000 UTC m=+1169.195655196" lastFinishedPulling="2026-03-20 09:00:30.287121752 +0000 UTC m=+1175.443302098" observedRunningTime="2026-03-20 09:00:30.643488366 +0000 UTC m=+1175.799668732" watchObservedRunningTime="2026-03-20 09:00:30.665084863 +0000 UTC m=+1175.821265219"
Mar 20 09:00:34.618145 master-0 kubenswrapper[18707]: I0320 09:00:34.618032 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-q2cgp" event={"ID":"f55e2401-1fc9-4558-8ae1-5f539ee0a9cd","Type":"ContainerStarted","Data":"0390b3eb5b6d97d2dcd5386ccea18a50fc962a05084b42e4f00f1723b6f7d7ee"}
Mar 20 09:00:34.657273 master-0 kubenswrapper[18707]: I0320 09:00:34.657101 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-q2cgp" podStartSLOduration=4.048566171 podStartE2EDuration="7.657076988s" podCreationTimestamp="2026-03-20 09:00:27 +0000 UTC" firstStartedPulling="2026-03-20 09:00:30.202776292 +0000 UTC m=+1175.358956648" lastFinishedPulling="2026-03-20 09:00:33.811287109 +0000 UTC m=+1178.967467465" observedRunningTime="2026-03-20 09:00:34.650363396 +0000 UTC m=+1179.806543772" watchObservedRunningTime="2026-03-20 09:00:34.657076988 +0000 UTC m=+1179.813257354"
Mar 20 09:00:34.840211 master-0 kubenswrapper[18707]: I0320 09:00:34.839258 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-6ncn2"]
Mar 20 09:00:34.840457 master-0 kubenswrapper[18707]: I0320 09:00:34.840417 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-6ncn2"
Mar 20 09:00:34.900276 master-0 kubenswrapper[18707]: I0320 09:00:34.899584 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-6ncn2"]
Mar 20 09:00:34.933211 master-0 kubenswrapper[18707]: I0320 09:00:34.926440 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db657c8f-e1b7-4756-8a11-f0d5241579ee-bound-sa-token\") pod \"cert-manager-545d4d4674-6ncn2\" (UID: \"db657c8f-e1b7-4756-8a11-f0d5241579ee\") " pod="cert-manager/cert-manager-545d4d4674-6ncn2"
Mar 20 09:00:34.933211 master-0 kubenswrapper[18707]: I0320 09:00:34.926514 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpckt\" (UniqueName: \"kubernetes.io/projected/db657c8f-e1b7-4756-8a11-f0d5241579ee-kube-api-access-hpckt\") pod \"cert-manager-545d4d4674-6ncn2\" (UID: \"db657c8f-e1b7-4756-8a11-f0d5241579ee\") " pod="cert-manager/cert-manager-545d4d4674-6ncn2"
Mar 20 09:00:35.033691 master-0 kubenswrapper[18707]: I0320 09:00:35.033559 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpckt\" (UniqueName: \"kubernetes.io/projected/db657c8f-e1b7-4756-8a11-f0d5241579ee-kube-api-access-hpckt\") pod \"cert-manager-545d4d4674-6ncn2\" (UID: \"db657c8f-e1b7-4756-8a11-f0d5241579ee\") " pod="cert-manager/cert-manager-545d4d4674-6ncn2"
Mar 20 09:00:35.033691 master-0 kubenswrapper[18707]: I0320 09:00:35.033694 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db657c8f-e1b7-4756-8a11-f0d5241579ee-bound-sa-token\") pod \"cert-manager-545d4d4674-6ncn2\" (UID: \"db657c8f-e1b7-4756-8a11-f0d5241579ee\") " pod="cert-manager/cert-manager-545d4d4674-6ncn2"
Mar 20 09:00:35.071343 master-0 kubenswrapper[18707]: I0320 09:00:35.071271 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpckt\" (UniqueName: \"kubernetes.io/projected/db657c8f-e1b7-4756-8a11-f0d5241579ee-kube-api-access-hpckt\") pod \"cert-manager-545d4d4674-6ncn2\" (UID: \"db657c8f-e1b7-4756-8a11-f0d5241579ee\") " pod="cert-manager/cert-manager-545d4d4674-6ncn2"
Mar 20 09:00:35.081215 master-0 kubenswrapper[18707]: I0320 09:00:35.078174 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db657c8f-e1b7-4756-8a11-f0d5241579ee-bound-sa-token\") pod \"cert-manager-545d4d4674-6ncn2\" (UID: \"db657c8f-e1b7-4756-8a11-f0d5241579ee\") " pod="cert-manager/cert-manager-545d4d4674-6ncn2"
Mar 20 09:00:35.256253 master-0 kubenswrapper[18707]: I0320 09:00:35.256060 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-6ncn2"
Mar 20 09:00:35.267100 master-0 kubenswrapper[18707]: I0320 09:00:35.267039 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-64ff997757-xgs48"]
Mar 20 09:00:35.269040 master-0 kubenswrapper[18707]: I0320 09:00:35.268996 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48"
Mar 20 09:00:35.271166 master-0 kubenswrapper[18707]: I0320 09:00:35.271103 18707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 20 09:00:35.271347 master-0 kubenswrapper[18707]: I0320 09:00:35.271274 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 20 09:00:35.271412 master-0 kubenswrapper[18707]: I0320 09:00:35.271369 18707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 20 09:00:35.271412 master-0 kubenswrapper[18707]: I0320 09:00:35.271406 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 20 09:00:35.287076 master-0 kubenswrapper[18707]: I0320 09:00:35.287008 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-64ff997757-xgs48"]
Mar 20 09:00:35.353288 master-0 kubenswrapper[18707]: I0320 09:00:35.351741 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pvsb\" (UniqueName: \"kubernetes.io/projected/e0736ead-ff56-4b8e-b654-e8b3f5d1f702-kube-api-access-7pvsb\") pod \"metallb-operator-controller-manager-64ff997757-xgs48\" (UID: \"e0736ead-ff56-4b8e-b654-e8b3f5d1f702\") " pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48"
Mar 20 09:00:35.353288 master-0 kubenswrapper[18707]: I0320 09:00:35.351947 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0736ead-ff56-4b8e-b654-e8b3f5d1f702-webhook-cert\") pod \"metallb-operator-controller-manager-64ff997757-xgs48\" (UID: \"e0736ead-ff56-4b8e-b654-e8b3f5d1f702\") " pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48"
Mar 20 09:00:35.353288 master-0 kubenswrapper[18707]: I0320 09:00:35.352068 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0736ead-ff56-4b8e-b654-e8b3f5d1f702-apiservice-cert\") pod \"metallb-operator-controller-manager-64ff997757-xgs48\" (UID: \"e0736ead-ff56-4b8e-b654-e8b3f5d1f702\") " pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48"
Mar 20 09:00:35.463433 master-0 kubenswrapper[18707]: I0320 09:00:35.463257 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0736ead-ff56-4b8e-b654-e8b3f5d1f702-webhook-cert\") pod \"metallb-operator-controller-manager-64ff997757-xgs48\" (UID: \"e0736ead-ff56-4b8e-b654-e8b3f5d1f702\") " pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48"
Mar 20 09:00:35.463433 master-0 kubenswrapper[18707]: I0320 09:00:35.463341 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0736ead-ff56-4b8e-b654-e8b3f5d1f702-apiservice-cert\") pod \"metallb-operator-controller-manager-64ff997757-xgs48\" (UID: \"e0736ead-ff56-4b8e-b654-e8b3f5d1f702\") " pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48"
Mar 20 09:00:35.463433 master-0 kubenswrapper[18707]: I0320 09:00:35.463406 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pvsb\" (UniqueName: \"kubernetes.io/projected/e0736ead-ff56-4b8e-b654-e8b3f5d1f702-kube-api-access-7pvsb\") pod \"metallb-operator-controller-manager-64ff997757-xgs48\" (UID: \"e0736ead-ff56-4b8e-b654-e8b3f5d1f702\") " pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48"
Mar 20 09:00:35.477227 master-0 kubenswrapper[18707]: I0320 09:00:35.473967 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0736ead-ff56-4b8e-b654-e8b3f5d1f702-webhook-cert\") pod \"metallb-operator-controller-manager-64ff997757-xgs48\" (UID: \"e0736ead-ff56-4b8e-b654-e8b3f5d1f702\") " pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48"
Mar 20 09:00:35.477227 master-0 kubenswrapper[18707]: I0320 09:00:35.474916 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0736ead-ff56-4b8e-b654-e8b3f5d1f702-apiservice-cert\") pod \"metallb-operator-controller-manager-64ff997757-xgs48\" (UID: \"e0736ead-ff56-4b8e-b654-e8b3f5d1f702\") " pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48"
Mar 20 09:00:35.493098 master-0 kubenswrapper[18707]: I0320 09:00:35.492824 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pvsb\" (UniqueName: \"kubernetes.io/projected/e0736ead-ff56-4b8e-b654-e8b3f5d1f702-kube-api-access-7pvsb\") pod \"metallb-operator-controller-manager-64ff997757-xgs48\" (UID: \"e0736ead-ff56-4b8e-b654-e8b3f5d1f702\") " pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48"
Mar 20 09:00:35.657852 master-0 kubenswrapper[18707]: I0320 09:00:35.657815 18707 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48" Mar 20 09:00:35.949234 master-0 kubenswrapper[18707]: I0320 09:00:35.949160 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-6ncn2"] Mar 20 09:00:36.063884 master-0 kubenswrapper[18707]: I0320 09:00:36.063546 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn"] Mar 20 09:00:36.085888 master-0 kubenswrapper[18707]: I0320 09:00:36.085810 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" Mar 20 09:00:36.095972 master-0 kubenswrapper[18707]: I0320 09:00:36.093438 18707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 09:00:36.095972 master-0 kubenswrapper[18707]: I0320 09:00:36.093644 18707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 09:00:36.096960 master-0 kubenswrapper[18707]: I0320 09:00:36.096905 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn"] Mar 20 09:00:36.209267 master-0 kubenswrapper[18707]: I0320 09:00:36.209112 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc7b6\" (UniqueName: \"kubernetes.io/projected/e786a96c-c80a-47c2-aa44-1e0666388caa-kube-api-access-hc7b6\") pod \"metallb-operator-webhook-server-6dc87c8bf8-srnwn\" (UID: \"e786a96c-c80a-47c2-aa44-1e0666388caa\") " pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" Mar 20 09:00:36.209483 master-0 kubenswrapper[18707]: I0320 09:00:36.209291 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/e786a96c-c80a-47c2-aa44-1e0666388caa-webhook-cert\") pod \"metallb-operator-webhook-server-6dc87c8bf8-srnwn\" (UID: \"e786a96c-c80a-47c2-aa44-1e0666388caa\") " pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" Mar 20 09:00:36.209483 master-0 kubenswrapper[18707]: I0320 09:00:36.209355 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e786a96c-c80a-47c2-aa44-1e0666388caa-apiservice-cert\") pod \"metallb-operator-webhook-server-6dc87c8bf8-srnwn\" (UID: \"e786a96c-c80a-47c2-aa44-1e0666388caa\") " pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" Mar 20 09:00:36.313207 master-0 kubenswrapper[18707]: I0320 09:00:36.311025 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc7b6\" (UniqueName: \"kubernetes.io/projected/e786a96c-c80a-47c2-aa44-1e0666388caa-kube-api-access-hc7b6\") pod \"metallb-operator-webhook-server-6dc87c8bf8-srnwn\" (UID: \"e786a96c-c80a-47c2-aa44-1e0666388caa\") " pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" Mar 20 09:00:36.313207 master-0 kubenswrapper[18707]: I0320 09:00:36.311145 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e786a96c-c80a-47c2-aa44-1e0666388caa-webhook-cert\") pod \"metallb-operator-webhook-server-6dc87c8bf8-srnwn\" (UID: \"e786a96c-c80a-47c2-aa44-1e0666388caa\") " pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" Mar 20 09:00:36.313207 master-0 kubenswrapper[18707]: I0320 09:00:36.311210 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e786a96c-c80a-47c2-aa44-1e0666388caa-apiservice-cert\") pod \"metallb-operator-webhook-server-6dc87c8bf8-srnwn\" (UID: \"e786a96c-c80a-47c2-aa44-1e0666388caa\") " 
pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" Mar 20 09:00:36.327310 master-0 kubenswrapper[18707]: I0320 09:00:36.327257 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e786a96c-c80a-47c2-aa44-1e0666388caa-webhook-cert\") pod \"metallb-operator-webhook-server-6dc87c8bf8-srnwn\" (UID: \"e786a96c-c80a-47c2-aa44-1e0666388caa\") " pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" Mar 20 09:00:36.340402 master-0 kubenswrapper[18707]: I0320 09:00:36.340233 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc7b6\" (UniqueName: \"kubernetes.io/projected/e786a96c-c80a-47c2-aa44-1e0666388caa-kube-api-access-hc7b6\") pod \"metallb-operator-webhook-server-6dc87c8bf8-srnwn\" (UID: \"e786a96c-c80a-47c2-aa44-1e0666388caa\") " pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" Mar 20 09:00:36.340601 master-0 kubenswrapper[18707]: I0320 09:00:36.340572 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e786a96c-c80a-47c2-aa44-1e0666388caa-apiservice-cert\") pod \"metallb-operator-webhook-server-6dc87c8bf8-srnwn\" (UID: \"e786a96c-c80a-47c2-aa44-1e0666388caa\") " pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" Mar 20 09:00:36.461537 master-0 kubenswrapper[18707]: I0320 09:00:36.461423 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-64ff997757-xgs48"] Mar 20 09:00:36.524244 master-0 kubenswrapper[18707]: I0320 09:00:36.517588 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" Mar 20 09:00:36.744270 master-0 kubenswrapper[18707]: I0320 09:00:36.743651 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-6ncn2" event={"ID":"db657c8f-e1b7-4756-8a11-f0d5241579ee","Type":"ContainerStarted","Data":"7eeacc041baa182c4a3df5e6c4c401eac403d0b806cf8fe137e9d687672ffa6e"} Mar 20 09:00:36.744270 master-0 kubenswrapper[18707]: I0320 09:00:36.743702 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-6ncn2" event={"ID":"db657c8f-e1b7-4756-8a11-f0d5241579ee","Type":"ContainerStarted","Data":"db8c5d1b1ea2ee2f47c492ea67547315503d63c6d391700f0658176caecf58a0"} Mar 20 09:00:36.807060 master-0 kubenswrapper[18707]: I0320 09:00:36.806966 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-6ncn2" podStartSLOduration=2.806944122 podStartE2EDuration="2.806944122s" podCreationTimestamp="2026-03-20 09:00:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:00:36.786428636 +0000 UTC m=+1181.942608992" watchObservedRunningTime="2026-03-20 09:00:36.806944122 +0000 UTC m=+1181.963124478" Mar 20 09:00:36.807517 master-0 kubenswrapper[18707]: I0320 09:00:36.807472 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48" event={"ID":"e0736ead-ff56-4b8e-b654-e8b3f5d1f702","Type":"ContainerStarted","Data":"05d172d3941ed9db78d59b2311c0701da0744e45bfb42e3464bcd8e627cf5487"} Mar 20 09:00:37.173206 master-0 kubenswrapper[18707]: I0320 09:00:37.170641 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn"] Mar 20 09:00:37.822378 master-0 kubenswrapper[18707]: I0320 09:00:37.822303 18707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" event={"ID":"e786a96c-c80a-47c2-aa44-1e0666388caa","Type":"ContainerStarted","Data":"d64ffd9738c4c664bd1536460debc904e82e81c6b08ec8da4e3fa2644970d04c"} Mar 20 09:00:38.663665 master-0 kubenswrapper[18707]: I0320 09:00:38.663600 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-kkc9m" Mar 20 09:00:42.901708 master-0 kubenswrapper[18707]: I0320 09:00:42.901634 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48" event={"ID":"e0736ead-ff56-4b8e-b654-e8b3f5d1f702","Type":"ContainerStarted","Data":"3d2f0c99734566cf7cfee7855e2c1019b1aa135fa80b03a461c0c6e99f8f3a81"} Mar 20 09:00:42.902409 master-0 kubenswrapper[18707]: I0320 09:00:42.902305 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48" Mar 20 09:00:44.646926 master-0 kubenswrapper[18707]: I0320 09:00:44.646849 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48" podStartSLOduration=4.322846517 podStartE2EDuration="9.646831536s" podCreationTimestamp="2026-03-20 09:00:35 +0000 UTC" firstStartedPulling="2026-03-20 09:00:36.496336217 +0000 UTC m=+1181.652516573" lastFinishedPulling="2026-03-20 09:00:41.820321236 +0000 UTC m=+1186.976501592" observedRunningTime="2026-03-20 09:00:42.930783108 +0000 UTC m=+1188.086963544" watchObservedRunningTime="2026-03-20 09:00:44.646831536 +0000 UTC m=+1189.803011892" Mar 20 09:00:44.649682 master-0 kubenswrapper[18707]: I0320 09:00:44.649647 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-wvbr6"] Mar 20 09:00:44.650695 master-0 kubenswrapper[18707]: I0320 09:00:44.650668 
18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-wvbr6" Mar 20 09:00:44.652687 master-0 kubenswrapper[18707]: I0320 09:00:44.652644 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 20 09:00:44.653436 master-0 kubenswrapper[18707]: I0320 09:00:44.653405 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 20 09:00:44.666592 master-0 kubenswrapper[18707]: I0320 09:00:44.666529 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-wvbr6"] Mar 20 09:00:44.776689 master-0 kubenswrapper[18707]: I0320 09:00:44.776615 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk7hv\" (UniqueName: \"kubernetes.io/projected/721bf766-bb8f-406d-aed5-6cbafbfd59d5-kube-api-access-rk7hv\") pod \"obo-prometheus-operator-8ff7d675-wvbr6\" (UID: \"721bf766-bb8f-406d-aed5-6cbafbfd59d5\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-wvbr6" Mar 20 09:00:44.881208 master-0 kubenswrapper[18707]: I0320 09:00:44.880361 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk7hv\" (UniqueName: \"kubernetes.io/projected/721bf766-bb8f-406d-aed5-6cbafbfd59d5-kube-api-access-rk7hv\") pod \"obo-prometheus-operator-8ff7d675-wvbr6\" (UID: \"721bf766-bb8f-406d-aed5-6cbafbfd59d5\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-wvbr6" Mar 20 09:00:44.915288 master-0 kubenswrapper[18707]: I0320 09:00:44.913896 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk7hv\" (UniqueName: \"kubernetes.io/projected/721bf766-bb8f-406d-aed5-6cbafbfd59d5-kube-api-access-rk7hv\") pod \"obo-prometheus-operator-8ff7d675-wvbr6\" (UID: 
\"721bf766-bb8f-406d-aed5-6cbafbfd59d5\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-wvbr6" Mar 20 09:00:44.970587 master-0 kubenswrapper[18707]: I0320 09:00:44.969672 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-wvbr6" Mar 20 09:00:44.972046 master-0 kubenswrapper[18707]: I0320 09:00:44.971088 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" event={"ID":"e786a96c-c80a-47c2-aa44-1e0666388caa","Type":"ContainerStarted","Data":"d17a9d4702f64ca3832df0c93d742f42686495a03ae9497c1734cc7511d65521"} Mar 20 09:00:44.972573 master-0 kubenswrapper[18707]: I0320 09:00:44.972536 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" Mar 20 09:00:45.919072 master-0 kubenswrapper[18707]: I0320 09:00:45.918990 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" podStartSLOduration=3.839127705 podStartE2EDuration="10.918968759s" podCreationTimestamp="2026-03-20 09:00:35 +0000 UTC" firstStartedPulling="2026-03-20 09:00:37.153438054 +0000 UTC m=+1182.309618410" lastFinishedPulling="2026-03-20 09:00:44.233279108 +0000 UTC m=+1189.389459464" observedRunningTime="2026-03-20 09:00:45.070634356 +0000 UTC m=+1190.226814742" watchObservedRunningTime="2026-03-20 09:00:45.918968759 +0000 UTC m=+1191.075149115" Mar 20 09:00:45.924054 master-0 kubenswrapper[18707]: I0320 09:00:45.923985 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-wvbr6"] Mar 20 09:00:45.987212 master-0 kubenswrapper[18707]: I0320 09:00:45.983518 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-wvbr6" 
event={"ID":"721bf766-bb8f-406d-aed5-6cbafbfd59d5","Type":"ContainerStarted","Data":"f9969726916bd142ec37c7a56938c95d6c91c935a8b3d66d88306afaec676d9c"} Mar 20 09:00:46.373504 master-0 kubenswrapper[18707]: I0320 09:00:46.373447 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6"] Mar 20 09:00:46.374491 master-0 kubenswrapper[18707]: I0320 09:00:46.374472 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6" Mar 20 09:00:46.381242 master-0 kubenswrapper[18707]: I0320 09:00:46.381175 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 20 09:00:46.411297 master-0 kubenswrapper[18707]: I0320 09:00:46.411241 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f"] Mar 20 09:00:46.412547 master-0 kubenswrapper[18707]: I0320 09:00:46.412517 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f" Mar 20 09:00:46.416977 master-0 kubenswrapper[18707]: I0320 09:00:46.416919 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6"] Mar 20 09:00:46.432763 master-0 kubenswrapper[18707]: I0320 09:00:46.432681 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f"] Mar 20 09:00:46.471122 master-0 kubenswrapper[18707]: I0320 09:00:46.470532 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6b49f6fb-869f-439c-b02c-d0ebf71bf3be-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6\" (UID: \"6b49f6fb-869f-439c-b02c-d0ebf71bf3be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6" Mar 20 09:00:46.471122 master-0 kubenswrapper[18707]: I0320 09:00:46.470618 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6b49f6fb-869f-439c-b02c-d0ebf71bf3be-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6\" (UID: \"6b49f6fb-869f-439c-b02c-d0ebf71bf3be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6" Mar 20 09:00:46.574117 master-0 kubenswrapper[18707]: I0320 09:00:46.572235 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6b49f6fb-869f-439c-b02c-d0ebf71bf3be-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6\" (UID: \"6b49f6fb-869f-439c-b02c-d0ebf71bf3be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6" Mar 
20 09:00:46.574117 master-0 kubenswrapper[18707]: I0320 09:00:46.572311 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5164b263-6a6f-4156-bf3c-cd9e58bde58d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f\" (UID: \"5164b263-6a6f-4156-bf3c-cd9e58bde58d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f" Mar 20 09:00:46.574117 master-0 kubenswrapper[18707]: I0320 09:00:46.572346 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6b49f6fb-869f-439c-b02c-d0ebf71bf3be-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6\" (UID: \"6b49f6fb-869f-439c-b02c-d0ebf71bf3be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6" Mar 20 09:00:46.574117 master-0 kubenswrapper[18707]: I0320 09:00:46.572368 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5164b263-6a6f-4156-bf3c-cd9e58bde58d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f\" (UID: \"5164b263-6a6f-4156-bf3c-cd9e58bde58d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f" Mar 20 09:00:46.583114 master-0 kubenswrapper[18707]: I0320 09:00:46.582857 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6b49f6fb-869f-439c-b02c-d0ebf71bf3be-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6\" (UID: \"6b49f6fb-869f-439c-b02c-d0ebf71bf3be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6" Mar 20 09:00:46.583114 master-0 kubenswrapper[18707]: I0320 09:00:46.582973 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6b49f6fb-869f-439c-b02c-d0ebf71bf3be-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6\" (UID: \"6b49f6fb-869f-439c-b02c-d0ebf71bf3be\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6" Mar 20 09:00:46.674241 master-0 kubenswrapper[18707]: I0320 09:00:46.674082 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5164b263-6a6f-4156-bf3c-cd9e58bde58d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f\" (UID: \"5164b263-6a6f-4156-bf3c-cd9e58bde58d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f" Mar 20 09:00:46.674241 master-0 kubenswrapper[18707]: I0320 09:00:46.674153 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5164b263-6a6f-4156-bf3c-cd9e58bde58d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f\" (UID: \"5164b263-6a6f-4156-bf3c-cd9e58bde58d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f" Mar 20 09:00:46.677927 master-0 kubenswrapper[18707]: I0320 09:00:46.677887 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5164b263-6a6f-4156-bf3c-cd9e58bde58d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f\" (UID: \"5164b263-6a6f-4156-bf3c-cd9e58bde58d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f" Mar 20 09:00:46.678376 master-0 kubenswrapper[18707]: I0320 09:00:46.678345 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/5164b263-6a6f-4156-bf3c-cd9e58bde58d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f\" (UID: \"5164b263-6a6f-4156-bf3c-cd9e58bde58d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f" Mar 20 09:00:46.694388 master-0 kubenswrapper[18707]: I0320 09:00:46.694315 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6" Mar 20 09:00:46.789219 master-0 kubenswrapper[18707]: I0320 09:00:46.788303 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f" Mar 20 09:00:46.825668 master-0 kubenswrapper[18707]: I0320 09:00:46.825153 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-m27sp"] Mar 20 09:00:46.826374 master-0 kubenswrapper[18707]: I0320 09:00:46.826341 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-m27sp" Mar 20 09:00:46.830274 master-0 kubenswrapper[18707]: I0320 09:00:46.829359 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 20 09:00:46.849976 master-0 kubenswrapper[18707]: I0320 09:00:46.849923 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-m27sp"] Mar 20 09:00:46.886328 master-0 kubenswrapper[18707]: I0320 09:00:46.886235 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk5p8\" (UniqueName: \"kubernetes.io/projected/5dd92c71-273e-4b7e-9a4d-54596a91ad38-kube-api-access-fk5p8\") pod \"observability-operator-6dd7dd855f-m27sp\" (UID: \"5dd92c71-273e-4b7e-9a4d-54596a91ad38\") " pod="openshift-operators/observability-operator-6dd7dd855f-m27sp" Mar 20 09:00:46.886532 master-0 kubenswrapper[18707]: I0320 09:00:46.886486 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5dd92c71-273e-4b7e-9a4d-54596a91ad38-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-m27sp\" (UID: \"5dd92c71-273e-4b7e-9a4d-54596a91ad38\") " pod="openshift-operators/observability-operator-6dd7dd855f-m27sp" Mar 20 09:00:46.988441 master-0 kubenswrapper[18707]: I0320 09:00:46.988340 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk5p8\" (UniqueName: \"kubernetes.io/projected/5dd92c71-273e-4b7e-9a4d-54596a91ad38-kube-api-access-fk5p8\") pod \"observability-operator-6dd7dd855f-m27sp\" (UID: \"5dd92c71-273e-4b7e-9a4d-54596a91ad38\") " pod="openshift-operators/observability-operator-6dd7dd855f-m27sp" Mar 20 09:00:46.988995 master-0 kubenswrapper[18707]: I0320 09:00:46.988683 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5dd92c71-273e-4b7e-9a4d-54596a91ad38-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-m27sp\" (UID: \"5dd92c71-273e-4b7e-9a4d-54596a91ad38\") " pod="openshift-operators/observability-operator-6dd7dd855f-m27sp" Mar 20 09:00:46.997611 master-0 kubenswrapper[18707]: I0320 09:00:46.994099 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5dd92c71-273e-4b7e-9a4d-54596a91ad38-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-m27sp\" (UID: \"5dd92c71-273e-4b7e-9a4d-54596a91ad38\") " pod="openshift-operators/observability-operator-6dd7dd855f-m27sp" Mar 20 09:00:47.038209 master-0 kubenswrapper[18707]: I0320 09:00:47.028315 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk5p8\" (UniqueName: \"kubernetes.io/projected/5dd92c71-273e-4b7e-9a4d-54596a91ad38-kube-api-access-fk5p8\") pod \"observability-operator-6dd7dd855f-m27sp\" (UID: \"5dd92c71-273e-4b7e-9a4d-54596a91ad38\") " pod="openshift-operators/observability-operator-6dd7dd855f-m27sp" Mar 20 09:00:47.178907 master-0 kubenswrapper[18707]: I0320 09:00:47.173573 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-m27sp" Mar 20 09:00:47.356870 master-0 kubenswrapper[18707]: I0320 09:00:47.356814 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6"] Mar 20 09:00:47.484976 master-0 kubenswrapper[18707]: I0320 09:00:47.484365 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f"] Mar 20 09:00:47.518674 master-0 kubenswrapper[18707]: I0320 09:00:47.518642 18707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:00:47.554126 master-0 kubenswrapper[18707]: I0320 09:00:47.554055 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-777f7bbbbb-d5qjm"] Mar 20 09:00:47.555667 master-0 kubenswrapper[18707]: I0320 09:00:47.555630 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:00:47.558597 master-0 kubenswrapper[18707]: I0320 09:00:47.558383 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 20 09:00:47.590200 master-0 kubenswrapper[18707]: I0320 09:00:47.581726 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-777f7bbbbb-d5qjm"] Mar 20 09:00:47.656485 master-0 kubenswrapper[18707]: W0320 09:00:47.656234 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dd92c71_273e_4b7e_9a4d_54596a91ad38.slice/crio-4cfcbfd38158b06ba629e2a5958594ace10006598fd9a9d6db070a977072dace WatchSource:0}: Error finding container 4cfcbfd38158b06ba629e2a5958594ace10006598fd9a9d6db070a977072dace: Status 404 returned error can't find the container with id 4cfcbfd38158b06ba629e2a5958594ace10006598fd9a9d6db070a977072dace Mar 20 09:00:47.673635 master-0 kubenswrapper[18707]: I0320 09:00:47.673513 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-m27sp"] Mar 20 09:00:47.718346 master-0 kubenswrapper[18707]: I0320 09:00:47.718267 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a76b5038-ca74-400b-b020-1e951912b195-webhook-cert\") pod \"perses-operator-777f7bbbbb-d5qjm\" (UID: \"a76b5038-ca74-400b-b020-1e951912b195\") " pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:00:47.718346 master-0 kubenswrapper[18707]: I0320 09:00:47.718342 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqdg6\" (UniqueName: \"kubernetes.io/projected/a76b5038-ca74-400b-b020-1e951912b195-kube-api-access-jqdg6\") pod 
\"perses-operator-777f7bbbbb-d5qjm\" (UID: \"a76b5038-ca74-400b-b020-1e951912b195\") " pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:00:47.718606 master-0 kubenswrapper[18707]: I0320 09:00:47.718479 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a76b5038-ca74-400b-b020-1e951912b195-openshift-service-ca\") pod \"perses-operator-777f7bbbbb-d5qjm\" (UID: \"a76b5038-ca74-400b-b020-1e951912b195\") " pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:00:47.718606 master-0 kubenswrapper[18707]: I0320 09:00:47.718510 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a76b5038-ca74-400b-b020-1e951912b195-apiservice-cert\") pod \"perses-operator-777f7bbbbb-d5qjm\" (UID: \"a76b5038-ca74-400b-b020-1e951912b195\") " pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:00:47.819818 master-0 kubenswrapper[18707]: I0320 09:00:47.819739 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a76b5038-ca74-400b-b020-1e951912b195-webhook-cert\") pod \"perses-operator-777f7bbbbb-d5qjm\" (UID: \"a76b5038-ca74-400b-b020-1e951912b195\") " pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:00:47.819818 master-0 kubenswrapper[18707]: I0320 09:00:47.819817 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqdg6\" (UniqueName: \"kubernetes.io/projected/a76b5038-ca74-400b-b020-1e951912b195-kube-api-access-jqdg6\") pod \"perses-operator-777f7bbbbb-d5qjm\" (UID: \"a76b5038-ca74-400b-b020-1e951912b195\") " pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:00:47.820164 master-0 kubenswrapper[18707]: I0320 09:00:47.819929 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a76b5038-ca74-400b-b020-1e951912b195-openshift-service-ca\") pod \"perses-operator-777f7bbbbb-d5qjm\" (UID: \"a76b5038-ca74-400b-b020-1e951912b195\") " pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:00:47.820164 master-0 kubenswrapper[18707]: I0320 09:00:47.819978 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a76b5038-ca74-400b-b020-1e951912b195-apiservice-cert\") pod \"perses-operator-777f7bbbbb-d5qjm\" (UID: \"a76b5038-ca74-400b-b020-1e951912b195\") " pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:00:47.824979 master-0 kubenswrapper[18707]: I0320 09:00:47.824657 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a76b5038-ca74-400b-b020-1e951912b195-openshift-service-ca\") pod \"perses-operator-777f7bbbbb-d5qjm\" (UID: \"a76b5038-ca74-400b-b020-1e951912b195\") " pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:00:47.824979 master-0 kubenswrapper[18707]: I0320 09:00:47.824904 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a76b5038-ca74-400b-b020-1e951912b195-webhook-cert\") pod \"perses-operator-777f7bbbbb-d5qjm\" (UID: \"a76b5038-ca74-400b-b020-1e951912b195\") " pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:00:47.832220 master-0 kubenswrapper[18707]: I0320 09:00:47.826823 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a76b5038-ca74-400b-b020-1e951912b195-apiservice-cert\") pod \"perses-operator-777f7bbbbb-d5qjm\" (UID: \"a76b5038-ca74-400b-b020-1e951912b195\") " pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 
09:00:47.861268 master-0 kubenswrapper[18707]: I0320 09:00:47.861114 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqdg6\" (UniqueName: \"kubernetes.io/projected/a76b5038-ca74-400b-b020-1e951912b195-kube-api-access-jqdg6\") pod \"perses-operator-777f7bbbbb-d5qjm\" (UID: \"a76b5038-ca74-400b-b020-1e951912b195\") " pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:00:47.896974 master-0 kubenswrapper[18707]: I0320 09:00:47.896924 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:00:48.028813 master-0 kubenswrapper[18707]: I0320 09:00:48.028730 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f" event={"ID":"5164b263-6a6f-4156-bf3c-cd9e58bde58d","Type":"ContainerStarted","Data":"76d0b6825effa1f34fb70b41981c9114941eadae6b8c7eb0acfb1a0160a6b849"} Mar 20 09:00:48.030265 master-0 kubenswrapper[18707]: I0320 09:00:48.030152 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6" event={"ID":"6b49f6fb-869f-439c-b02c-d0ebf71bf3be","Type":"ContainerStarted","Data":"5ba412686c4e3ed622b6d67984bd4088eec285d1c2a4933f4e81ea710b06a72b"} Mar 20 09:00:48.033437 master-0 kubenswrapper[18707]: I0320 09:00:48.033156 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-m27sp" event={"ID":"5dd92c71-273e-4b7e-9a4d-54596a91ad38","Type":"ContainerStarted","Data":"4cfcbfd38158b06ba629e2a5958594ace10006598fd9a9d6db070a977072dace"} Mar 20 09:00:48.397209 master-0 kubenswrapper[18707]: I0320 09:00:48.382072 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-777f7bbbbb-d5qjm"] Mar 20 09:00:48.397549 master-0 kubenswrapper[18707]: W0320 09:00:48.394351 18707 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda76b5038_ca74_400b_b020_1e951912b195.slice/crio-e26b287835a88fc7680f502cff43bcdee38abae72c1a8a4dfd63e18ec496e5fb WatchSource:0}: Error finding container e26b287835a88fc7680f502cff43bcdee38abae72c1a8a4dfd63e18ec496e5fb: Status 404 returned error can't find the container with id e26b287835a88fc7680f502cff43bcdee38abae72c1a8a4dfd63e18ec496e5fb Mar 20 09:00:49.058170 master-0 kubenswrapper[18707]: I0320 09:00:49.058048 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" event={"ID":"a76b5038-ca74-400b-b020-1e951912b195","Type":"ContainerStarted","Data":"e26b287835a88fc7680f502cff43bcdee38abae72c1a8a4dfd63e18ec496e5fb"} Mar 20 09:00:56.525332 master-0 kubenswrapper[18707]: I0320 09:00:56.524418 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6dc87c8bf8-srnwn" Mar 20 09:00:59.175030 master-0 kubenswrapper[18707]: I0320 09:00:59.174882 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f" event={"ID":"5164b263-6a6f-4156-bf3c-cd9e58bde58d","Type":"ContainerStarted","Data":"49fbcdc99852920c4f0e7484a54d1e01e3caee0838650250c1863267705077eb"} Mar 20 09:01:00.184693 master-0 kubenswrapper[18707]: I0320 09:01:00.184625 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6" event={"ID":"6b49f6fb-869f-439c-b02c-d0ebf71bf3be","Type":"ContainerStarted","Data":"ea42489ee053bb6bae9c932786d461d54b9872b68cabdd4afbb761f830a92008"} Mar 20 09:01:00.187554 master-0 kubenswrapper[18707]: I0320 09:01:00.187486 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-m27sp" 
event={"ID":"5dd92c71-273e-4b7e-9a4d-54596a91ad38","Type":"ContainerStarted","Data":"6f6d4432cf58bfbf4930055bd3f8fb4adf5e8a6b693ec4b11bcf03e7c7c91f2b"} Mar 20 09:01:00.188054 master-0 kubenswrapper[18707]: I0320 09:01:00.188009 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-m27sp" Mar 20 09:01:00.189492 master-0 kubenswrapper[18707]: I0320 09:01:00.189445 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-wvbr6" event={"ID":"721bf766-bb8f-406d-aed5-6cbafbfd59d5","Type":"ContainerStarted","Data":"3a7e1a3cf02fc077e0b5f5313808089f927c20928b617eee9b4771561429b368"} Mar 20 09:01:00.190285 master-0 kubenswrapper[18707]: I0320 09:01:00.190251 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-m27sp" Mar 20 09:01:00.193266 master-0 kubenswrapper[18707]: I0320 09:01:00.193223 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" event={"ID":"a76b5038-ca74-400b-b020-1e951912b195","Type":"ContainerStarted","Data":"3ed57f03cb2a0195f6584409146ef171b1720cb6ea31d974a37be186cd1f9f17"} Mar 20 09:01:00.193409 master-0 kubenswrapper[18707]: I0320 09:01:00.193344 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:01:01.219906 master-0 kubenswrapper[18707]: I0320 09:01:01.219792 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-ngfr6" podStartSLOduration=3.959434508 podStartE2EDuration="15.219766724s" podCreationTimestamp="2026-03-20 09:00:46 +0000 UTC" firstStartedPulling="2026-03-20 09:00:47.340758377 +0000 UTC m=+1192.496938733" lastFinishedPulling="2026-03-20 09:00:58.601090593 +0000 UTC m=+1203.757270949" 
observedRunningTime="2026-03-20 09:01:01.215162023 +0000 UTC m=+1206.371342399" watchObservedRunningTime="2026-03-20 09:01:01.219766724 +0000 UTC m=+1206.375947080" Mar 20 09:01:01.381008 master-0 kubenswrapper[18707]: I0320 09:01:01.380910 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6976bc9f87-xwq6f" podStartSLOduration=4.280418491 podStartE2EDuration="15.380881528s" podCreationTimestamp="2026-03-20 09:00:46 +0000 UTC" firstStartedPulling="2026-03-20 09:00:47.518553478 +0000 UTC m=+1192.674733834" lastFinishedPulling="2026-03-20 09:00:58.619016505 +0000 UTC m=+1203.775196871" observedRunningTime="2026-03-20 09:01:01.30744444 +0000 UTC m=+1206.463624806" watchObservedRunningTime="2026-03-20 09:01:01.380881528 +0000 UTC m=+1206.537061884" Mar 20 09:01:01.398795 master-0 kubenswrapper[18707]: I0320 09:01:01.396454 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-m27sp" podStartSLOduration=4.3300496 podStartE2EDuration="15.396428823s" podCreationTimestamp="2026-03-20 09:00:46 +0000 UTC" firstStartedPulling="2026-03-20 09:00:47.665884228 +0000 UTC m=+1192.822064584" lastFinishedPulling="2026-03-20 09:00:58.732263451 +0000 UTC m=+1203.888443807" observedRunningTime="2026-03-20 09:01:01.371659715 +0000 UTC m=+1206.527840071" watchObservedRunningTime="2026-03-20 09:01:01.396428823 +0000 UTC m=+1206.552609179" Mar 20 09:01:01.452218 master-0 kubenswrapper[18707]: I0320 09:01:01.442879 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-wvbr6" podStartSLOduration=4.783445234 podStartE2EDuration="17.442854689s" podCreationTimestamp="2026-03-20 09:00:44 +0000 UTC" firstStartedPulling="2026-03-20 09:00:45.94141449 +0000 UTC m=+1191.097594846" lastFinishedPulling="2026-03-20 09:00:58.600823945 +0000 UTC m=+1203.757004301" 
observedRunningTime="2026-03-20 09:01:01.429250991 +0000 UTC m=+1206.585431377" watchObservedRunningTime="2026-03-20 09:01:01.442854689 +0000 UTC m=+1206.599035045" Mar 20 09:01:01.491927 master-0 kubenswrapper[18707]: I0320 09:01:01.491752 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" podStartSLOduration=4.146102709 podStartE2EDuration="14.491728066s" podCreationTimestamp="2026-03-20 09:00:47 +0000 UTC" firstStartedPulling="2026-03-20 09:00:48.400376657 +0000 UTC m=+1193.556557013" lastFinishedPulling="2026-03-20 09:00:58.746002014 +0000 UTC m=+1203.902182370" observedRunningTime="2026-03-20 09:01:01.476372327 +0000 UTC m=+1206.632552683" watchObservedRunningTime="2026-03-20 09:01:01.491728066 +0000 UTC m=+1206.647908422" Mar 20 09:01:07.900325 master-0 kubenswrapper[18707]: I0320 09:01:07.900260 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-777f7bbbbb-d5qjm" Mar 20 09:01:15.665271 master-0 kubenswrapper[18707]: I0320 09:01:15.665177 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48" Mar 20 09:01:23.323325 master-0 kubenswrapper[18707]: I0320 09:01:23.323249 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p"] Mar 20 09:01:23.325799 master-0 kubenswrapper[18707]: I0320 09:01:23.325774 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p" Mar 20 09:01:23.337365 master-0 kubenswrapper[18707]: I0320 09:01:23.335150 18707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 09:01:23.343653 master-0 kubenswrapper[18707]: I0320 09:01:23.342513 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-pp5nf"] Mar 20 09:01:23.354961 master-0 kubenswrapper[18707]: I0320 09:01:23.350009 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.354961 master-0 kubenswrapper[18707]: I0320 09:01:23.353036 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 09:01:23.356585 master-0 kubenswrapper[18707]: I0320 09:01:23.356549 18707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 09:01:23.368328 master-0 kubenswrapper[18707]: I0320 09:01:23.367714 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/035717cf-e166-4721-a9a5-6a92b26bd5d3-frr-startup\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.368328 master-0 kubenswrapper[18707]: I0320 09:01:23.367841 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrz7l\" (UniqueName: \"kubernetes.io/projected/035717cf-e166-4721-a9a5-6a92b26bd5d3-kube-api-access-qrz7l\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.368328 master-0 kubenswrapper[18707]: I0320 09:01:23.367902 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/cc7582bc-b43b-4cf0-86d3-41529e4598b2-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-lrv7p\" (UID: \"cc7582bc-b43b-4cf0-86d3-41529e4598b2\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p" Mar 20 09:01:23.368328 master-0 kubenswrapper[18707]: I0320 09:01:23.367935 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035717cf-e166-4721-a9a5-6a92b26bd5d3-metrics-certs\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.368328 master-0 kubenswrapper[18707]: I0320 09:01:23.368022 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/035717cf-e166-4721-a9a5-6a92b26bd5d3-reloader\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.368328 master-0 kubenswrapper[18707]: I0320 09:01:23.368066 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/035717cf-e166-4721-a9a5-6a92b26bd5d3-frr-conf\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.368328 master-0 kubenswrapper[18707]: I0320 09:01:23.368098 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/035717cf-e166-4721-a9a5-6a92b26bd5d3-frr-sockets\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.368328 master-0 kubenswrapper[18707]: I0320 09:01:23.368135 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm6ns\" (UniqueName: 
\"kubernetes.io/projected/cc7582bc-b43b-4cf0-86d3-41529e4598b2-kube-api-access-hm6ns\") pod \"frr-k8s-webhook-server-bcc4b6f68-lrv7p\" (UID: \"cc7582bc-b43b-4cf0-86d3-41529e4598b2\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p" Mar 20 09:01:23.368328 master-0 kubenswrapper[18707]: I0320 09:01:23.368161 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/035717cf-e166-4721-a9a5-6a92b26bd5d3-metrics\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.370465 master-0 kubenswrapper[18707]: I0320 09:01:23.370411 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p"] Mar 20 09:01:23.421329 master-0 kubenswrapper[18707]: I0320 09:01:23.421271 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-6cndp"] Mar 20 09:01:23.422946 master-0 kubenswrapper[18707]: I0320 09:01:23.422930 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-6cndp" Mar 20 09:01:23.425110 master-0 kubenswrapper[18707]: I0320 09:01:23.425092 18707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 09:01:23.425528 master-0 kubenswrapper[18707]: I0320 09:01:23.425431 18707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 09:01:23.425528 master-0 kubenswrapper[18707]: I0320 09:01:23.425486 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 09:01:23.448591 master-0 kubenswrapper[18707]: I0320 09:01:23.447062 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-77wvd"] Mar 20 09:01:23.459314 master-0 kubenswrapper[18707]: I0320 09:01:23.458803 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-77wvd" Mar 20 09:01:23.462944 master-0 kubenswrapper[18707]: I0320 09:01:23.462396 18707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 09:01:23.467501 master-0 kubenswrapper[18707]: I0320 09:01:23.467461 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-77wvd"] Mar 20 09:01:23.476311 master-0 kubenswrapper[18707]: I0320 09:01:23.476262 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc7582bc-b43b-4cf0-86d3-41529e4598b2-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-lrv7p\" (UID: \"cc7582bc-b43b-4cf0-86d3-41529e4598b2\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p" Mar 20 09:01:23.476507 master-0 kubenswrapper[18707]: I0320 09:01:23.476326 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/035717cf-e166-4721-a9a5-6a92b26bd5d3-metrics-certs\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.476507 master-0 kubenswrapper[18707]: I0320 09:01:23.476369 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-metrics-certs\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:23.476507 master-0 kubenswrapper[18707]: I0320 09:01:23.476391 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46bfda60-579a-4d53-bb23-b5a70353fdf1-metrics-certs\") pod \"controller-7bb4cc7c98-77wvd\" (UID: \"46bfda60-579a-4d53-bb23-b5a70353fdf1\") " pod="metallb-system/controller-7bb4cc7c98-77wvd" Mar 20 09:01:23.476507 master-0 kubenswrapper[18707]: I0320 09:01:23.476413 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46bfda60-579a-4d53-bb23-b5a70353fdf1-cert\") pod \"controller-7bb4cc7c98-77wvd\" (UID: \"46bfda60-579a-4d53-bb23-b5a70353fdf1\") " pod="metallb-system/controller-7bb4cc7c98-77wvd" Mar 20 09:01:23.476507 master-0 kubenswrapper[18707]: I0320 09:01:23.476435 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/035717cf-e166-4721-a9a5-6a92b26bd5d3-reloader\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.476507 master-0 kubenswrapper[18707]: I0320 09:01:23.476475 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9p52\" (UniqueName: 
\"kubernetes.io/projected/ab262648-351f-4b75-9fd9-0e45cb472247-kube-api-access-f9p52\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:23.476507 master-0 kubenswrapper[18707]: I0320 09:01:23.476507 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/035717cf-e166-4721-a9a5-6a92b26bd5d3-frr-conf\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.476839 master-0 kubenswrapper[18707]: I0320 09:01:23.476545 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/035717cf-e166-4721-a9a5-6a92b26bd5d3-frr-sockets\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.476839 master-0 kubenswrapper[18707]: I0320 09:01:23.476580 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm6ns\" (UniqueName: \"kubernetes.io/projected/cc7582bc-b43b-4cf0-86d3-41529e4598b2-kube-api-access-hm6ns\") pod \"frr-k8s-webhook-server-bcc4b6f68-lrv7p\" (UID: \"cc7582bc-b43b-4cf0-86d3-41529e4598b2\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p" Mar 20 09:01:23.476839 master-0 kubenswrapper[18707]: I0320 09:01:23.476605 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l67xr\" (UniqueName: \"kubernetes.io/projected/46bfda60-579a-4d53-bb23-b5a70353fdf1-kube-api-access-l67xr\") pod \"controller-7bb4cc7c98-77wvd\" (UID: \"46bfda60-579a-4d53-bb23-b5a70353fdf1\") " pod="metallb-system/controller-7bb4cc7c98-77wvd" Mar 20 09:01:23.476839 master-0 kubenswrapper[18707]: I0320 09:01:23.476632 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/035717cf-e166-4721-a9a5-6a92b26bd5d3-metrics\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.476839 master-0 kubenswrapper[18707]: I0320 09:01:23.476659 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-memberlist\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:23.476839 master-0 kubenswrapper[18707]: I0320 09:01:23.476688 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ab262648-351f-4b75-9fd9-0e45cb472247-metallb-excludel2\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:23.476839 master-0 kubenswrapper[18707]: I0320 09:01:23.476713 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/035717cf-e166-4721-a9a5-6a92b26bd5d3-frr-startup\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.476839 master-0 kubenswrapper[18707]: I0320 09:01:23.476747 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrz7l\" (UniqueName: \"kubernetes.io/projected/035717cf-e166-4721-a9a5-6a92b26bd5d3-kube-api-access-qrz7l\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.478100 master-0 kubenswrapper[18707]: I0320 09:01:23.477877 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/035717cf-e166-4721-a9a5-6a92b26bd5d3-reloader\") pod 
\"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.478100 master-0 kubenswrapper[18707]: I0320 09:01:23.477964 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/035717cf-e166-4721-a9a5-6a92b26bd5d3-frr-startup\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.478392 master-0 kubenswrapper[18707]: I0320 09:01:23.478355 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/035717cf-e166-4721-a9a5-6a92b26bd5d3-frr-conf\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.482211 master-0 kubenswrapper[18707]: I0320 09:01:23.478568 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/035717cf-e166-4721-a9a5-6a92b26bd5d3-frr-sockets\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.482211 master-0 kubenswrapper[18707]: E0320 09:01:23.478663 18707 secret.go:189] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 20 09:01:23.482211 master-0 kubenswrapper[18707]: E0320 09:01:23.478702 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035717cf-e166-4721-a9a5-6a92b26bd5d3-metrics-certs podName:035717cf-e166-4721-a9a5-6a92b26bd5d3 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:23.978688654 +0000 UTC m=+1229.134869010 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/035717cf-e166-4721-a9a5-6a92b26bd5d3-metrics-certs") pod "frr-k8s-pp5nf" (UID: "035717cf-e166-4721-a9a5-6a92b26bd5d3") : secret "frr-k8s-certs-secret" not found Mar 20 09:01:23.482211 master-0 kubenswrapper[18707]: I0320 09:01:23.480310 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/035717cf-e166-4721-a9a5-6a92b26bd5d3-metrics\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.492881 master-0 kubenswrapper[18707]: I0320 09:01:23.492835 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc7582bc-b43b-4cf0-86d3-41529e4598b2-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-lrv7p\" (UID: \"cc7582bc-b43b-4cf0-86d3-41529e4598b2\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p" Mar 20 09:01:23.524140 master-0 kubenswrapper[18707]: I0320 09:01:23.524098 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm6ns\" (UniqueName: \"kubernetes.io/projected/cc7582bc-b43b-4cf0-86d3-41529e4598b2-kube-api-access-hm6ns\") pod \"frr-k8s-webhook-server-bcc4b6f68-lrv7p\" (UID: \"cc7582bc-b43b-4cf0-86d3-41529e4598b2\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p" Mar 20 09:01:23.526816 master-0 kubenswrapper[18707]: I0320 09:01:23.526781 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrz7l\" (UniqueName: \"kubernetes.io/projected/035717cf-e166-4721-a9a5-6a92b26bd5d3-kube-api-access-qrz7l\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:23.578354 master-0 kubenswrapper[18707]: I0320 09:01:23.578168 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-metrics-certs\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:23.578625 master-0 kubenswrapper[18707]: I0320 09:01:23.578608 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46bfda60-579a-4d53-bb23-b5a70353fdf1-metrics-certs\") pod \"controller-7bb4cc7c98-77wvd\" (UID: \"46bfda60-579a-4d53-bb23-b5a70353fdf1\") " pod="metallb-system/controller-7bb4cc7c98-77wvd" Mar 20 09:01:23.578733 master-0 kubenswrapper[18707]: I0320 09:01:23.578720 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46bfda60-579a-4d53-bb23-b5a70353fdf1-cert\") pod \"controller-7bb4cc7c98-77wvd\" (UID: \"46bfda60-579a-4d53-bb23-b5a70353fdf1\") " pod="metallb-system/controller-7bb4cc7c98-77wvd" Mar 20 09:01:23.578843 master-0 kubenswrapper[18707]: I0320 09:01:23.578830 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9p52\" (UniqueName: \"kubernetes.io/projected/ab262648-351f-4b75-9fd9-0e45cb472247-kube-api-access-f9p52\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:23.578977 master-0 kubenswrapper[18707]: I0320 09:01:23.578956 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l67xr\" (UniqueName: \"kubernetes.io/projected/46bfda60-579a-4d53-bb23-b5a70353fdf1-kube-api-access-l67xr\") pod \"controller-7bb4cc7c98-77wvd\" (UID: \"46bfda60-579a-4d53-bb23-b5a70353fdf1\") " pod="metallb-system/controller-7bb4cc7c98-77wvd" Mar 20 09:01:23.579084 master-0 kubenswrapper[18707]: I0320 09:01:23.579065 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-memberlist\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:23.580903 master-0 kubenswrapper[18707]: E0320 09:01:23.578382 18707 secret.go:189] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 20 09:01:23.580903 master-0 kubenswrapper[18707]: E0320 09:01:23.578834 18707 secret.go:189] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 20 09:01:23.580903 master-0 kubenswrapper[18707]: E0320 09:01:23.579165 18707 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 09:01:23.581082 master-0 kubenswrapper[18707]: I0320 09:01:23.581065 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ab262648-351f-4b75-9fd9-0e45cb472247-metallb-excludel2\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:23.581244 master-0 kubenswrapper[18707]: I0320 09:01:23.581086 18707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 09:01:23.581578 master-0 kubenswrapper[18707]: E0320 09:01:23.581566 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-metrics-certs podName:ab262648-351f-4b75-9fd9-0e45cb472247 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:24.081526523 +0000 UTC m=+1229.237706879 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-metrics-certs") pod "speaker-6cndp" (UID: "ab262648-351f-4b75-9fd9-0e45cb472247") : secret "speaker-certs-secret" not found Mar 20 09:01:23.581701 master-0 kubenswrapper[18707]: E0320 09:01:23.581687 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46bfda60-579a-4d53-bb23-b5a70353fdf1-metrics-certs podName:46bfda60-579a-4d53-bb23-b5a70353fdf1 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:24.081673957 +0000 UTC m=+1229.237854303 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46bfda60-579a-4d53-bb23-b5a70353fdf1-metrics-certs") pod "controller-7bb4cc7c98-77wvd" (UID: "46bfda60-579a-4d53-bb23-b5a70353fdf1") : secret "controller-certs-secret" not found Mar 20 09:01:23.581830 master-0 kubenswrapper[18707]: E0320 09:01:23.581810 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-memberlist podName:ab262648-351f-4b75-9fd9-0e45cb472247 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:24.081793351 +0000 UTC m=+1229.237973717 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-memberlist") pod "speaker-6cndp" (UID: "ab262648-351f-4b75-9fd9-0e45cb472247") : secret "metallb-memberlist" not found Mar 20 09:01:23.581942 master-0 kubenswrapper[18707]: I0320 09:01:23.581842 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ab262648-351f-4b75-9fd9-0e45cb472247-metallb-excludel2\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:23.598927 master-0 kubenswrapper[18707]: I0320 09:01:23.598886 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9p52\" (UniqueName: \"kubernetes.io/projected/ab262648-351f-4b75-9fd9-0e45cb472247-kube-api-access-f9p52\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:23.599150 master-0 kubenswrapper[18707]: I0320 09:01:23.598933 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l67xr\" (UniqueName: \"kubernetes.io/projected/46bfda60-579a-4d53-bb23-b5a70353fdf1-kube-api-access-l67xr\") pod \"controller-7bb4cc7c98-77wvd\" (UID: \"46bfda60-579a-4d53-bb23-b5a70353fdf1\") " pod="metallb-system/controller-7bb4cc7c98-77wvd" Mar 20 09:01:23.608220 master-0 kubenswrapper[18707]: I0320 09:01:23.606593 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46bfda60-579a-4d53-bb23-b5a70353fdf1-cert\") pod \"controller-7bb4cc7c98-77wvd\" (UID: \"46bfda60-579a-4d53-bb23-b5a70353fdf1\") " pod="metallb-system/controller-7bb4cc7c98-77wvd" Mar 20 09:01:23.662228 master-0 kubenswrapper[18707]: I0320 09:01:23.661426 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p" Mar 20 09:01:24.016288 master-0 kubenswrapper[18707]: I0320 09:01:24.015998 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035717cf-e166-4721-a9a5-6a92b26bd5d3-metrics-certs\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:24.019772 master-0 kubenswrapper[18707]: I0320 09:01:24.019718 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/035717cf-e166-4721-a9a5-6a92b26bd5d3-metrics-certs\") pod \"frr-k8s-pp5nf\" (UID: \"035717cf-e166-4721-a9a5-6a92b26bd5d3\") " pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:24.122624 master-0 kubenswrapper[18707]: I0320 09:01:24.122493 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-metrics-certs\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:24.122624 master-0 kubenswrapper[18707]: I0320 09:01:24.122575 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46bfda60-579a-4d53-bb23-b5a70353fdf1-metrics-certs\") pod \"controller-7bb4cc7c98-77wvd\" (UID: \"46bfda60-579a-4d53-bb23-b5a70353fdf1\") " pod="metallb-system/controller-7bb4cc7c98-77wvd" Mar 20 09:01:24.122842 master-0 kubenswrapper[18707]: I0320 09:01:24.122819 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-memberlist\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:24.123174 master-0 
kubenswrapper[18707]: E0320 09:01:24.123056 18707 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 09:01:24.123174 master-0 kubenswrapper[18707]: E0320 09:01:24.123119 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-memberlist podName:ab262648-351f-4b75-9fd9-0e45cb472247 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:25.123101329 +0000 UTC m=+1230.279281695 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-memberlist") pod "speaker-6cndp" (UID: "ab262648-351f-4b75-9fd9-0e45cb472247") : secret "metallb-memberlist" not found Mar 20 09:01:24.126238 master-0 kubenswrapper[18707]: I0320 09:01:24.126179 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-metrics-certs\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:24.126852 master-0 kubenswrapper[18707]: I0320 09:01:24.126809 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46bfda60-579a-4d53-bb23-b5a70353fdf1-metrics-certs\") pod \"controller-7bb4cc7c98-77wvd\" (UID: \"46bfda60-579a-4d53-bb23-b5a70353fdf1\") " pod="metallb-system/controller-7bb4cc7c98-77wvd" Mar 20 09:01:24.129567 master-0 kubenswrapper[18707]: I0320 09:01:24.129516 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p"] Mar 20 09:01:24.132352 master-0 kubenswrapper[18707]: W0320 09:01:24.132309 18707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc7582bc_b43b_4cf0_86d3_41529e4598b2.slice/crio-b1d66ec3b093cc502ac08764a344370bdb8eba1fab4b4932d404cc974f2f05d3 WatchSource:0}: Error finding container b1d66ec3b093cc502ac08764a344370bdb8eba1fab4b4932d404cc974f2f05d3: Status 404 returned error can't find the container with id b1d66ec3b093cc502ac08764a344370bdb8eba1fab4b4932d404cc974f2f05d3 Mar 20 09:01:24.285592 master-0 kubenswrapper[18707]: I0320 09:01:24.285516 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:24.389343 master-0 kubenswrapper[18707]: I0320 09:01:24.389132 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-77wvd" Mar 20 09:01:24.454551 master-0 kubenswrapper[18707]: I0320 09:01:24.454449 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p" event={"ID":"cc7582bc-b43b-4cf0-86d3-41529e4598b2","Type":"ContainerStarted","Data":"b1d66ec3b093cc502ac08764a344370bdb8eba1fab4b4932d404cc974f2f05d3"} Mar 20 09:01:24.921013 master-0 kubenswrapper[18707]: I0320 09:01:24.920949 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-77wvd"] Mar 20 09:01:25.158882 master-0 kubenswrapper[18707]: I0320 09:01:25.158827 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-memberlist\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp" Mar 20 09:01:25.159673 master-0 kubenswrapper[18707]: E0320 09:01:25.159646 18707 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 09:01:25.159749 master-0 kubenswrapper[18707]: E0320 09:01:25.159698 18707 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-memberlist podName:ab262648-351f-4b75-9fd9-0e45cb472247 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:27.159680231 +0000 UTC m=+1232.315860587 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-memberlist") pod "speaker-6cndp" (UID: "ab262648-351f-4b75-9fd9-0e45cb472247") : secret "metallb-memberlist" not found Mar 20 09:01:25.369105 master-0 kubenswrapper[18707]: I0320 09:01:25.369041 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-p28fn"] Mar 20 09:01:25.372347 master-0 kubenswrapper[18707]: I0320 09:01:25.372306 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-p28fn" Mar 20 09:01:25.384673 master-0 kubenswrapper[18707]: I0320 09:01:25.384622 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6"] Mar 20 09:01:25.385790 master-0 kubenswrapper[18707]: I0320 09:01:25.385759 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6" Mar 20 09:01:25.388449 master-0 kubenswrapper[18707]: I0320 09:01:25.388393 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 09:01:25.432573 master-0 kubenswrapper[18707]: I0320 09:01:25.432507 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6"] Mar 20 09:01:25.444652 master-0 kubenswrapper[18707]: I0320 09:01:25.444565 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-p28fn"] Mar 20 09:01:25.465179 master-0 kubenswrapper[18707]: I0320 09:01:25.464646 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjmcb\" (UniqueName: \"kubernetes.io/projected/45f17f3f-8533-43d7-8b52-481acf2e7624-kube-api-access-fjmcb\") pod \"nmstate-metrics-9b8c8685d-p28fn\" (UID: \"45f17f3f-8533-43d7-8b52-481acf2e7624\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-p28fn" Mar 20 09:01:25.465179 master-0 kubenswrapper[18707]: I0320 09:01:25.464759 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4446x\" (UniqueName: \"kubernetes.io/projected/493f5b6f-f933-4bf3-b35b-654d1ca74a55-kube-api-access-4446x\") pod \"nmstate-webhook-5f558f5558-w8sq6\" (UID: \"493f5b6f-f933-4bf3-b35b-654d1ca74a55\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6" Mar 20 09:01:25.465179 master-0 kubenswrapper[18707]: I0320 09:01:25.464922 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/493f5b6f-f933-4bf3-b35b-654d1ca74a55-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w8sq6\" (UID: \"493f5b6f-f933-4bf3-b35b-654d1ca74a55\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6" Mar 20 
09:01:25.491141 master-0 kubenswrapper[18707]: I0320 09:01:25.490252 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-8ffsp"] Mar 20 09:01:25.492127 master-0 kubenswrapper[18707]: I0320 09:01:25.492096 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.510805 master-0 kubenswrapper[18707]: I0320 09:01:25.510725 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pp5nf" event={"ID":"035717cf-e166-4721-a9a5-6a92b26bd5d3","Type":"ContainerStarted","Data":"3e6b1ca8a7ca03047305e42abe79a7c20729524c8675962c4cb6f87f2b277053"} Mar 20 09:01:25.525147 master-0 kubenswrapper[18707]: I0320 09:01:25.523959 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-77wvd" event={"ID":"46bfda60-579a-4d53-bb23-b5a70353fdf1","Type":"ContainerStarted","Data":"d88ce55c10c6d20d53dc476fec2b080d89468c15248b200303df9435147ef0e3"} Mar 20 09:01:25.525147 master-0 kubenswrapper[18707]: I0320 09:01:25.524026 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-77wvd" event={"ID":"46bfda60-579a-4d53-bb23-b5a70353fdf1","Type":"ContainerStarted","Data":"cba04c117415bd657b11c3869a5cb547f7032be8790cdfdef55465bbdbb38dfd"} Mar 20 09:01:25.575055 master-0 kubenswrapper[18707]: I0320 09:01:25.574993 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh"] Mar 20 09:01:25.576514 master-0 kubenswrapper[18707]: I0320 09:01:25.576497 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh" Mar 20 09:01:25.582580 master-0 kubenswrapper[18707]: I0320 09:01:25.582517 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c6854094-9c34-4da0-8c9d-0bee5171c81c-dbus-socket\") pod \"nmstate-handler-8ffsp\" (UID: \"c6854094-9c34-4da0-8c9d-0bee5171c81c\") " pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.582785 master-0 kubenswrapper[18707]: I0320 09:01:25.582674 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjmcb\" (UniqueName: \"kubernetes.io/projected/45f17f3f-8533-43d7-8b52-481acf2e7624-kube-api-access-fjmcb\") pod \"nmstate-metrics-9b8c8685d-p28fn\" (UID: \"45f17f3f-8533-43d7-8b52-481acf2e7624\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-p28fn" Mar 20 09:01:25.582785 master-0 kubenswrapper[18707]: I0320 09:01:25.582766 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4446x\" (UniqueName: \"kubernetes.io/projected/493f5b6f-f933-4bf3-b35b-654d1ca74a55-kube-api-access-4446x\") pod \"nmstate-webhook-5f558f5558-w8sq6\" (UID: \"493f5b6f-f933-4bf3-b35b-654d1ca74a55\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6" Mar 20 09:01:25.582892 master-0 kubenswrapper[18707]: I0320 09:01:25.582844 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c6854094-9c34-4da0-8c9d-0bee5171c81c-nmstate-lock\") pod \"nmstate-handler-8ffsp\" (UID: \"c6854094-9c34-4da0-8c9d-0bee5171c81c\") " pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.583058 master-0 kubenswrapper[18707]: I0320 09:01:25.583019 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96zbr\" (UniqueName: 
\"kubernetes.io/projected/c6854094-9c34-4da0-8c9d-0bee5171c81c-kube-api-access-96zbr\") pod \"nmstate-handler-8ffsp\" (UID: \"c6854094-9c34-4da0-8c9d-0bee5171c81c\") " pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.583136 master-0 kubenswrapper[18707]: I0320 09:01:25.583067 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c6854094-9c34-4da0-8c9d-0bee5171c81c-ovs-socket\") pod \"nmstate-handler-8ffsp\" (UID: \"c6854094-9c34-4da0-8c9d-0bee5171c81c\") " pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.583202 master-0 kubenswrapper[18707]: I0320 09:01:25.583140 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/493f5b6f-f933-4bf3-b35b-654d1ca74a55-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w8sq6\" (UID: \"493f5b6f-f933-4bf3-b35b-654d1ca74a55\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6" Mar 20 09:01:25.583381 master-0 kubenswrapper[18707]: E0320 09:01:25.583361 18707 secret.go:189] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 20 09:01:25.585959 master-0 kubenswrapper[18707]: E0320 09:01:25.583864 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493f5b6f-f933-4bf3-b35b-654d1ca74a55-tls-key-pair podName:493f5b6f-f933-4bf3-b35b-654d1ca74a55 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:26.083839442 +0000 UTC m=+1231.240019818 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/493f5b6f-f933-4bf3-b35b-654d1ca74a55-tls-key-pair") pod "nmstate-webhook-5f558f5558-w8sq6" (UID: "493f5b6f-f933-4bf3-b35b-654d1ca74a55") : secret "openshift-nmstate-webhook" not found Mar 20 09:01:25.585959 master-0 kubenswrapper[18707]: I0320 09:01:25.585254 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 09:01:25.585959 master-0 kubenswrapper[18707]: I0320 09:01:25.585501 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 09:01:25.601612 master-0 kubenswrapper[18707]: I0320 09:01:25.598372 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh"] Mar 20 09:01:25.640327 master-0 kubenswrapper[18707]: I0320 09:01:25.640224 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjmcb\" (UniqueName: \"kubernetes.io/projected/45f17f3f-8533-43d7-8b52-481acf2e7624-kube-api-access-fjmcb\") pod \"nmstate-metrics-9b8c8685d-p28fn\" (UID: \"45f17f3f-8533-43d7-8b52-481acf2e7624\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-p28fn" Mar 20 09:01:25.645246 master-0 kubenswrapper[18707]: I0320 09:01:25.640109 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4446x\" (UniqueName: \"kubernetes.io/projected/493f5b6f-f933-4bf3-b35b-654d1ca74a55-kube-api-access-4446x\") pod \"nmstate-webhook-5f558f5558-w8sq6\" (UID: \"493f5b6f-f933-4bf3-b35b-654d1ca74a55\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6" Mar 20 09:01:25.688659 master-0 kubenswrapper[18707]: I0320 09:01:25.686353 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96zbr\" (UniqueName: \"kubernetes.io/projected/c6854094-9c34-4da0-8c9d-0bee5171c81c-kube-api-access-96zbr\") pod 
\"nmstate-handler-8ffsp\" (UID: \"c6854094-9c34-4da0-8c9d-0bee5171c81c\") " pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.688659 master-0 kubenswrapper[18707]: I0320 09:01:25.686495 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c6854094-9c34-4da0-8c9d-0bee5171c81c-ovs-socket\") pod \"nmstate-handler-8ffsp\" (UID: \"c6854094-9c34-4da0-8c9d-0bee5171c81c\") " pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.688659 master-0 kubenswrapper[18707]: I0320 09:01:25.686597 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c6854094-9c34-4da0-8c9d-0bee5171c81c-dbus-socket\") pod \"nmstate-handler-8ffsp\" (UID: \"c6854094-9c34-4da0-8c9d-0bee5171c81c\") " pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.688659 master-0 kubenswrapper[18707]: I0320 09:01:25.687315 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/93b2f711-44ac-4cf5-a6f4-3203df9fe0a3-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5wfxh\" (UID: \"93b2f711-44ac-4cf5-a6f4-3203df9fe0a3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh" Mar 20 09:01:25.688659 master-0 kubenswrapper[18707]: I0320 09:01:25.687387 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn8g8\" (UniqueName: \"kubernetes.io/projected/93b2f711-44ac-4cf5-a6f4-3203df9fe0a3-kube-api-access-sn8g8\") pod \"nmstate-console-plugin-86f58fcf4-5wfxh\" (UID: \"93b2f711-44ac-4cf5-a6f4-3203df9fe0a3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh" Mar 20 09:01:25.688659 master-0 kubenswrapper[18707]: I0320 09:01:25.687447 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" 
(UniqueName: \"kubernetes.io/host-path/c6854094-9c34-4da0-8c9d-0bee5171c81c-nmstate-lock\") pod \"nmstate-handler-8ffsp\" (UID: \"c6854094-9c34-4da0-8c9d-0bee5171c81c\") " pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.688659 master-0 kubenswrapper[18707]: I0320 09:01:25.687469 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/93b2f711-44ac-4cf5-a6f4-3203df9fe0a3-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5wfxh\" (UID: \"93b2f711-44ac-4cf5-a6f4-3203df9fe0a3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh" Mar 20 09:01:25.692217 master-0 kubenswrapper[18707]: I0320 09:01:25.690951 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c6854094-9c34-4da0-8c9d-0bee5171c81c-ovs-socket\") pod \"nmstate-handler-8ffsp\" (UID: \"c6854094-9c34-4da0-8c9d-0bee5171c81c\") " pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.692217 master-0 kubenswrapper[18707]: I0320 09:01:25.691016 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c6854094-9c34-4da0-8c9d-0bee5171c81c-dbus-socket\") pod \"nmstate-handler-8ffsp\" (UID: \"c6854094-9c34-4da0-8c9d-0bee5171c81c\") " pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.692217 master-0 kubenswrapper[18707]: I0320 09:01:25.691047 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c6854094-9c34-4da0-8c9d-0bee5171c81c-nmstate-lock\") pod \"nmstate-handler-8ffsp\" (UID: \"c6854094-9c34-4da0-8c9d-0bee5171c81c\") " pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.692217 master-0 kubenswrapper[18707]: I0320 09:01:25.692141 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-p28fn" Mar 20 09:01:25.712854 master-0 kubenswrapper[18707]: I0320 09:01:25.712815 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96zbr\" (UniqueName: \"kubernetes.io/projected/c6854094-9c34-4da0-8c9d-0bee5171c81c-kube-api-access-96zbr\") pod \"nmstate-handler-8ffsp\" (UID: \"c6854094-9c34-4da0-8c9d-0bee5171c81c\") " pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.788908 master-0 kubenswrapper[18707]: I0320 09:01:25.788852 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn8g8\" (UniqueName: \"kubernetes.io/projected/93b2f711-44ac-4cf5-a6f4-3203df9fe0a3-kube-api-access-sn8g8\") pod \"nmstate-console-plugin-86f58fcf4-5wfxh\" (UID: \"93b2f711-44ac-4cf5-a6f4-3203df9fe0a3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh" Mar 20 09:01:25.789115 master-0 kubenswrapper[18707]: I0320 09:01:25.788927 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/93b2f711-44ac-4cf5-a6f4-3203df9fe0a3-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5wfxh\" (UID: \"93b2f711-44ac-4cf5-a6f4-3203df9fe0a3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh" Mar 20 09:01:25.789115 master-0 kubenswrapper[18707]: I0320 09:01:25.789068 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/93b2f711-44ac-4cf5-a6f4-3203df9fe0a3-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5wfxh\" (UID: \"93b2f711-44ac-4cf5-a6f4-3203df9fe0a3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh" Mar 20 09:01:25.792586 master-0 kubenswrapper[18707]: E0320 09:01:25.789245 18707 secret.go:189] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 20 
09:01:25.792586 master-0 kubenswrapper[18707]: E0320 09:01:25.789302 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93b2f711-44ac-4cf5-a6f4-3203df9fe0a3-plugin-serving-cert podName:93b2f711-44ac-4cf5-a6f4-3203df9fe0a3 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:26.289287883 +0000 UTC m=+1231.445468239 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/93b2f711-44ac-4cf5-a6f4-3203df9fe0a3-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-5wfxh" (UID: "93b2f711-44ac-4cf5-a6f4-3203df9fe0a3") : secret "plugin-serving-cert" not found Mar 20 09:01:25.792586 master-0 kubenswrapper[18707]: I0320 09:01:25.790551 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/93b2f711-44ac-4cf5-a6f4-3203df9fe0a3-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-5wfxh\" (UID: \"93b2f711-44ac-4cf5-a6f4-3203df9fe0a3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh" Mar 20 09:01:25.826147 master-0 kubenswrapper[18707]: I0320 09:01:25.826100 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-8ffsp" Mar 20 09:01:25.846171 master-0 kubenswrapper[18707]: I0320 09:01:25.846071 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn8g8\" (UniqueName: \"kubernetes.io/projected/93b2f711-44ac-4cf5-a6f4-3203df9fe0a3-kube-api-access-sn8g8\") pod \"nmstate-console-plugin-86f58fcf4-5wfxh\" (UID: \"93b2f711-44ac-4cf5-a6f4-3203df9fe0a3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh" Mar 20 09:01:25.848087 master-0 kubenswrapper[18707]: I0320 09:01:25.848040 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5df6fc5897-d9ws4"] Mar 20 09:01:25.849789 master-0 kubenswrapper[18707]: I0320 09:01:25.849752 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5df6fc5897-d9ws4" Mar 20 09:01:25.861098 master-0 kubenswrapper[18707]: I0320 09:01:25.860976 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5df6fc5897-d9ws4"] Mar 20 09:01:25.997862 master-0 kubenswrapper[18707]: I0320 09:01:25.997768 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lrlk\" (UniqueName: \"kubernetes.io/projected/30417709-6491-48dd-a06b-d2ed4581f60e-kube-api-access-4lrlk\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4" Mar 20 09:01:25.998108 master-0 kubenswrapper[18707]: I0320 09:01:25.997930 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30417709-6491-48dd-a06b-d2ed4581f60e-oauth-serving-cert\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4" Mar 20 09:01:25.998108 master-0 kubenswrapper[18707]: I0320 
09:01:25.997989 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30417709-6491-48dd-a06b-d2ed4581f60e-console-serving-cert\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:25.998108 master-0 kubenswrapper[18707]: I0320 09:01:25.998062 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30417709-6491-48dd-a06b-d2ed4581f60e-console-config\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:25.998282 master-0 kubenswrapper[18707]: I0320 09:01:25.998091 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30417709-6491-48dd-a06b-d2ed4581f60e-trusted-ca-bundle\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:25.998282 master-0 kubenswrapper[18707]: I0320 09:01:25.998214 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30417709-6491-48dd-a06b-d2ed4581f60e-console-oauth-config\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:25.998385 master-0 kubenswrapper[18707]: I0320 09:01:25.998259 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30417709-6491-48dd-a06b-d2ed4581f60e-service-ca\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.112205 master-0 kubenswrapper[18707]: I0320 09:01:26.101020 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30417709-6491-48dd-a06b-d2ed4581f60e-console-config\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.112205 master-0 kubenswrapper[18707]: I0320 09:01:26.101080 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30417709-6491-48dd-a06b-d2ed4581f60e-trusted-ca-bundle\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.112205 master-0 kubenswrapper[18707]: I0320 09:01:26.101128 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30417709-6491-48dd-a06b-d2ed4581f60e-console-oauth-config\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.112205 master-0 kubenswrapper[18707]: I0320 09:01:26.101160 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30417709-6491-48dd-a06b-d2ed4581f60e-service-ca\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.112205 master-0 kubenswrapper[18707]: I0320 09:01:26.101252 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lrlk\" (UniqueName: \"kubernetes.io/projected/30417709-6491-48dd-a06b-d2ed4581f60e-kube-api-access-4lrlk\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.112205 master-0 kubenswrapper[18707]: I0320 09:01:26.101288 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30417709-6491-48dd-a06b-d2ed4581f60e-oauth-serving-cert\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.112205 master-0 kubenswrapper[18707]: I0320 09:01:26.101309 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30417709-6491-48dd-a06b-d2ed4581f60e-console-serving-cert\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.112205 master-0 kubenswrapper[18707]: I0320 09:01:26.101331 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/493f5b6f-f933-4bf3-b35b-654d1ca74a55-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w8sq6\" (UID: \"493f5b6f-f933-4bf3-b35b-654d1ca74a55\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6"
Mar 20 09:01:26.112205 master-0 kubenswrapper[18707]: I0320 09:01:26.106889 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/493f5b6f-f933-4bf3-b35b-654d1ca74a55-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-w8sq6\" (UID: \"493f5b6f-f933-4bf3-b35b-654d1ca74a55\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6"
Mar 20 09:01:26.112205 master-0 kubenswrapper[18707]: I0320 09:01:26.107616 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/30417709-6491-48dd-a06b-d2ed4581f60e-console-config\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.112205 master-0 kubenswrapper[18707]: I0320 09:01:26.108432 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30417709-6491-48dd-a06b-d2ed4581f60e-trusted-ca-bundle\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.112205 master-0 kubenswrapper[18707]: I0320 09:01:26.111653 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30417709-6491-48dd-a06b-d2ed4581f60e-service-ca\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.112767 master-0 kubenswrapper[18707]: I0320 09:01:26.112404 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/30417709-6491-48dd-a06b-d2ed4581f60e-oauth-serving-cert\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.135538 master-0 kubenswrapper[18707]: I0320 09:01:26.128879 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/30417709-6491-48dd-a06b-d2ed4581f60e-console-oauth-config\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.135538 master-0 kubenswrapper[18707]: I0320 09:01:26.131453 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/30417709-6491-48dd-a06b-d2ed4581f60e-console-serving-cert\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.141263 master-0 kubenswrapper[18707]: I0320 09:01:26.136301 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lrlk\" (UniqueName: \"kubernetes.io/projected/30417709-6491-48dd-a06b-d2ed4581f60e-kube-api-access-4lrlk\") pod \"console-5df6fc5897-d9ws4\" (UID: \"30417709-6491-48dd-a06b-d2ed4581f60e\") " pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.194302 master-0 kubenswrapper[18707]: I0320 09:01:26.189657 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:26.254507 master-0 kubenswrapper[18707]: I0320 09:01:26.253282 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-p28fn"]
Mar 20 09:01:26.309217 master-0 kubenswrapper[18707]: I0320 09:01:26.309157 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/93b2f711-44ac-4cf5-a6f4-3203df9fe0a3-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5wfxh\" (UID: \"93b2f711-44ac-4cf5-a6f4-3203df9fe0a3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh"
Mar 20 09:01:26.313448 master-0 kubenswrapper[18707]: I0320 09:01:26.313410 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6"
Mar 20 09:01:26.334920 master-0 kubenswrapper[18707]: I0320 09:01:26.334649 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/93b2f711-44ac-4cf5-a6f4-3203df9fe0a3-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-5wfxh\" (UID: \"93b2f711-44ac-4cf5-a6f4-3203df9fe0a3\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh"
Mar 20 09:01:26.540355 master-0 kubenswrapper[18707]: I0320 09:01:26.540290 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh"
Mar 20 09:01:26.544546 master-0 kubenswrapper[18707]: I0320 09:01:26.544478 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-p28fn" event={"ID":"45f17f3f-8533-43d7-8b52-481acf2e7624","Type":"ContainerStarted","Data":"b83e8540ed788eb56887c0c7937c8f554c93eb9d3820395d47c5bc26d19b88a7"}
Mar 20 09:01:26.545927 master-0 kubenswrapper[18707]: I0320 09:01:26.545878 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8ffsp" event={"ID":"c6854094-9c34-4da0-8c9d-0bee5171c81c","Type":"ContainerStarted","Data":"e61857b070d82f643eefb888543e1df9a74e4eeacf3ca7ba9d2d130c6ad7506f"}
Mar 20 09:01:26.704877 master-0 kubenswrapper[18707]: I0320 09:01:26.704816 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5df6fc5897-d9ws4"]
Mar 20 09:01:26.725598 master-0 kubenswrapper[18707]: W0320 09:01:26.711954 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30417709_6491_48dd_a06b_d2ed4581f60e.slice/crio-777d104a8887b230d8addaf89953b197692f56585d1e5bcd031add1ff9e4a64d WatchSource:0}: Error finding container 777d104a8887b230d8addaf89953b197692f56585d1e5bcd031add1ff9e4a64d: Status 404 returned error can't find the container with id 777d104a8887b230d8addaf89953b197692f56585d1e5bcd031add1ff9e4a64d
Mar 20 09:01:26.767490 master-0 kubenswrapper[18707]: I0320 09:01:26.767360 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6"]
Mar 20 09:01:27.022263 master-0 kubenswrapper[18707]: I0320 09:01:27.022161 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh"]
Mar 20 09:01:27.032648 master-0 kubenswrapper[18707]: W0320 09:01:27.032575 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93b2f711_44ac_4cf5_a6f4_3203df9fe0a3.slice/crio-fd9e1cc20230aac27b212ae03fd8ecc31c1e574494048f8ac35a31d7dca174c5 WatchSource:0}: Error finding container fd9e1cc20230aac27b212ae03fd8ecc31c1e574494048f8ac35a31d7dca174c5: Status 404 returned error can't find the container with id fd9e1cc20230aac27b212ae03fd8ecc31c1e574494048f8ac35a31d7dca174c5
Mar 20 09:01:27.227458 master-0 kubenswrapper[18707]: I0320 09:01:27.227400 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-memberlist\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp"
Mar 20 09:01:27.231624 master-0 kubenswrapper[18707]: I0320 09:01:27.231570 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ab262648-351f-4b75-9fd9-0e45cb472247-memberlist\") pod \"speaker-6cndp\" (UID: \"ab262648-351f-4b75-9fd9-0e45cb472247\") " pod="metallb-system/speaker-6cndp"
Mar 20 09:01:27.363114 master-0 kubenswrapper[18707]: I0320 09:01:27.363018 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-6cndp"
Mar 20 09:01:27.407582 master-0 kubenswrapper[18707]: W0320 09:01:27.407505 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab262648_351f_4b75_9fd9_0e45cb472247.slice/crio-84f385621dc0fa2d0b3679238f935c9b93b126b7d9826dc7de724174895ecc9b WatchSource:0}: Error finding container 84f385621dc0fa2d0b3679238f935c9b93b126b7d9826dc7de724174895ecc9b: Status 404 returned error can't find the container with id 84f385621dc0fa2d0b3679238f935c9b93b126b7d9826dc7de724174895ecc9b
Mar 20 09:01:27.561709 master-0 kubenswrapper[18707]: I0320 09:01:27.561603 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6" event={"ID":"493f5b6f-f933-4bf3-b35b-654d1ca74a55","Type":"ContainerStarted","Data":"185b8f306e7f3daa4c2a985733772ecde20faf8141244496f874268faf2fa40d"}
Mar 20 09:01:27.568091 master-0 kubenswrapper[18707]: I0320 09:01:27.568033 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5df6fc5897-d9ws4" event={"ID":"30417709-6491-48dd-a06b-d2ed4581f60e","Type":"ContainerStarted","Data":"8cd7c27efa56e130b0ddd4c184fc54aaeb3f0b592c2ea01faf2229c965caf686"}
Mar 20 09:01:27.568231 master-0 kubenswrapper[18707]: I0320 09:01:27.568105 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5df6fc5897-d9ws4" event={"ID":"30417709-6491-48dd-a06b-d2ed4581f60e","Type":"ContainerStarted","Data":"777d104a8887b230d8addaf89953b197692f56585d1e5bcd031add1ff9e4a64d"}
Mar 20 09:01:27.575126 master-0 kubenswrapper[18707]: I0320 09:01:27.573445 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-77wvd" event={"ID":"46bfda60-579a-4d53-bb23-b5a70353fdf1","Type":"ContainerStarted","Data":"684fe368c2cea27963adb61508558ec5bc7c778170f064253f7b34104c01c4d6"}
Mar 20 09:01:27.575126 master-0 kubenswrapper[18707]: I0320 09:01:27.573569 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-77wvd"
Mar 20 09:01:27.575417 master-0 kubenswrapper[18707]: I0320 09:01:27.575240 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6cndp" event={"ID":"ab262648-351f-4b75-9fd9-0e45cb472247","Type":"ContainerStarted","Data":"84f385621dc0fa2d0b3679238f935c9b93b126b7d9826dc7de724174895ecc9b"}
Mar 20 09:01:27.576762 master-0 kubenswrapper[18707]: I0320 09:01:27.576696 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh" event={"ID":"93b2f711-44ac-4cf5-a6f4-3203df9fe0a3","Type":"ContainerStarted","Data":"fd9e1cc20230aac27b212ae03fd8ecc31c1e574494048f8ac35a31d7dca174c5"}
Mar 20 09:01:27.723276 master-0 kubenswrapper[18707]: I0320 09:01:27.723141 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5df6fc5897-d9ws4" podStartSLOduration=2.723124204 podStartE2EDuration="2.723124204s" podCreationTimestamp="2026-03-20 09:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:01:27.721203959 +0000 UTC m=+1232.877384315" watchObservedRunningTime="2026-03-20 09:01:27.723124204 +0000 UTC m=+1232.879304570"
Mar 20 09:01:27.821745 master-0 kubenswrapper[18707]: I0320 09:01:27.821676 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-77wvd" podStartSLOduration=3.166010818 podStartE2EDuration="4.821649079s" podCreationTimestamp="2026-03-20 09:01:23 +0000 UTC" firstStartedPulling="2026-03-20 09:01:25.114110049 +0000 UTC m=+1230.270290405" lastFinishedPulling="2026-03-20 09:01:26.76974831 +0000 UTC m=+1231.925928666" observedRunningTime="2026-03-20 09:01:27.809452481 +0000 UTC m=+1232.965632847" watchObservedRunningTime="2026-03-20 09:01:27.821649079 +0000 UTC m=+1232.977829445"
Mar 20 09:01:28.588166 master-0 kubenswrapper[18707]: I0320 09:01:28.588117 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6cndp" event={"ID":"ab262648-351f-4b75-9fd9-0e45cb472247","Type":"ContainerStarted","Data":"d07e1004bdf20d6692d84ff52f88c7ed1067e1e542590ff5926953709e17f6a0"}
Mar 20 09:01:28.588166 master-0 kubenswrapper[18707]: I0320 09:01:28.588170 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6cndp" event={"ID":"ab262648-351f-4b75-9fd9-0e45cb472247","Type":"ContainerStarted","Data":"823fad884eb16f18d48586ab52522db82aba87c5ea528fa8caba08ed3a631635"}
Mar 20 09:01:28.588838 master-0 kubenswrapper[18707]: I0320 09:01:28.588632 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-6cndp"
Mar 20 09:01:28.606482 master-0 kubenswrapper[18707]: I0320 09:01:28.606405 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-6cndp" podStartSLOduration=5.606387244 podStartE2EDuration="5.606387244s" podCreationTimestamp="2026-03-20 09:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:01:28.603467041 +0000 UTC m=+1233.759647397" watchObservedRunningTime="2026-03-20 09:01:28.606387244 +0000 UTC m=+1233.762567600"
Mar 20 09:01:33.646633 master-0 kubenswrapper[18707]: I0320 09:01:33.646573 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-p28fn" event={"ID":"45f17f3f-8533-43d7-8b52-481acf2e7624","Type":"ContainerStarted","Data":"14298b4ea6965db0f0070dd6015220cb46da8721d0c191da9d7dcf4ee01595de"}
Mar 20 09:01:33.646633 master-0 kubenswrapper[18707]: I0320 09:01:33.646632 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-p28fn" event={"ID":"45f17f3f-8533-43d7-8b52-481acf2e7624","Type":"ContainerStarted","Data":"ae85bf23a5d7388cc4be6ae2f56386b69545302b4da0e0d2f27bf7588b73ba33"}
Mar 20 09:01:33.648571 master-0 kubenswrapper[18707]: I0320 09:01:33.648532 18707 generic.go:334] "Generic (PLEG): container finished" podID="035717cf-e166-4721-a9a5-6a92b26bd5d3" containerID="0185c01ed3deb0a9abfe1a7fd7be2a1fbe0dcfb9c14da36af345d2dd88684d5e" exitCode=0
Mar 20 09:01:33.648662 master-0 kubenswrapper[18707]: I0320 09:01:33.648592 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pp5nf" event={"ID":"035717cf-e166-4721-a9a5-6a92b26bd5d3","Type":"ContainerDied","Data":"0185c01ed3deb0a9abfe1a7fd7be2a1fbe0dcfb9c14da36af345d2dd88684d5e"}
Mar 20 09:01:33.652151 master-0 kubenswrapper[18707]: I0320 09:01:33.652099 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8ffsp" event={"ID":"c6854094-9c34-4da0-8c9d-0bee5171c81c","Type":"ContainerStarted","Data":"106938482d0ede157f81208e09c6dfbca0c4262e323ce22d7433182f135dac88"}
Mar 20 09:01:33.652283 master-0 kubenswrapper[18707]: I0320 09:01:33.652245 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-8ffsp"
Mar 20 09:01:33.656736 master-0 kubenswrapper[18707]: I0320 09:01:33.656652 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh" event={"ID":"93b2f711-44ac-4cf5-a6f4-3203df9fe0a3","Type":"ContainerStarted","Data":"92d378f77ddaa9313b3924f77972d04b5f87a75b3a8e2938a111cb7a1dc848d8"}
Mar 20 09:01:33.667575 master-0 kubenswrapper[18707]: I0320 09:01:33.667311 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6" event={"ID":"493f5b6f-f933-4bf3-b35b-654d1ca74a55","Type":"ContainerStarted","Data":"d707a0cc16920912de8405d1350f95e6f10891e6bc6538bab6a136e14cf275ec"}
Mar 20 09:01:33.667757 master-0 kubenswrapper[18707]: I0320 09:01:33.667736 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6"
Mar 20 09:01:33.672371 master-0 kubenswrapper[18707]: I0320 09:01:33.669541 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p" event={"ID":"cc7582bc-b43b-4cf0-86d3-41529e4598b2","Type":"ContainerStarted","Data":"aefb74eebf98ada00e8e842029aa22eb6ac534f4094a3589db167ddfc8818185"}
Mar 20 09:01:33.672371 master-0 kubenswrapper[18707]: I0320 09:01:33.669928 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p"
Mar 20 09:01:33.689347 master-0 kubenswrapper[18707]: I0320 09:01:33.688160 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-p28fn" podStartSLOduration=2.184951726 podStartE2EDuration="8.68813087s" podCreationTimestamp="2026-03-20 09:01:25 +0000 UTC" firstStartedPulling="2026-03-20 09:01:26.287455479 +0000 UTC m=+1231.443635825" lastFinishedPulling="2026-03-20 09:01:32.790634613 +0000 UTC m=+1237.946814969" observedRunningTime="2026-03-20 09:01:33.670653421 +0000 UTC m=+1238.826833807" watchObservedRunningTime="2026-03-20 09:01:33.68813087 +0000 UTC m=+1238.844311256"
Mar 20 09:01:33.710752 master-0 kubenswrapper[18707]: I0320 09:01:33.708730 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-8ffsp" podStartSLOduration=1.834277114 podStartE2EDuration="8.708706728s" podCreationTimestamp="2026-03-20 09:01:25 +0000 UTC" firstStartedPulling="2026-03-20 09:01:25.918071533 +0000 UTC m=+1231.074251889" lastFinishedPulling="2026-03-20 09:01:32.792501147 +0000 UTC m=+1237.948681503" observedRunningTime="2026-03-20 09:01:33.697638092 +0000 UTC m=+1238.853818458" watchObservedRunningTime="2026-03-20 09:01:33.708706728 +0000 UTC m=+1238.864887084"
Mar 20 09:01:33.764362 master-0 kubenswrapper[18707]: I0320 09:01:33.764238 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6" podStartSLOduration=2.790779367 podStartE2EDuration="8.764210414s" podCreationTimestamp="2026-03-20 09:01:25 +0000 UTC" firstStartedPulling="2026-03-20 09:01:26.803072282 +0000 UTC m=+1231.959252638" lastFinishedPulling="2026-03-20 09:01:32.776503329 +0000 UTC m=+1237.932683685" observedRunningTime="2026-03-20 09:01:33.739975062 +0000 UTC m=+1238.896155418" watchObservedRunningTime="2026-03-20 09:01:33.764210414 +0000 UTC m=+1238.920390790"
Mar 20 09:01:33.785504 master-0 kubenswrapper[18707]: I0320 09:01:33.784409 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-5wfxh" podStartSLOduration=3.048280395 podStartE2EDuration="8.784384151s" podCreationTimestamp="2026-03-20 09:01:25 +0000 UTC" firstStartedPulling="2026-03-20 09:01:27.039324303 +0000 UTC m=+1232.195504659" lastFinishedPulling="2026-03-20 09:01:32.775428059 +0000 UTC m=+1237.931608415" observedRunningTime="2026-03-20 09:01:33.773622643 +0000 UTC m=+1238.929802999" watchObservedRunningTime="2026-03-20 09:01:33.784384151 +0000 UTC m=+1238.940564527"
Mar 20 09:01:33.852980 master-0 kubenswrapper[18707]: I0320 09:01:33.852888 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p" podStartSLOduration=2.194735032 podStartE2EDuration="10.852865547s" podCreationTimestamp="2026-03-20 09:01:23 +0000 UTC" firstStartedPulling="2026-03-20 09:01:24.135267597 +0000 UTC m=+1229.291447953" lastFinishedPulling="2026-03-20 09:01:32.793398112 +0000 UTC m=+1237.949578468" observedRunningTime="2026-03-20 09:01:33.834994856 +0000 UTC m=+1238.991175212" watchObservedRunningTime="2026-03-20 09:01:33.852865547 +0000 UTC m=+1239.009045903"
Mar 20 09:01:34.685461 master-0 kubenswrapper[18707]: I0320 09:01:34.685394 18707 generic.go:334] "Generic (PLEG): container finished" podID="035717cf-e166-4721-a9a5-6a92b26bd5d3" containerID="38c214be5a5533ed25ffd2d2cad9e55fb8b0827528e2b8b32050405957789879" exitCode=0
Mar 20 09:01:34.686025 master-0 kubenswrapper[18707]: I0320 09:01:34.685570 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pp5nf" event={"ID":"035717cf-e166-4721-a9a5-6a92b26bd5d3","Type":"ContainerDied","Data":"38c214be5a5533ed25ffd2d2cad9e55fb8b0827528e2b8b32050405957789879"}
Mar 20 09:01:35.700091 master-0 kubenswrapper[18707]: I0320 09:01:35.700008 18707 generic.go:334] "Generic (PLEG): container finished" podID="035717cf-e166-4721-a9a5-6a92b26bd5d3" containerID="5c2639ae500db2f9d944e98fb289c8a4d79ba154fe1a4bf7145d5ea90dcb90a4" exitCode=0
Mar 20 09:01:35.700823 master-0 kubenswrapper[18707]: I0320 09:01:35.700085 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pp5nf" event={"ID":"035717cf-e166-4721-a9a5-6a92b26bd5d3","Type":"ContainerDied","Data":"5c2639ae500db2f9d944e98fb289c8a4d79ba154fe1a4bf7145d5ea90dcb90a4"}
Mar 20 09:01:36.189969 master-0 kubenswrapper[18707]: I0320 09:01:36.189919 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:36.190108 master-0 kubenswrapper[18707]: I0320 09:01:36.189979 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:36.197248 master-0 kubenswrapper[18707]: I0320 09:01:36.195231 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:36.719435 master-0 kubenswrapper[18707]: I0320 09:01:36.719347 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pp5nf" event={"ID":"035717cf-e166-4721-a9a5-6a92b26bd5d3","Type":"ContainerStarted","Data":"9db661853939bf463d8217f31a9ba4f4dd415ca14c9b4d51124a23b452790f05"}
Mar 20 09:01:36.719435 master-0 kubenswrapper[18707]: I0320 09:01:36.719433 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pp5nf" event={"ID":"035717cf-e166-4721-a9a5-6a92b26bd5d3","Type":"ContainerStarted","Data":"b365df93dccf900f8b4cd03fe4ac216acb99b89cbb49b8c4b721da79bce8d6dc"}
Mar 20 09:01:36.732575 master-0 kubenswrapper[18707]: I0320 09:01:36.719461 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pp5nf" event={"ID":"035717cf-e166-4721-a9a5-6a92b26bd5d3","Type":"ContainerStarted","Data":"f50bd0a3f5d16daf46d844946084ec7560a9393cb30b0334dc0ee0c4c297f014"}
Mar 20 09:01:36.732575 master-0 kubenswrapper[18707]: I0320 09:01:36.719478 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pp5nf" event={"ID":"035717cf-e166-4721-a9a5-6a92b26bd5d3","Type":"ContainerStarted","Data":"2e5ca20de1cbdc7842900db567fccc9aeda5f73568c2fa6d6e988c68afc8e38f"}
Mar 20 09:01:36.732575 master-0 kubenswrapper[18707]: I0320 09:01:36.719496 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pp5nf" event={"ID":"035717cf-e166-4721-a9a5-6a92b26bd5d3","Type":"ContainerStarted","Data":"d8dc70a226bcd81a71052b4d3fe8da57ef1add643dc2fa8746d2f76a03deeb5a"}
Mar 20 09:01:36.732575 master-0 kubenswrapper[18707]: I0320 09:01:36.726532 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5df6fc5897-d9ws4"
Mar 20 09:01:36.813173 master-0 kubenswrapper[18707]: I0320 09:01:36.812148 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d4699dbdf-6z95l"]
Mar 20 09:01:37.366403 master-0 kubenswrapper[18707]: I0320 09:01:37.366336 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-6cndp"
Mar 20 09:01:37.746055 master-0 kubenswrapper[18707]: I0320 09:01:37.745900 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pp5nf" event={"ID":"035717cf-e166-4721-a9a5-6a92b26bd5d3","Type":"ContainerStarted","Data":"1b600a0cdc0f4acee40c98ddec17fb47ea9c5472d25de0c35072c0eb997a07b0"}
Mar 20 09:01:37.747489 master-0 kubenswrapper[18707]: I0320 09:01:37.747438 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-pp5nf"
Mar 20 09:01:37.775160 master-0 kubenswrapper[18707]: I0320 09:01:37.774250 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-pp5nf" podStartSLOduration=6.389041929 podStartE2EDuration="14.774230524s" podCreationTimestamp="2026-03-20 09:01:23 +0000 UTC" firstStartedPulling="2026-03-20 09:01:24.446590184 +0000 UTC m=+1229.602770550" lastFinishedPulling="2026-03-20 09:01:32.831778789 +0000 UTC m=+1237.987959145" observedRunningTime="2026-03-20 09:01:37.772546716 +0000 UTC m=+1242.928727102" watchObservedRunningTime="2026-03-20 09:01:37.774230524 +0000 UTC m=+1242.930410890"
Mar 20 09:01:39.287755 master-0 kubenswrapper[18707]: I0320 09:01:39.287681 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-pp5nf"
Mar 20 09:01:39.339100 master-0 kubenswrapper[18707]: I0320 09:01:39.339034 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-pp5nf"
Mar 20 09:01:40.867214 master-0 kubenswrapper[18707]: I0320 09:01:40.866813 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-8ffsp"
Mar 20 09:01:43.668630 master-0 kubenswrapper[18707]: I0320 09:01:43.668537 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-lrv7p"
Mar 20 09:01:44.393895 master-0 kubenswrapper[18707]: I0320 09:01:44.393829 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-77wvd"
Mar 20 09:01:46.327349 master-0 kubenswrapper[18707]: I0320 09:01:46.327216 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-w8sq6"
Mar 20 09:01:48.407299 master-0 kubenswrapper[18707]: I0320 09:01:48.407082 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-rpqkf"]
Mar 20 09:01:48.408939 master-0 kubenswrapper[18707]: I0320 09:01:48.408824 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.415446 master-0 kubenswrapper[18707]: I0320 09:01:48.415388 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert"
Mar 20 09:01:48.429969 master-0 kubenswrapper[18707]: I0320 09:01:48.429432 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-rpqkf"]
Mar 20 09:01:48.477565 master-0 kubenswrapper[18707]: I0320 09:01:48.477495 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-csi-plugin-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.477892 master-0 kubenswrapper[18707]: I0320 09:01:48.477584 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-metrics-cert\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.477892 master-0 kubenswrapper[18707]: I0320 09:01:48.477619 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-node-plugin-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.477892 master-0 kubenswrapper[18707]: I0320 09:01:48.477686 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-file-lock-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.477892 master-0 kubenswrapper[18707]: I0320 09:01:48.477849 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-registration-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.477892 master-0 kubenswrapper[18707]: I0320 09:01:48.477872 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-pod-volumes-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.477892 master-0 kubenswrapper[18707]: I0320 09:01:48.477899 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-sys\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.478643 master-0 kubenswrapper[18707]: I0320 09:01:48.477925 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-device-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.478643 master-0 kubenswrapper[18707]: I0320 09:01:48.477949 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-lvmd-config\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.478643 master-0 kubenswrapper[18707]: I0320 09:01:48.477976 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-run-udev\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.478643 master-0 kubenswrapper[18707]: I0320 09:01:48.478013 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77b2m\" (UniqueName: \"kubernetes.io/projected/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-kube-api-access-77b2m\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.580047 master-0 kubenswrapper[18707]: I0320 09:01:48.579970 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-metrics-cert\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.580350 master-0 kubenswrapper[18707]: I0320 09:01:48.580060 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-node-plugin-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.580350 master-0 kubenswrapper[18707]: I0320 09:01:48.580234 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-file-lock-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.580572 master-0 kubenswrapper[18707]: I0320 09:01:48.580503 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-node-plugin-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.580661 master-0 kubenswrapper[18707]: I0320 09:01:48.580625 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-file-lock-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.580890 master-0 kubenswrapper[18707]: I0320 09:01:48.580845 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-registration-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.581029 master-0 kubenswrapper[18707]: I0320 09:01:48.581009 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-registration-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.581100 master-0 kubenswrapper[18707]: I0320 09:01:48.581074 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-pod-volumes-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.581210 master-0 kubenswrapper[18707]: I0320 09:01:48.581114 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-sys\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.581210 master-0 kubenswrapper[18707]: I0320 09:01:48.581147 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-device-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.581364 master-0 kubenswrapper[18707]: I0320 09:01:48.581253 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-lvmd-config\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf"
Mar 20 09:01:48.581364 master-0 kubenswrapper[18707]: I0320
09:01:48.581315 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-pod-volumes-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:48.581503 master-0 kubenswrapper[18707]: I0320 09:01:48.581332 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-device-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:48.581503 master-0 kubenswrapper[18707]: I0320 09:01:48.581442 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-run-udev\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:48.581503 master-0 kubenswrapper[18707]: I0320 09:01:48.581337 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-sys\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:48.581503 master-0 kubenswrapper[18707]: I0320 09:01:48.581418 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-run-udev\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:48.581785 master-0 kubenswrapper[18707]: I0320 09:01:48.581582 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77b2m\" (UniqueName: 
\"kubernetes.io/projected/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-kube-api-access-77b2m\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:48.581785 master-0 kubenswrapper[18707]: I0320 09:01:48.581639 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-lvmd-config\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:48.581785 master-0 kubenswrapper[18707]: I0320 09:01:48.581716 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-csi-plugin-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:48.582118 master-0 kubenswrapper[18707]: I0320 09:01:48.582070 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-csi-plugin-dir\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:48.583805 master-0 kubenswrapper[18707]: I0320 09:01:48.583764 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-metrics-cert\") pod \"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:48.613340 master-0 kubenswrapper[18707]: I0320 09:01:48.613256 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77b2m\" (UniqueName: \"kubernetes.io/projected/5bdfb1a1-4899-4e4d-a03f-7b17c88c0767-kube-api-access-77b2m\") pod 
\"vg-manager-rpqkf\" (UID: \"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767\") " pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:48.756582 master-0 kubenswrapper[18707]: I0320 09:01:48.756430 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:49.337039 master-0 kubenswrapper[18707]: W0320 09:01:49.336958 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bdfb1a1_4899_4e4d_a03f_7b17c88c0767.slice/crio-08e9d0875c90eab7928e143b43c6c2c982087ae34fe5d5056c015dc0689ad398 WatchSource:0}: Error finding container 08e9d0875c90eab7928e143b43c6c2c982087ae34fe5d5056c015dc0689ad398: Status 404 returned error can't find the container with id 08e9d0875c90eab7928e143b43c6c2c982087ae34fe5d5056c015dc0689ad398 Mar 20 09:01:49.337503 master-0 kubenswrapper[18707]: I0320 09:01:49.337170 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-rpqkf"] Mar 20 09:01:49.890877 master-0 kubenswrapper[18707]: I0320 09:01:49.890827 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-rpqkf" event={"ID":"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767","Type":"ContainerStarted","Data":"0631eeb18dd2550ea05ac5febeaa965473e0c60131821a0c60cf9c081c11f119"} Mar 20 09:01:49.891526 master-0 kubenswrapper[18707]: I0320 09:01:49.891496 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-rpqkf" event={"ID":"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767","Type":"ContainerStarted","Data":"08e9d0875c90eab7928e143b43c6c2c982087ae34fe5d5056c015dc0689ad398"} Mar 20 09:01:49.920512 master-0 kubenswrapper[18707]: I0320 09:01:49.919828 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-rpqkf" podStartSLOduration=1.919799094 podStartE2EDuration="1.919799094s" podCreationTimestamp="2026-03-20 09:01:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:01:49.910122618 +0000 UTC m=+1255.066302974" watchObservedRunningTime="2026-03-20 09:01:49.919799094 +0000 UTC m=+1255.075979450" Mar 20 09:01:51.914434 master-0 kubenswrapper[18707]: I0320 09:01:51.914398 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-rpqkf_5bdfb1a1-4899-4e4d-a03f-7b17c88c0767/vg-manager/0.log" Mar 20 09:01:51.914954 master-0 kubenswrapper[18707]: I0320 09:01:51.914554 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-rpqkf" event={"ID":"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767","Type":"ContainerDied","Data":"0631eeb18dd2550ea05ac5febeaa965473e0c60131821a0c60cf9c081c11f119"} Mar 20 09:01:51.915156 master-0 kubenswrapper[18707]: I0320 09:01:51.915140 18707 scope.go:117] "RemoveContainer" containerID="0631eeb18dd2550ea05ac5febeaa965473e0c60131821a0c60cf9c081c11f119" Mar 20 09:01:51.917337 master-0 kubenswrapper[18707]: I0320 09:01:51.914462 18707 generic.go:334] "Generic (PLEG): container finished" podID="5bdfb1a1-4899-4e4d-a03f-7b17c88c0767" containerID="0631eeb18dd2550ea05ac5febeaa965473e0c60131821a0c60cf9c081c11f119" exitCode=1 Mar 20 09:01:52.252515 master-0 kubenswrapper[18707]: I0320 09:01:52.252332 18707 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Mar 20 09:01:52.685631 master-0 kubenswrapper[18707]: I0320 09:01:52.685494 18707 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2026-03-20T09:01:52.252430441Z","Handler":null,"Name":""} Mar 20 09:01:52.688281 master-0 kubenswrapper[18707]: I0320 09:01:52.688260 18707 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: 
/var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Mar 20 09:01:52.688421 master-0 kubenswrapper[18707]: I0320 09:01:52.688407 18707 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Mar 20 09:01:52.942243 master-0 kubenswrapper[18707]: I0320 09:01:52.942091 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-rpqkf_5bdfb1a1-4899-4e4d-a03f-7b17c88c0767/vg-manager/0.log" Mar 20 09:01:52.942243 master-0 kubenswrapper[18707]: I0320 09:01:52.942149 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-rpqkf" event={"ID":"5bdfb1a1-4899-4e4d-a03f-7b17c88c0767","Type":"ContainerStarted","Data":"073ec775ec4eef63846a94a6bf5b022ec67239e6020d99c0a7ee7a9420beafeb"} Mar 20 09:01:54.291346 master-0 kubenswrapper[18707]: I0320 09:01:54.291256 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-pp5nf" Mar 20 09:01:54.784213 master-0 kubenswrapper[18707]: I0320 09:01:54.781954 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5g65w"] Mar 20 09:01:54.787638 master-0 kubenswrapper[18707]: I0320 09:01:54.786430 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5g65w" Mar 20 09:01:54.788468 master-0 kubenswrapper[18707]: I0320 09:01:54.788424 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 09:01:54.788706 master-0 kubenswrapper[18707]: I0320 09:01:54.788683 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 09:01:54.793941 master-0 kubenswrapper[18707]: I0320 09:01:54.793865 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5g65w"] Mar 20 09:01:54.933357 master-0 kubenswrapper[18707]: I0320 09:01:54.931808 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p75qj\" (UniqueName: \"kubernetes.io/projected/9ec99d64-c074-4512-99c9-265c941a8b1c-kube-api-access-p75qj\") pod \"openstack-operator-index-5g65w\" (UID: \"9ec99d64-c074-4512-99c9-265c941a8b1c\") " pod="openstack-operators/openstack-operator-index-5g65w" Mar 20 09:01:55.048983 master-0 kubenswrapper[18707]: I0320 09:01:55.048910 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p75qj\" (UniqueName: \"kubernetes.io/projected/9ec99d64-c074-4512-99c9-265c941a8b1c-kube-api-access-p75qj\") pod \"openstack-operator-index-5g65w\" (UID: \"9ec99d64-c074-4512-99c9-265c941a8b1c\") " pod="openstack-operators/openstack-operator-index-5g65w" Mar 20 09:01:55.064847 master-0 kubenswrapper[18707]: I0320 09:01:55.064778 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 09:01:55.083210 master-0 kubenswrapper[18707]: I0320 09:01:55.078499 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 09:01:55.089821 master-0 kubenswrapper[18707]: I0320 09:01:55.088851 
18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p75qj\" (UniqueName: \"kubernetes.io/projected/9ec99d64-c074-4512-99c9-265c941a8b1c-kube-api-access-p75qj\") pod \"openstack-operator-index-5g65w\" (UID: \"9ec99d64-c074-4512-99c9-265c941a8b1c\") " pod="openstack-operators/openstack-operator-index-5g65w" Mar 20 09:01:55.143409 master-0 kubenswrapper[18707]: I0320 09:01:55.143357 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5g65w" Mar 20 09:01:55.621110 master-0 kubenswrapper[18707]: I0320 09:01:55.620940 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5g65w"] Mar 20 09:01:55.979452 master-0 kubenswrapper[18707]: I0320 09:01:55.979340 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5g65w" event={"ID":"9ec99d64-c074-4512-99c9-265c941a8b1c","Type":"ContainerStarted","Data":"7e424b2c31c1a67eaee629cedb15f72dac948cc9affd019f0ef956d6d9430add"} Mar 20 09:01:56.990741 master-0 kubenswrapper[18707]: I0320 09:01:56.990514 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5g65w" event={"ID":"9ec99d64-c074-4512-99c9-265c941a8b1c","Type":"ContainerStarted","Data":"9ccf9054080ef6c160d56bc919856d784b3115c9c25ec1e4b6bba562f9a335eb"} Mar 20 09:01:57.027526 master-0 kubenswrapper[18707]: I0320 09:01:57.027436 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5g65w" podStartSLOduration=2.055089065 podStartE2EDuration="3.027414571s" podCreationTimestamp="2026-03-20 09:01:54 +0000 UTC" firstStartedPulling="2026-03-20 09:01:55.631356697 +0000 UTC m=+1260.787537053" lastFinishedPulling="2026-03-20 09:01:56.603682163 +0000 UTC m=+1261.759862559" observedRunningTime="2026-03-20 09:01:57.009925341 +0000 UTC m=+1262.166105727" 
watchObservedRunningTime="2026-03-20 09:01:57.027414571 +0000 UTC m=+1262.183594927" Mar 20 09:01:58.757344 master-0 kubenswrapper[18707]: I0320 09:01:58.757244 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:58.761141 master-0 kubenswrapper[18707]: I0320 09:01:58.761044 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:59.011637 master-0 kubenswrapper[18707]: I0320 09:01:59.011288 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:01:59.012880 master-0 kubenswrapper[18707]: I0320 09:01:59.012833 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-rpqkf" Mar 20 09:02:01.859784 master-0 kubenswrapper[18707]: I0320 09:02:01.859648 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-d4699dbdf-6z95l" podUID="2f81c6b4-2412-41e5-9a3f-c98fed48445a" containerName="console" containerID="cri-o://4649d5def556d152588b98a81a0614a6d47323d6036446b93a2ef94623528d29" gracePeriod=15 Mar 20 09:02:02.064626 master-0 kubenswrapper[18707]: I0320 09:02:02.064541 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d4699dbdf-6z95l_2f81c6b4-2412-41e5-9a3f-c98fed48445a/console/0.log" Mar 20 09:02:02.064932 master-0 kubenswrapper[18707]: I0320 09:02:02.064640 18707 generic.go:334] "Generic (PLEG): container finished" podID="2f81c6b4-2412-41e5-9a3f-c98fed48445a" containerID="4649d5def556d152588b98a81a0614a6d47323d6036446b93a2ef94623528d29" exitCode=2 Mar 20 09:02:02.064932 master-0 kubenswrapper[18707]: I0320 09:02:02.064692 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4699dbdf-6z95l" 
event={"ID":"2f81c6b4-2412-41e5-9a3f-c98fed48445a","Type":"ContainerDied","Data":"4649d5def556d152588b98a81a0614a6d47323d6036446b93a2ef94623528d29"} Mar 20 09:02:02.363684 master-0 kubenswrapper[18707]: I0320 09:02:02.362976 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d4699dbdf-6z95l_2f81c6b4-2412-41e5-9a3f-c98fed48445a/console/0.log" Mar 20 09:02:02.363684 master-0 kubenswrapper[18707]: I0320 09:02:02.363060 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d4699dbdf-6z95l" Mar 20 09:02:02.451450 master-0 kubenswrapper[18707]: I0320 09:02:02.451390 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-config\") pod \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " Mar 20 09:02:02.451724 master-0 kubenswrapper[18707]: I0320 09:02:02.451477 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-oauth-serving-cert\") pod \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " Mar 20 09:02:02.451724 master-0 kubenswrapper[18707]: I0320 09:02:02.451515 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z968m\" (UniqueName: \"kubernetes.io/projected/2f81c6b4-2412-41e5-9a3f-c98fed48445a-kube-api-access-z968m\") pod \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " Mar 20 09:02:02.451724 master-0 kubenswrapper[18707]: I0320 09:02:02.451544 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-oauth-config\") 
pod \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " Mar 20 09:02:02.451724 master-0 kubenswrapper[18707]: I0320 09:02:02.451578 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-serving-cert\") pod \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " Mar 20 09:02:02.451724 master-0 kubenswrapper[18707]: I0320 09:02:02.451597 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-trusted-ca-bundle\") pod \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " Mar 20 09:02:02.451724 master-0 kubenswrapper[18707]: I0320 09:02:02.451615 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-service-ca\") pod \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\" (UID: \"2f81c6b4-2412-41e5-9a3f-c98fed48445a\") " Mar 20 09:02:02.452809 master-0 kubenswrapper[18707]: I0320 09:02:02.452737 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-config" (OuterVolumeSpecName: "console-config") pod "2f81c6b4-2412-41e5-9a3f-c98fed48445a" (UID: "2f81c6b4-2412-41e5-9a3f-c98fed48445a"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:02:02.452948 master-0 kubenswrapper[18707]: I0320 09:02:02.452916 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2f81c6b4-2412-41e5-9a3f-c98fed48445a" (UID: "2f81c6b4-2412-41e5-9a3f-c98fed48445a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:02:02.453310 master-0 kubenswrapper[18707]: I0320 09:02:02.453285 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-service-ca" (OuterVolumeSpecName: "service-ca") pod "2f81c6b4-2412-41e5-9a3f-c98fed48445a" (UID: "2f81c6b4-2412-41e5-9a3f-c98fed48445a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:02:02.453943 master-0 kubenswrapper[18707]: I0320 09:02:02.453887 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2f81c6b4-2412-41e5-9a3f-c98fed48445a" (UID: "2f81c6b4-2412-41e5-9a3f-c98fed48445a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:02:02.456323 master-0 kubenswrapper[18707]: I0320 09:02:02.455444 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2f81c6b4-2412-41e5-9a3f-c98fed48445a" (UID: "2f81c6b4-2412-41e5-9a3f-c98fed48445a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:02:02.467030 master-0 kubenswrapper[18707]: I0320 09:02:02.466981 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f81c6b4-2412-41e5-9a3f-c98fed48445a-kube-api-access-z968m" (OuterVolumeSpecName: "kube-api-access-z968m") pod "2f81c6b4-2412-41e5-9a3f-c98fed48445a" (UID: "2f81c6b4-2412-41e5-9a3f-c98fed48445a"). InnerVolumeSpecName "kube-api-access-z968m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:02:02.467519 master-0 kubenswrapper[18707]: I0320 09:02:02.467465 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2f81c6b4-2412-41e5-9a3f-c98fed48445a" (UID: "2f81c6b4-2412-41e5-9a3f-c98fed48445a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:02:02.554150 master-0 kubenswrapper[18707]: I0320 09:02:02.554062 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z968m\" (UniqueName: \"kubernetes.io/projected/2f81c6b4-2412-41e5-9a3f-c98fed48445a-kube-api-access-z968m\") on node \"master-0\" DevicePath \"\"" Mar 20 09:02:02.554150 master-0 kubenswrapper[18707]: I0320 09:02:02.554136 18707 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:02:02.554150 master-0 kubenswrapper[18707]: I0320 09:02:02.554157 18707 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 09:02:02.554150 master-0 kubenswrapper[18707]: I0320 09:02:02.554175 18707 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:02:02.554718 master-0 kubenswrapper[18707]: I0320 09:02:02.554216 18707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 20 09:02:02.554718 master-0 kubenswrapper[18707]: I0320 09:02:02.554234 18707 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-console-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:02:02.554718 master-0 kubenswrapper[18707]: I0320 09:02:02.554250 18707 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f81c6b4-2412-41e5-9a3f-c98fed48445a-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 20 09:02:03.074943 master-0 kubenswrapper[18707]: I0320 09:02:03.074909 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d4699dbdf-6z95l_2f81c6b4-2412-41e5-9a3f-c98fed48445a/console/0.log" Mar 20 09:02:03.075612 master-0 kubenswrapper[18707]: I0320 09:02:03.075586 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d4699dbdf-6z95l" event={"ID":"2f81c6b4-2412-41e5-9a3f-c98fed48445a","Type":"ContainerDied","Data":"ae06d4739d879cf9dc6762237ea1a205a9db4460b820a0df3cdd21f6fc1b8d63"} Mar 20 09:02:03.075724 master-0 kubenswrapper[18707]: I0320 09:02:03.075678 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d4699dbdf-6z95l"
Mar 20 09:02:03.075847 master-0 kubenswrapper[18707]: I0320 09:02:03.075701 18707 scope.go:117] "RemoveContainer" containerID="4649d5def556d152588b98a81a0614a6d47323d6036446b93a2ef94623528d29"
Mar 20 09:02:03.120571 master-0 kubenswrapper[18707]: I0320 09:02:03.120511 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d4699dbdf-6z95l"]
Mar 20 09:02:03.126924 master-0 kubenswrapper[18707]: I0320 09:02:03.126850 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-d4699dbdf-6z95l"]
Mar 20 09:02:05.107589 master-0 kubenswrapper[18707]: I0320 09:02:05.107517 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f81c6b4-2412-41e5-9a3f-c98fed48445a" path="/var/lib/kubelet/pods/2f81c6b4-2412-41e5-9a3f-c98fed48445a/volumes"
Mar 20 09:02:05.144410 master-0 kubenswrapper[18707]: I0320 09:02:05.144336 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-5g65w"
Mar 20 09:02:05.144410 master-0 kubenswrapper[18707]: I0320 09:02:05.144398 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-5g65w"
Mar 20 09:02:05.175047 master-0 kubenswrapper[18707]: I0320 09:02:05.174990 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-5g65w"
Mar 20 09:02:06.168216 master-0 kubenswrapper[18707]: I0320 09:02:06.168133 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-5g65w"
Mar 20 09:02:07.385409 master-0 kubenswrapper[18707]: I0320 09:02:07.385337 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"]
Mar 20 09:02:07.386880 master-0 kubenswrapper[18707]: E0320 09:02:07.386846 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f81c6b4-2412-41e5-9a3f-c98fed48445a" containerName="console"
Mar 20 09:02:07.387012 master-0 kubenswrapper[18707]: I0320 09:02:07.386995 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f81c6b4-2412-41e5-9a3f-c98fed48445a" containerName="console"
Mar 20 09:02:07.387430 master-0 kubenswrapper[18707]: I0320 09:02:07.387410 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f81c6b4-2412-41e5-9a3f-c98fed48445a" containerName="console"
Mar 20 09:02:07.389348 master-0 kubenswrapper[18707]: I0320 09:02:07.389319 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"
Mar 20 09:02:07.397254 master-0 kubenswrapper[18707]: I0320 09:02:07.397039 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"]
Mar 20 09:02:07.565954 master-0 kubenswrapper[18707]: I0320 09:02:07.565867 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244ea8f3-b716-4c47-987d-1023bb049363-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj\" (UID: \"244ea8f3-b716-4c47-987d-1023bb049363\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"
Mar 20 09:02:07.566270 master-0 kubenswrapper[18707]: I0320 09:02:07.566035 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244ea8f3-b716-4c47-987d-1023bb049363-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj\" (UID: \"244ea8f3-b716-4c47-987d-1023bb049363\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"
Mar 20 09:02:07.566270 master-0 kubenswrapper[18707]: I0320 09:02:07.566106 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z2zz\" (UniqueName: \"kubernetes.io/projected/244ea8f3-b716-4c47-987d-1023bb049363-kube-api-access-8z2zz\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj\" (UID: \"244ea8f3-b716-4c47-987d-1023bb049363\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"
Mar 20 09:02:07.668044 master-0 kubenswrapper[18707]: I0320 09:02:07.667924 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244ea8f3-b716-4c47-987d-1023bb049363-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj\" (UID: \"244ea8f3-b716-4c47-987d-1023bb049363\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"
Mar 20 09:02:07.668436 master-0 kubenswrapper[18707]: I0320 09:02:07.668402 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244ea8f3-b716-4c47-987d-1023bb049363-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj\" (UID: \"244ea8f3-b716-4c47-987d-1023bb049363\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"
Mar 20 09:02:07.668551 master-0 kubenswrapper[18707]: I0320 09:02:07.668536 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z2zz\" (UniqueName: \"kubernetes.io/projected/244ea8f3-b716-4c47-987d-1023bb049363-kube-api-access-8z2zz\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj\" (UID: \"244ea8f3-b716-4c47-987d-1023bb049363\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"
Mar 20 09:02:07.668680 master-0 kubenswrapper[18707]: I0320 09:02:07.668645 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244ea8f3-b716-4c47-987d-1023bb049363-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj\" (UID: \"244ea8f3-b716-4c47-987d-1023bb049363\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"
Mar 20 09:02:07.669286 master-0 kubenswrapper[18707]: I0320 09:02:07.669219 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244ea8f3-b716-4c47-987d-1023bb049363-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj\" (UID: \"244ea8f3-b716-4c47-987d-1023bb049363\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"
Mar 20 09:02:07.690441 master-0 kubenswrapper[18707]: I0320 09:02:07.690352 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z2zz\" (UniqueName: \"kubernetes.io/projected/244ea8f3-b716-4c47-987d-1023bb049363-kube-api-access-8z2zz\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj\" (UID: \"244ea8f3-b716-4c47-987d-1023bb049363\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"
Mar 20 09:02:07.724174 master-0 kubenswrapper[18707]: I0320 09:02:07.724095 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"
Mar 20 09:02:08.206996 master-0 kubenswrapper[18707]: I0320 09:02:08.206364 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"]
Mar 20 09:02:08.216663 master-0 kubenswrapper[18707]: W0320 09:02:08.212999 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod244ea8f3_b716_4c47_987d_1023bb049363.slice/crio-69af3fc4c7ff999ded362efc9afdde3f46ada19e0dfaf76f31b179582cf99b4d WatchSource:0}: Error finding container 69af3fc4c7ff999ded362efc9afdde3f46ada19e0dfaf76f31b179582cf99b4d: Status 404 returned error can't find the container with id 69af3fc4c7ff999ded362efc9afdde3f46ada19e0dfaf76f31b179582cf99b4d
Mar 20 09:02:09.164651 master-0 kubenswrapper[18707]: I0320 09:02:09.164344 18707 generic.go:334] "Generic (PLEG): container finished" podID="244ea8f3-b716-4c47-987d-1023bb049363" containerID="094f667321cde1d8dc069f03f7b9bd2437ddcb3b21cca55fdb1ae4ea319db1e3" exitCode=0
Mar 20 09:02:09.165843 master-0 kubenswrapper[18707]: I0320 09:02:09.164696 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj" event={"ID":"244ea8f3-b716-4c47-987d-1023bb049363","Type":"ContainerDied","Data":"094f667321cde1d8dc069f03f7b9bd2437ddcb3b21cca55fdb1ae4ea319db1e3"}
Mar 20 09:02:09.165843 master-0 kubenswrapper[18707]: I0320 09:02:09.164770 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj" event={"ID":"244ea8f3-b716-4c47-987d-1023bb049363","Type":"ContainerStarted","Data":"69af3fc4c7ff999ded362efc9afdde3f46ada19e0dfaf76f31b179582cf99b4d"}
Mar 20 09:02:11.188916 master-0 kubenswrapper[18707]: I0320 09:02:11.188805 18707 generic.go:334] "Generic (PLEG): container finished" podID="244ea8f3-b716-4c47-987d-1023bb049363" containerID="9943f6b81c6647f762f3c919d74a7236514438dcb05557e0065587de2de5e35e" exitCode=0
Mar 20 09:02:11.188916 master-0 kubenswrapper[18707]: I0320 09:02:11.188898 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj" event={"ID":"244ea8f3-b716-4c47-987d-1023bb049363","Type":"ContainerDied","Data":"9943f6b81c6647f762f3c919d74a7236514438dcb05557e0065587de2de5e35e"}
Mar 20 09:02:12.203124 master-0 kubenswrapper[18707]: I0320 09:02:12.203064 18707 generic.go:334] "Generic (PLEG): container finished" podID="244ea8f3-b716-4c47-987d-1023bb049363" containerID="bf58aee730e66f3f121bdd31f237d3ed3855a8078951db50824463eab950d237" exitCode=0
Mar 20 09:02:12.203797 master-0 kubenswrapper[18707]: I0320 09:02:12.203131 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj" event={"ID":"244ea8f3-b716-4c47-987d-1023bb049363","Type":"ContainerDied","Data":"bf58aee730e66f3f121bdd31f237d3ed3855a8078951db50824463eab950d237"}
Mar 20 09:02:13.630600 master-0 kubenswrapper[18707]: I0320 09:02:13.630510 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"
Mar 20 09:02:13.801933 master-0 kubenswrapper[18707]: I0320 09:02:13.801820 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244ea8f3-b716-4c47-987d-1023bb049363-bundle\") pod \"244ea8f3-b716-4c47-987d-1023bb049363\" (UID: \"244ea8f3-b716-4c47-987d-1023bb049363\") "
Mar 20 09:02:13.802290 master-0 kubenswrapper[18707]: I0320 09:02:13.802109 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244ea8f3-b716-4c47-987d-1023bb049363-util\") pod \"244ea8f3-b716-4c47-987d-1023bb049363\" (UID: \"244ea8f3-b716-4c47-987d-1023bb049363\") "
Mar 20 09:02:13.802290 master-0 kubenswrapper[18707]: I0320 09:02:13.802271 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z2zz\" (UniqueName: \"kubernetes.io/projected/244ea8f3-b716-4c47-987d-1023bb049363-kube-api-access-8z2zz\") pod \"244ea8f3-b716-4c47-987d-1023bb049363\" (UID: \"244ea8f3-b716-4c47-987d-1023bb049363\") "
Mar 20 09:02:13.803539 master-0 kubenswrapper[18707]: I0320 09:02:13.803482 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244ea8f3-b716-4c47-987d-1023bb049363-bundle" (OuterVolumeSpecName: "bundle") pod "244ea8f3-b716-4c47-987d-1023bb049363" (UID: "244ea8f3-b716-4c47-987d-1023bb049363"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:02:13.807385 master-0 kubenswrapper[18707]: I0320 09:02:13.807306 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244ea8f3-b716-4c47-987d-1023bb049363-kube-api-access-8z2zz" (OuterVolumeSpecName: "kube-api-access-8z2zz") pod "244ea8f3-b716-4c47-987d-1023bb049363" (UID: "244ea8f3-b716-4c47-987d-1023bb049363"). InnerVolumeSpecName "kube-api-access-8z2zz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:02:13.845880 master-0 kubenswrapper[18707]: I0320 09:02:13.845643 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/244ea8f3-b716-4c47-987d-1023bb049363-util" (OuterVolumeSpecName: "util") pod "244ea8f3-b716-4c47-987d-1023bb049363" (UID: "244ea8f3-b716-4c47-987d-1023bb049363"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:02:13.904744 master-0 kubenswrapper[18707]: I0320 09:02:13.904668 18707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/244ea8f3-b716-4c47-987d-1023bb049363-util\") on node \"master-0\" DevicePath \"\""
Mar 20 09:02:13.904744 master-0 kubenswrapper[18707]: I0320 09:02:13.904730 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z2zz\" (UniqueName: \"kubernetes.io/projected/244ea8f3-b716-4c47-987d-1023bb049363-kube-api-access-8z2zz\") on node \"master-0\" DevicePath \"\""
Mar 20 09:02:13.905499 master-0 kubenswrapper[18707]: I0320 09:02:13.904789 18707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/244ea8f3-b716-4c47-987d-1023bb049363-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:02:14.242737 master-0 kubenswrapper[18707]: I0320 09:02:14.242553 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj" event={"ID":"244ea8f3-b716-4c47-987d-1023bb049363","Type":"ContainerDied","Data":"69af3fc4c7ff999ded362efc9afdde3f46ada19e0dfaf76f31b179582cf99b4d"}
Mar 20 09:02:14.242737 master-0 kubenswrapper[18707]: I0320 09:02:14.242626 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69af3fc4c7ff999ded362efc9afdde3f46ada19e0dfaf76f31b179582cf99b4d"
Mar 20 09:02:14.243109 master-0 kubenswrapper[18707]: I0320 09:02:14.242737 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514ckhsqj"
Mar 20 09:02:20.009336 master-0 kubenswrapper[18707]: I0320 09:02:20.009268 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-9zm7k"]
Mar 20 09:02:20.010031 master-0 kubenswrapper[18707]: E0320 09:02:20.009654 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244ea8f3-b716-4c47-987d-1023bb049363" containerName="pull"
Mar 20 09:02:20.010031 master-0 kubenswrapper[18707]: I0320 09:02:20.009667 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="244ea8f3-b716-4c47-987d-1023bb049363" containerName="pull"
Mar 20 09:02:20.010031 master-0 kubenswrapper[18707]: E0320 09:02:20.009715 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244ea8f3-b716-4c47-987d-1023bb049363" containerName="extract"
Mar 20 09:02:20.010031 master-0 kubenswrapper[18707]: I0320 09:02:20.009721 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="244ea8f3-b716-4c47-987d-1023bb049363" containerName="extract"
Mar 20 09:02:20.010031 master-0 kubenswrapper[18707]: E0320 09:02:20.009737 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244ea8f3-b716-4c47-987d-1023bb049363" containerName="util"
Mar 20 09:02:20.010031 master-0 kubenswrapper[18707]: I0320 09:02:20.009744 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="244ea8f3-b716-4c47-987d-1023bb049363" containerName="util"
Mar 20 09:02:20.010031 master-0 kubenswrapper[18707]: I0320 09:02:20.009892 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="244ea8f3-b716-4c47-987d-1023bb049363" containerName="extract"
Mar 20 09:02:20.010609 master-0 kubenswrapper[18707]: I0320 09:02:20.010582 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9zm7k"
Mar 20 09:02:20.045805 master-0 kubenswrapper[18707]: I0320 09:02:20.045746 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-9zm7k"]
Mar 20 09:02:20.134127 master-0 kubenswrapper[18707]: I0320 09:02:20.134071 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjzjs\" (UniqueName: \"kubernetes.io/projected/5f732ba9-6fb6-4699-8123-c3f71ef9cebc-kube-api-access-bjzjs\") pod \"openstack-operator-controller-init-b85c4d696-9zm7k\" (UID: \"5f732ba9-6fb6-4699-8123-c3f71ef9cebc\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9zm7k"
Mar 20 09:02:20.236645 master-0 kubenswrapper[18707]: I0320 09:02:20.236567 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjzjs\" (UniqueName: \"kubernetes.io/projected/5f732ba9-6fb6-4699-8123-c3f71ef9cebc-kube-api-access-bjzjs\") pod \"openstack-operator-controller-init-b85c4d696-9zm7k\" (UID: \"5f732ba9-6fb6-4699-8123-c3f71ef9cebc\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9zm7k"
Mar 20 09:02:20.251882 master-0 kubenswrapper[18707]: I0320 09:02:20.251832 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjzjs\" (UniqueName: \"kubernetes.io/projected/5f732ba9-6fb6-4699-8123-c3f71ef9cebc-kube-api-access-bjzjs\") pod \"openstack-operator-controller-init-b85c4d696-9zm7k\" (UID: \"5f732ba9-6fb6-4699-8123-c3f71ef9cebc\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9zm7k"
Mar 20 09:02:20.327761 master-0 kubenswrapper[18707]: I0320 09:02:20.327695 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9zm7k"
Mar 20 09:02:20.812970 master-0 kubenswrapper[18707]: I0320 09:02:20.812905 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-9zm7k"]
Mar 20 09:02:21.349529 master-0 kubenswrapper[18707]: I0320 09:02:21.349466 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9zm7k" event={"ID":"5f732ba9-6fb6-4699-8123-c3f71ef9cebc","Type":"ContainerStarted","Data":"ef3ea95f528bbf4a38cfeb2be81ff49d1d0699bdf122c95a9e8de8cbe4554fc4"}
Mar 20 09:02:25.394124 master-0 kubenswrapper[18707]: I0320 09:02:25.392982 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9zm7k" event={"ID":"5f732ba9-6fb6-4699-8123-c3f71ef9cebc","Type":"ContainerStarted","Data":"d94d1a5e89043fe986e9e2d0ba32d24ca796ff94b6cf49b913acc1223e66fa04"}
Mar 20 09:02:25.394124 master-0 kubenswrapper[18707]: I0320 09:02:25.393825 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9zm7k"
Mar 20 09:02:25.434853 master-0 kubenswrapper[18707]: I0320 09:02:25.434764 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9zm7k" podStartSLOduration=2.108954704 podStartE2EDuration="6.434746408s" podCreationTimestamp="2026-03-20 09:02:19 +0000 UTC" firstStartedPulling="2026-03-20 09:02:20.807219101 +0000 UTC m=+1285.963399457" lastFinishedPulling="2026-03-20 09:02:25.133010805 +0000 UTC m=+1290.289191161" observedRunningTime="2026-03-20 09:02:25.426330287 +0000 UTC m=+1290.582510663" watchObservedRunningTime="2026-03-20 09:02:25.434746408 +0000 UTC m=+1290.590926764"
Mar 20 09:02:30.332273 master-0 kubenswrapper[18707]: I0320 09:02:30.332208 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-9zm7k"
Mar 20 09:02:50.563890 master-0 kubenswrapper[18707]: I0320 09:02:50.562849 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-vds6m"]
Mar 20 09:02:50.564564 master-0 kubenswrapper[18707]: I0320 09:02:50.564119 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-vds6m"
Mar 20 09:02:50.582208 master-0 kubenswrapper[18707]: I0320 09:02:50.575082 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-vds6m"]
Mar 20 09:02:50.611210 master-0 kubenswrapper[18707]: I0320 09:02:50.603385 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-sfshp"]
Mar 20 09:02:50.611210 master-0 kubenswrapper[18707]: I0320 09:02:50.604485 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sfshp"
Mar 20 09:02:50.619913 master-0 kubenswrapper[18707]: I0320 09:02:50.619862 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-sfshp"]
Mar 20 09:02:50.647867 master-0 kubenswrapper[18707]: I0320 09:02:50.647824 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-bghwc"]
Mar 20 09:02:50.649464 master-0 kubenswrapper[18707]: I0320 09:02:50.649328 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bghwc"
Mar 20 09:02:50.661221 master-0 kubenswrapper[18707]: I0320 09:02:50.653216 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-nt245"]
Mar 20 09:02:50.661221 master-0 kubenswrapper[18707]: I0320 09:02:50.654568 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nt245"
Mar 20 09:02:50.681332 master-0 kubenswrapper[18707]: I0320 09:02:50.666410 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-bghwc"]
Mar 20 09:02:50.690207 master-0 kubenswrapper[18707]: I0320 09:02:50.684617 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9"]
Mar 20 09:02:50.690207 master-0 kubenswrapper[18707]: I0320 09:02:50.685874 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9"
Mar 20 09:02:50.733565 master-0 kubenswrapper[18707]: I0320 09:02:50.733505 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqp2j\" (UniqueName: \"kubernetes.io/projected/0709242a-39f4-46b7-a161-c28c18c272e9-kube-api-access-gqp2j\") pod \"cinder-operator-controller-manager-8d58dc466-sfshp\" (UID: \"0709242a-39f4-46b7-a161-c28c18c272e9\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sfshp"
Mar 20 09:02:50.733801 master-0 kubenswrapper[18707]: I0320 09:02:50.733702 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc5gq\" (UniqueName: \"kubernetes.io/projected/2036498a-bbc9-49bb-9ae7-d3f9fbf5165f-kube-api-access-zc5gq\") pod \"barbican-operator-controller-manager-59bc569d95-vds6m\" (UID: \"2036498a-bbc9-49bb-9ae7-d3f9fbf5165f\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-vds6m"
Mar 20 09:02:50.733801 master-0 kubenswrapper[18707]: I0320 09:02:50.733737 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsbb9\" (UniqueName: \"kubernetes.io/projected/426a6be9-aeba-4865-bea3-0eacca0f445f-kube-api-access-fsbb9\") pod \"designate-operator-controller-manager-588d4d986b-bghwc\" (UID: \"426a6be9-aeba-4865-bea3-0eacca0f445f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bghwc"
Mar 20 09:02:50.733894 master-0 kubenswrapper[18707]: I0320 09:02:50.733875 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-nt245"]
Mar 20 09:02:50.760968 master-0 kubenswrapper[18707]: I0320 09:02:50.760912 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-jbfpt"]
Mar 20 09:02:50.790936 master-0 kubenswrapper[18707]: I0320 09:02:50.788667 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jbfpt"
Mar 20 09:02:50.848042 master-0 kubenswrapper[18707]: I0320 09:02:50.844863 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9"]
Mar 20 09:02:50.848042 master-0 kubenswrapper[18707]: I0320 09:02:50.846768 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc5gq\" (UniqueName: \"kubernetes.io/projected/2036498a-bbc9-49bb-9ae7-d3f9fbf5165f-kube-api-access-zc5gq\") pod \"barbican-operator-controller-manager-59bc569d95-vds6m\" (UID: \"2036498a-bbc9-49bb-9ae7-d3f9fbf5165f\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-vds6m"
Mar 20 09:02:50.848042 master-0 kubenswrapper[18707]: I0320 09:02:50.846860 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsbb9\" (UniqueName: \"kubernetes.io/projected/426a6be9-aeba-4865-bea3-0eacca0f445f-kube-api-access-fsbb9\") pod \"designate-operator-controller-manager-588d4d986b-bghwc\" (UID: \"426a6be9-aeba-4865-bea3-0eacca0f445f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bghwc"
Mar 20 09:02:50.848042 master-0 kubenswrapper[18707]: I0320 09:02:50.846921 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m42pz\" (UniqueName: \"kubernetes.io/projected/fbd2d905-b71b-408c-b669-1a0a43ce9b2f-kube-api-access-m42pz\") pod \"heat-operator-controller-manager-67dd5f86f5-5h2n9\" (UID: \"fbd2d905-b71b-408c-b669-1a0a43ce9b2f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9"
Mar 20 09:02:50.848042 master-0 kubenswrapper[18707]: I0320 09:02:50.847002 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqp2j\" (UniqueName: \"kubernetes.io/projected/0709242a-39f4-46b7-a161-c28c18c272e9-kube-api-access-gqp2j\") pod \"cinder-operator-controller-manager-8d58dc466-sfshp\" (UID: \"0709242a-39f4-46b7-a161-c28c18c272e9\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sfshp"
Mar 20 09:02:50.848042 master-0 kubenswrapper[18707]: I0320 09:02:50.847026 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4m9r\" (UniqueName: \"kubernetes.io/projected/d3637b14-6e2e-4cdf-b59f-7f13bcdff49d-kube-api-access-l4m9r\") pod \"glance-operator-controller-manager-79df6bcc97-nt245\" (UID: \"d3637b14-6e2e-4cdf-b59f-7f13bcdff49d\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nt245"
Mar 20 09:02:50.904473 master-0 kubenswrapper[18707]: I0320 09:02:50.901628 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqp2j\" (UniqueName: \"kubernetes.io/projected/0709242a-39f4-46b7-a161-c28c18c272e9-kube-api-access-gqp2j\") pod \"cinder-operator-controller-manager-8d58dc466-sfshp\" (UID: \"0709242a-39f4-46b7-a161-c28c18c272e9\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sfshp"
Mar 20 09:02:50.929769 master-0 kubenswrapper[18707]: I0320 09:02:50.917356 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc5gq\" (UniqueName: \"kubernetes.io/projected/2036498a-bbc9-49bb-9ae7-d3f9fbf5165f-kube-api-access-zc5gq\") pod \"barbican-operator-controller-manager-59bc569d95-vds6m\" (UID: \"2036498a-bbc9-49bb-9ae7-d3f9fbf5165f\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-vds6m"
Mar 20 09:02:50.929769 master-0 kubenswrapper[18707]: I0320 09:02:50.922298 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsbb9\" (UniqueName: \"kubernetes.io/projected/426a6be9-aeba-4865-bea3-0eacca0f445f-kube-api-access-fsbb9\") pod \"designate-operator-controller-manager-588d4d986b-bghwc\" (UID: \"426a6be9-aeba-4865-bea3-0eacca0f445f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bghwc"
Mar 20 09:02:50.945355 master-0 kubenswrapper[18707]: I0320 09:02:50.945289 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-jbfpt"]
Mar 20 09:02:50.949748 master-0 kubenswrapper[18707]: I0320 09:02:50.949698 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4m9r\" (UniqueName: \"kubernetes.io/projected/d3637b14-6e2e-4cdf-b59f-7f13bcdff49d-kube-api-access-l4m9r\") pod \"glance-operator-controller-manager-79df6bcc97-nt245\" (UID: \"d3637b14-6e2e-4cdf-b59f-7f13bcdff49d\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nt245"
Mar 20 09:02:50.949923 master-0 kubenswrapper[18707]: I0320 09:02:50.949841 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m42pz\" (UniqueName: \"kubernetes.io/projected/fbd2d905-b71b-408c-b669-1a0a43ce9b2f-kube-api-access-m42pz\") pod \"heat-operator-controller-manager-67dd5f86f5-5h2n9\" (UID: \"fbd2d905-b71b-408c-b669-1a0a43ce9b2f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9"
Mar 20 09:02:50.949923 master-0 kubenswrapper[18707]: I0320 09:02:50.949875 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfzkf\" (UniqueName: \"kubernetes.io/projected/01d1e35b-ff25-4eb4-b550-30debdab6fa0-kube-api-access-dfzkf\") pod \"horizon-operator-controller-manager-8464cc45fb-jbfpt\" (UID: \"01d1e35b-ff25-4eb4-b550-30debdab6fa0\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jbfpt"
Mar 20 09:02:50.958457 master-0 kubenswrapper[18707]: I0320 09:02:50.958400 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f"]
Mar 20 09:02:50.960275 master-0 kubenswrapper[18707]: I0320 09:02:50.960238 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f"
Mar 20 09:02:50.965484 master-0 kubenswrapper[18707]: I0320 09:02:50.965443 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 20 09:02:50.993339 master-0 kubenswrapper[18707]: I0320 09:02:50.992540 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-vds6m"
Mar 20 09:02:51.007220 master-0 kubenswrapper[18707]: I0320 09:02:51.007076 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4m9r\" (UniqueName: \"kubernetes.io/projected/d3637b14-6e2e-4cdf-b59f-7f13bcdff49d-kube-api-access-l4m9r\") pod \"glance-operator-controller-manager-79df6bcc97-nt245\" (UID: \"d3637b14-6e2e-4cdf-b59f-7f13bcdff49d\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nt245"
Mar 20 09:02:51.007220 master-0 kubenswrapper[18707]: I0320 09:02:51.007159 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f"]
Mar 20 09:02:51.009161 master-0 kubenswrapper[18707]: I0320 09:02:51.009086 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m42pz\" (UniqueName: \"kubernetes.io/projected/fbd2d905-b71b-408c-b669-1a0a43ce9b2f-kube-api-access-m42pz\") pod \"heat-operator-controller-manager-67dd5f86f5-5h2n9\" (UID: \"fbd2d905-b71b-408c-b669-1a0a43ce9b2f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9"
Mar 20 09:02:51.042203 master-0 kubenswrapper[18707]: I0320 09:02:51.037350 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2"]
Mar 20 09:02:51.042203 master-0 kubenswrapper[18707]: I0320 09:02:51.038512 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2"
Mar 20 09:02:51.050985 master-0 kubenswrapper[18707]: I0320 09:02:51.050927 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sfshp"
Mar 20 09:02:51.057058 master-0 kubenswrapper[18707]: I0320 09:02:51.057008 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfzkf\" (UniqueName: \"kubernetes.io/projected/01d1e35b-ff25-4eb4-b550-30debdab6fa0-kube-api-access-dfzkf\") pod \"horizon-operator-controller-manager-8464cc45fb-jbfpt\" (UID: \"01d1e35b-ff25-4eb4-b550-30debdab6fa0\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jbfpt"
Mar 20 09:02:51.057285 master-0 kubenswrapper[18707]: I0320 09:02:51.057218 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8rnx\" (UniqueName: \"kubernetes.io/projected/f0c833ae-7f6c-4ead-b991-040550152e41-kube-api-access-t8rnx\") pod \"infra-operator-controller-manager-7dd6bb94c9-7rp5f\" (UID: \"f0c833ae-7f6c-4ead-b991-040550152e41\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f"
Mar 20 09:02:51.057327 master-0 kubenswrapper[18707]: I0320 09:02:51.057299 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-7rp5f\" (UID: \"f0c833ae-7f6c-4ead-b991-040550152e41\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f"
Mar 20 09:02:51.069521 master-0 kubenswrapper[18707]: I0320 09:02:51.069464 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2"]
Mar 20 09:02:51.099609 master-0 kubenswrapper[18707]: I0320 09:02:51.099077 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfzkf\" (UniqueName: \"kubernetes.io/projected/01d1e35b-ff25-4eb4-b550-30debdab6fa0-kube-api-access-dfzkf\") pod \"horizon-operator-controller-manager-8464cc45fb-jbfpt\" (UID: \"01d1e35b-ff25-4eb4-b550-30debdab6fa0\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jbfpt"
Mar 20 09:02:51.105238 master-0 kubenswrapper[18707]: I0320 09:02:51.105207 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nt245"
Mar 20 09:02:51.163207 master-0 kubenswrapper[18707]: I0320 09:02:51.159417 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bghwc"
Mar 20 09:02:51.178311 master-0 kubenswrapper[18707]: I0320 09:02:51.178269 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-2zdk8"]
Mar 20 09:02:51.179892 master-0 kubenswrapper[18707]: I0320 09:02:51.179869 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-2zdk8"
Mar 20 09:02:51.189834 master-0 kubenswrapper[18707]: I0320 09:02:51.188105 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8srwd\" (UniqueName: \"kubernetes.io/projected/b9627b82-4e5a-4ceb-a906-0657397e66e9-kube-api-access-8srwd\") pod \"ironic-operator-controller-manager-6f787dddc9-dq5f2\" (UID: \"b9627b82-4e5a-4ceb-a906-0657397e66e9\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2"
Mar 20 09:02:51.189834 master-0 kubenswrapper[18707]: I0320 09:02:51.188215 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8rnx\" (UniqueName: \"kubernetes.io/projected/f0c833ae-7f6c-4ead-b991-040550152e41-kube-api-access-t8rnx\") pod \"infra-operator-controller-manager-7dd6bb94c9-7rp5f\" (UID: \"f0c833ae-7f6c-4ead-b991-040550152e41\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f"
Mar 20 09:02:51.189834 master-0 kubenswrapper[18707]: I0320 09:02:51.188883 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-7rp5f\" (UID: \"f0c833ae-7f6c-4ead-b991-040550152e41\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f"
Mar 20 09:02:51.189834 master-0 kubenswrapper[18707]: E0320 09:02:51.189039 18707 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 09:02:51.189834 master-0 kubenswrapper[18707]: E0320 09:02:51.189104 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert podName:f0c833ae-7f6c-4ead-b991-040550152e41 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:51.689080083 +0000 UTC m=+1316.845260439 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert") pod "infra-operator-controller-manager-7dd6bb94c9-7rp5f" (UID: "f0c833ae-7f6c-4ead-b991-040550152e41") : secret "infra-operator-webhook-server-cert" not found
Mar 20 09:02:51.207641 master-0 kubenswrapper[18707]: I0320 09:02:51.207603 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9"
Mar 20 09:02:51.235222 master-0 kubenswrapper[18707]: I0320 09:02:51.235118 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8rnx\" (UniqueName: \"kubernetes.io/projected/f0c833ae-7f6c-4ead-b991-040550152e41-kube-api-access-t8rnx\") pod \"infra-operator-controller-manager-7dd6bb94c9-7rp5f\" (UID: \"f0c833ae-7f6c-4ead-b991-040550152e41\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f"
Mar 20 09:02:51.249130 master-0 kubenswrapper[18707]: I0320 09:02:51.249060 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-2zdk8"]
Mar 20 09:02:51.259789 master-0 kubenswrapper[18707]: I0320 09:02:51.257597 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-d49s9"]
Mar 20 09:02:51.260243 master-0 kubenswrapper[18707]: I0320 09:02:51.260209 18707 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-d49s9" Mar 20 09:02:51.265817 master-0 kubenswrapper[18707]: I0320 09:02:51.265091 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-d49s9"] Mar 20 09:02:51.273613 master-0 kubenswrapper[18707]: I0320 09:02:51.273555 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vxvcl"] Mar 20 09:02:51.274963 master-0 kubenswrapper[18707]: I0320 09:02:51.274929 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vxvcl" Mar 20 09:02:51.279785 master-0 kubenswrapper[18707]: I0320 09:02:51.279755 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-94tfj"] Mar 20 09:02:51.283122 master-0 kubenswrapper[18707]: I0320 09:02:51.281155 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94tfj" Mar 20 09:02:51.311107 master-0 kubenswrapper[18707]: I0320 09:02:51.295113 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkcl5\" (UniqueName: \"kubernetes.io/projected/6ca698eb-ae09-4cfd-ac04-7350744653a6-kube-api-access-bkcl5\") pod \"keystone-operator-controller-manager-768b96df4c-2zdk8\" (UID: \"6ca698eb-ae09-4cfd-ac04-7350744653a6\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-2zdk8" Mar 20 09:02:51.311107 master-0 kubenswrapper[18707]: I0320 09:02:51.295265 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8srwd\" (UniqueName: \"kubernetes.io/projected/b9627b82-4e5a-4ceb-a906-0657397e66e9-kube-api-access-8srwd\") pod \"ironic-operator-controller-manager-6f787dddc9-dq5f2\" (UID: \"b9627b82-4e5a-4ceb-a906-0657397e66e9\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2" Mar 20 09:02:51.329721 master-0 kubenswrapper[18707]: I0320 09:02:51.329677 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vxvcl"] Mar 20 09:02:51.342277 master-0 kubenswrapper[18707]: I0320 09:02:51.342228 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-94tfj"] Mar 20 09:02:51.343915 master-0 kubenswrapper[18707]: I0320 09:02:51.343881 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8srwd\" (UniqueName: \"kubernetes.io/projected/b9627b82-4e5a-4ceb-a906-0657397e66e9-kube-api-access-8srwd\") pod \"ironic-operator-controller-manager-6f787dddc9-dq5f2\" (UID: \"b9627b82-4e5a-4ceb-a906-0657397e66e9\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2" Mar 20 09:02:51.365887 master-0 
kubenswrapper[18707]: I0320 09:02:51.364740 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5rvds"] Mar 20 09:02:51.366073 master-0 kubenswrapper[18707]: I0320 09:02:51.365975 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5rvds" Mar 20 09:02:51.367078 master-0 kubenswrapper[18707]: I0320 09:02:51.367056 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jbfpt" Mar 20 09:02:51.381586 master-0 kubenswrapper[18707]: I0320 09:02:51.379332 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295"] Mar 20 09:02:51.381586 master-0 kubenswrapper[18707]: I0320 09:02:51.380672 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" Mar 20 09:02:51.400536 master-0 kubenswrapper[18707]: I0320 09:02:51.399750 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkcl5\" (UniqueName: \"kubernetes.io/projected/6ca698eb-ae09-4cfd-ac04-7350744653a6-kube-api-access-bkcl5\") pod \"keystone-operator-controller-manager-768b96df4c-2zdk8\" (UID: \"6ca698eb-ae09-4cfd-ac04-7350744653a6\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-2zdk8" Mar 20 09:02:51.400536 master-0 kubenswrapper[18707]: I0320 09:02:51.399817 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhhk5\" (UniqueName: \"kubernetes.io/projected/88c892a9-383f-47bf-a78c-7498cfdc49fa-kube-api-access-bhhk5\") pod \"manila-operator-controller-manager-55f864c847-d49s9\" (UID: \"88c892a9-383f-47bf-a78c-7498cfdc49fa\") " 
pod="openstack-operators/manila-operator-controller-manager-55f864c847-d49s9" Mar 20 09:02:51.400536 master-0 kubenswrapper[18707]: I0320 09:02:51.399869 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7wvl\" (UniqueName: \"kubernetes.io/projected/4e66ea7a-42bb-4b3b-bcb0-244bb08e3eba-kube-api-access-d7wvl\") pod \"neutron-operator-controller-manager-767865f676-94tfj\" (UID: \"4e66ea7a-42bb-4b3b-bcb0-244bb08e3eba\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-94tfj" Mar 20 09:02:51.400536 master-0 kubenswrapper[18707]: I0320 09:02:51.399952 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwrx9\" (UniqueName: \"kubernetes.io/projected/9e52ec39-7a12-433b-8d5c-2df52aa87657-kube-api-access-vwrx9\") pod \"mariadb-operator-controller-manager-67ccfc9778-vxvcl\" (UID: \"9e52ec39-7a12-433b-8d5c-2df52aa87657\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vxvcl" Mar 20 09:02:51.445132 master-0 kubenswrapper[18707]: I0320 09:02:51.443734 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5rvds"] Mar 20 09:02:51.452348 master-0 kubenswrapper[18707]: I0320 09:02:51.452291 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295"] Mar 20 09:02:51.452874 master-0 kubenswrapper[18707]: I0320 09:02:51.452823 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkcl5\" (UniqueName: \"kubernetes.io/projected/6ca698eb-ae09-4cfd-ac04-7350744653a6-kube-api-access-bkcl5\") pod \"keystone-operator-controller-manager-768b96df4c-2zdk8\" (UID: \"6ca698eb-ae09-4cfd-ac04-7350744653a6\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-2zdk8" Mar 20 09:02:51.471919 master-0 
kubenswrapper[18707]: I0320 09:02:51.471862 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4"] Mar 20 09:02:51.474774 master-0 kubenswrapper[18707]: I0320 09:02:51.474731 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:02:51.494810 master-0 kubenswrapper[18707]: I0320 09:02:51.477649 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2" Mar 20 09:02:51.494810 master-0 kubenswrapper[18707]: I0320 09:02:51.479770 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 09:02:51.503804 master-0 kubenswrapper[18707]: I0320 09:02:51.501320 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftzvq\" (UniqueName: \"kubernetes.io/projected/f296b061-4bb0-4fb1-9896-3cd75e58e81c-kube-api-access-ftzvq\") pod \"nova-operator-controller-manager-5d488d59fb-5rvds\" (UID: \"f296b061-4bb0-4fb1-9896-3cd75e58e81c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5rvds" Mar 20 09:02:51.503804 master-0 kubenswrapper[18707]: I0320 09:02:51.501423 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhhk5\" (UniqueName: \"kubernetes.io/projected/88c892a9-383f-47bf-a78c-7498cfdc49fa-kube-api-access-bhhk5\") pod \"manila-operator-controller-manager-55f864c847-d49s9\" (UID: \"88c892a9-383f-47bf-a78c-7498cfdc49fa\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-d49s9" Mar 20 09:02:51.503804 master-0 kubenswrapper[18707]: I0320 09:02:51.501480 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d7wvl\" (UniqueName: \"kubernetes.io/projected/4e66ea7a-42bb-4b3b-bcb0-244bb08e3eba-kube-api-access-d7wvl\") pod \"neutron-operator-controller-manager-767865f676-94tfj\" (UID: \"4e66ea7a-42bb-4b3b-bcb0-244bb08e3eba\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-94tfj" Mar 20 09:02:51.503804 master-0 kubenswrapper[18707]: I0320 09:02:51.501527 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffvv9\" (UniqueName: \"kubernetes.io/projected/109b7903-cd46-4a38-93c2-87253251c130-kube-api-access-ffvv9\") pod \"octavia-operator-controller-manager-5b9f45d989-fr295\" (UID: \"109b7903-cd46-4a38-93c2-87253251c130\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" Mar 20 09:02:51.503804 master-0 kubenswrapper[18707]: I0320 09:02:51.501610 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwrx9\" (UniqueName: \"kubernetes.io/projected/9e52ec39-7a12-433b-8d5c-2df52aa87657-kube-api-access-vwrx9\") pod \"mariadb-operator-controller-manager-67ccfc9778-vxvcl\" (UID: \"9e52ec39-7a12-433b-8d5c-2df52aa87657\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vxvcl" Mar 20 09:02:51.517714 master-0 kubenswrapper[18707]: I0320 09:02:51.510823 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-t875d"] Mar 20 09:02:51.519250 master-0 kubenswrapper[18707]: I0320 09:02:51.519211 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-t875d"] Mar 20 09:02:51.519436 master-0 kubenswrapper[18707]: I0320 09:02:51.519423 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t875d" Mar 20 09:02:51.536321 master-0 kubenswrapper[18707]: I0320 09:02:51.536265 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4"] Mar 20 09:02:51.546327 master-0 kubenswrapper[18707]: I0320 09:02:51.546254 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-9wvcj"] Mar 20 09:02:51.547726 master-0 kubenswrapper[18707]: I0320 09:02:51.547692 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9wvcj" Mar 20 09:02:51.553957 master-0 kubenswrapper[18707]: I0320 09:02:51.553930 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-2zdk8" Mar 20 09:02:51.554366 master-0 kubenswrapper[18707]: I0320 09:02:51.554330 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-9wvcj"] Mar 20 09:02:51.562082 master-0 kubenswrapper[18707]: I0320 09:02:51.561344 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7wvl\" (UniqueName: \"kubernetes.io/projected/4e66ea7a-42bb-4b3b-bcb0-244bb08e3eba-kube-api-access-d7wvl\") pod \"neutron-operator-controller-manager-767865f676-94tfj\" (UID: \"4e66ea7a-42bb-4b3b-bcb0-244bb08e3eba\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-94tfj" Mar 20 09:02:51.562082 master-0 kubenswrapper[18707]: I0320 09:02:51.561705 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-srzmq"] Mar 20 09:02:51.564160 master-0 kubenswrapper[18707]: I0320 09:02:51.563012 18707 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-srzmq" Mar 20 09:02:51.567583 master-0 kubenswrapper[18707]: I0320 09:02:51.567549 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhhk5\" (UniqueName: \"kubernetes.io/projected/88c892a9-383f-47bf-a78c-7498cfdc49fa-kube-api-access-bhhk5\") pod \"manila-operator-controller-manager-55f864c847-d49s9\" (UID: \"88c892a9-383f-47bf-a78c-7498cfdc49fa\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-d49s9" Mar 20 09:02:51.595032 master-0 kubenswrapper[18707]: I0320 09:02:51.594106 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwrx9\" (UniqueName: \"kubernetes.io/projected/9e52ec39-7a12-433b-8d5c-2df52aa87657-kube-api-access-vwrx9\") pod \"mariadb-operator-controller-manager-67ccfc9778-vxvcl\" (UID: \"9e52ec39-7a12-433b-8d5c-2df52aa87657\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vxvcl" Mar 20 09:02:51.607245 master-0 kubenswrapper[18707]: I0320 09:02:51.607166 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftzvq\" (UniqueName: \"kubernetes.io/projected/f296b061-4bb0-4fb1-9896-3cd75e58e81c-kube-api-access-ftzvq\") pod \"nova-operator-controller-manager-5d488d59fb-5rvds\" (UID: \"f296b061-4bb0-4fb1-9896-3cd75e58e81c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5rvds" Mar 20 09:02:51.607417 master-0 kubenswrapper[18707]: I0320 09:02:51.607272 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c952\" (UniqueName: \"kubernetes.io/projected/8053f444-54ad-4a79-8bac-8ee78d1d081b-kube-api-access-8c952\") pod \"openstack-baremetal-operator-controller-manager-74c4796899s6vn4\" (UID: \"8053f444-54ad-4a79-8bac-8ee78d1d081b\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:02:51.607417 master-0 kubenswrapper[18707]: I0320 09:02:51.607333 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffvv9\" (UniqueName: \"kubernetes.io/projected/109b7903-cd46-4a38-93c2-87253251c130-kube-api-access-ffvv9\") pod \"octavia-operator-controller-manager-5b9f45d989-fr295\" (UID: \"109b7903-cd46-4a38-93c2-87253251c130\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" Mar 20 09:02:51.607417 master-0 kubenswrapper[18707]: I0320 09:02:51.607352 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lwzc\" (UniqueName: \"kubernetes.io/projected/f0712ef1-3983-4b74-8985-16e6f0d9ed18-kube-api-access-4lwzc\") pod \"ovn-operator-controller-manager-884679f54-t875d\" (UID: \"f0712ef1-3983-4b74-8985-16e6f0d9ed18\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-t875d" Mar 20 09:02:51.607417 master-0 kubenswrapper[18707]: I0320 09:02:51.607382 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899s6vn4\" (UID: \"8053f444-54ad-4a79-8bac-8ee78d1d081b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:02:51.610656 master-0 kubenswrapper[18707]: I0320 09:02:51.610472 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-d49s9" Mar 20 09:02:51.620848 master-0 kubenswrapper[18707]: I0320 09:02:51.620722 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vxvcl" Mar 20 09:02:51.633737 master-0 kubenswrapper[18707]: I0320 09:02:51.633687 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-srzmq"] Mar 20 09:02:51.633952 master-0 kubenswrapper[18707]: I0320 09:02:51.633759 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-lbvvg"] Mar 20 09:02:51.635264 master-0 kubenswrapper[18707]: I0320 09:02:51.635120 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lbvvg" Mar 20 09:02:51.638918 master-0 kubenswrapper[18707]: I0320 09:02:51.638326 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffvv9\" (UniqueName: \"kubernetes.io/projected/109b7903-cd46-4a38-93c2-87253251c130-kube-api-access-ffvv9\") pod \"octavia-operator-controller-manager-5b9f45d989-fr295\" (UID: \"109b7903-cd46-4a38-93c2-87253251c130\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" Mar 20 09:02:51.642214 master-0 kubenswrapper[18707]: I0320 09:02:51.642162 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftzvq\" (UniqueName: \"kubernetes.io/projected/f296b061-4bb0-4fb1-9896-3cd75e58e81c-kube-api-access-ftzvq\") pod \"nova-operator-controller-manager-5d488d59fb-5rvds\" (UID: \"f296b061-4bb0-4fb1-9896-3cd75e58e81c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5rvds" Mar 20 09:02:51.643767 master-0 kubenswrapper[18707]: I0320 09:02:51.643718 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-lbvvg"] Mar 20 09:02:51.661906 master-0 kubenswrapper[18707]: I0320 09:02:51.661847 18707 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94tfj" Mar 20 09:02:51.665419 master-0 kubenswrapper[18707]: I0320 09:02:51.665370 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn"] Mar 20 09:02:51.673892 master-0 kubenswrapper[18707]: I0320 09:02:51.673364 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" Mar 20 09:02:51.678309 master-0 kubenswrapper[18707]: I0320 09:02:51.676126 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kp7cg"] Mar 20 09:02:51.678309 master-0 kubenswrapper[18707]: I0320 09:02:51.677161 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kp7cg" Mar 20 09:02:51.687496 master-0 kubenswrapper[18707]: I0320 09:02:51.686316 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn"] Mar 20 09:02:51.689891 master-0 kubenswrapper[18707]: I0320 09:02:51.689833 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" Mar 20 09:02:51.706393 master-0 kubenswrapper[18707]: I0320 09:02:51.706349 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kp7cg"] Mar 20 09:02:51.708831 master-0 kubenswrapper[18707]: I0320 09:02:51.708697 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899s6vn4\" (UID: \"8053f444-54ad-4a79-8bac-8ee78d1d081b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:02:51.708831 master-0 kubenswrapper[18707]: I0320 09:02:51.708768 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-7rp5f\" (UID: \"f0c833ae-7f6c-4ead-b991-040550152e41\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f" Mar 20 09:02:51.708985 master-0 kubenswrapper[18707]: I0320 09:02:51.708853 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9qng\" (UniqueName: \"kubernetes.io/projected/ca0c1d44-613d-4619-9a29-9cfeb3b57bc3-kube-api-access-p9qng\") pod \"placement-operator-controller-manager-5784578c99-9wvcj\" (UID: \"ca0c1d44-613d-4619-9a29-9cfeb3b57bc3\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-9wvcj" Mar 20 09:02:51.708985 master-0 kubenswrapper[18707]: I0320 09:02:51.708887 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c952\" (UniqueName: \"kubernetes.io/projected/8053f444-54ad-4a79-8bac-8ee78d1d081b-kube-api-access-8c952\") pod 
\"openstack-baremetal-operator-controller-manager-74c4796899s6vn4\" (UID: \"8053f444-54ad-4a79-8bac-8ee78d1d081b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:02:51.708985 master-0 kubenswrapper[18707]: I0320 09:02:51.708953 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lwzc\" (UniqueName: \"kubernetes.io/projected/f0712ef1-3983-4b74-8985-16e6f0d9ed18-kube-api-access-4lwzc\") pod \"ovn-operator-controller-manager-884679f54-t875d\" (UID: \"f0712ef1-3983-4b74-8985-16e6f0d9ed18\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-t875d" Mar 20 09:02:51.708985 master-0 kubenswrapper[18707]: I0320 09:02:51.708980 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhtfd\" (UniqueName: \"kubernetes.io/projected/9beedaa0-d8d6-415d-8c7d-ec3492dbcc16-kube-api-access-zhtfd\") pod \"swift-operator-controller-manager-c674c5965-srzmq\" (UID: \"9beedaa0-d8d6-415d-8c7d-ec3492dbcc16\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-srzmq" Mar 20 09:02:51.709170 master-0 kubenswrapper[18707]: E0320 09:02:51.709117 18707 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:02:51.709170 master-0 kubenswrapper[18707]: E0320 09:02:51.709167 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert podName:8053f444-54ad-4a79-8bac-8ee78d1d081b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:52.209148735 +0000 UTC m=+1317.365329091 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899s6vn4" (UID: "8053f444-54ad-4a79-8bac-8ee78d1d081b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:02:51.709827 master-0 kubenswrapper[18707]: E0320 09:02:51.709448 18707 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 09:02:51.709827 master-0 kubenswrapper[18707]: E0320 09:02:51.709475 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert podName:f0c833ae-7f6c-4ead-b991-040550152e41 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:52.709467164 +0000 UTC m=+1317.865647520 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert") pod "infra-operator-controller-manager-7dd6bb94c9-7rp5f" (UID: "f0c833ae-7f6c-4ead-b991-040550152e41") : secret "infra-operator-webhook-server-cert" not found Mar 20 09:02:51.748235 master-0 kubenswrapper[18707]: I0320 09:02:51.747477 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lwzc\" (UniqueName: \"kubernetes.io/projected/f0712ef1-3983-4b74-8985-16e6f0d9ed18-kube-api-access-4lwzc\") pod \"ovn-operator-controller-manager-884679f54-t875d\" (UID: \"f0712ef1-3983-4b74-8985-16e6f0d9ed18\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-t875d" Mar 20 09:02:51.759536 master-0 kubenswrapper[18707]: I0320 09:02:51.756672 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c952\" (UniqueName: \"kubernetes.io/projected/8053f444-54ad-4a79-8bac-8ee78d1d081b-kube-api-access-8c952\") pod 
\"openstack-baremetal-operator-controller-manager-74c4796899s6vn4\" (UID: \"8053f444-54ad-4a79-8bac-8ee78d1d081b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:02:51.762649 master-0 kubenswrapper[18707]: I0320 09:02:51.762582 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp"] Mar 20 09:02:51.764284 master-0 kubenswrapper[18707]: I0320 09:02:51.764252 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:51.766312 master-0 kubenswrapper[18707]: I0320 09:02:51.766213 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 09:02:51.767388 master-0 kubenswrapper[18707]: I0320 09:02:51.767339 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 09:02:51.777635 master-0 kubenswrapper[18707]: I0320 09:02:51.777580 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp"] Mar 20 09:02:51.810160 master-0 kubenswrapper[18707]: I0320 09:02:51.810104 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92mkv\" (UniqueName: \"kubernetes.io/projected/ee78381d-189c-4734-b105-5a47fd7af734-kube-api-access-92mkv\") pod \"watcher-operator-controller-manager-6c4d75f7f9-kp7cg\" (UID: \"ee78381d-189c-4734-b105-5a47fd7af734\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kp7cg" Mar 20 09:02:51.810380 master-0 kubenswrapper[18707]: I0320 09:02:51.810173 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhtfd\" (UniqueName: 
\"kubernetes.io/projected/9beedaa0-d8d6-415d-8c7d-ec3492dbcc16-kube-api-access-zhtfd\") pod \"swift-operator-controller-manager-c674c5965-srzmq\" (UID: \"9beedaa0-d8d6-415d-8c7d-ec3492dbcc16\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-srzmq" Mar 20 09:02:51.810380 master-0 kubenswrapper[18707]: I0320 09:02:51.810311 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2zw2\" (UniqueName: \"kubernetes.io/projected/24bbc2db-37fd-4bae-a9e5-9edb02f2c783-kube-api-access-f2zw2\") pod \"test-operator-controller-manager-5c5cb9c4d7-nf9vn\" (UID: \"24bbc2db-37fd-4bae-a9e5-9edb02f2c783\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" Mar 20 09:02:51.810450 master-0 kubenswrapper[18707]: I0320 09:02:51.810389 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9qng\" (UniqueName: \"kubernetes.io/projected/ca0c1d44-613d-4619-9a29-9cfeb3b57bc3-kube-api-access-p9qng\") pod \"placement-operator-controller-manager-5784578c99-9wvcj\" (UID: \"ca0c1d44-613d-4619-9a29-9cfeb3b57bc3\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-9wvcj" Mar 20 09:02:51.810450 master-0 kubenswrapper[18707]: I0320 09:02:51.810431 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpxhd\" (UniqueName: \"kubernetes.io/projected/54ea9e6a-4a15-4797-9a0d-110409a277f1-kube-api-access-rpxhd\") pod \"telemetry-operator-controller-manager-d6b694c5-lbvvg\" (UID: \"54ea9e6a-4a15-4797-9a0d-110409a277f1\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lbvvg" Mar 20 09:02:51.815379 master-0 kubenswrapper[18707]: I0320 09:02:51.815338 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s2dqt"] Mar 20 09:02:51.817466 master-0 kubenswrapper[18707]: I0320 
09:02:51.816570 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s2dqt" Mar 20 09:02:51.823667 master-0 kubenswrapper[18707]: I0320 09:02:51.823604 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s2dqt"] Mar 20 09:02:51.829179 master-0 kubenswrapper[18707]: I0320 09:02:51.829144 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhtfd\" (UniqueName: \"kubernetes.io/projected/9beedaa0-d8d6-415d-8c7d-ec3492dbcc16-kube-api-access-zhtfd\") pod \"swift-operator-controller-manager-c674c5965-srzmq\" (UID: \"9beedaa0-d8d6-415d-8c7d-ec3492dbcc16\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-srzmq" Mar 20 09:02:51.838193 master-0 kubenswrapper[18707]: I0320 09:02:51.838137 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9qng\" (UniqueName: \"kubernetes.io/projected/ca0c1d44-613d-4619-9a29-9cfeb3b57bc3-kube-api-access-p9qng\") pod \"placement-operator-controller-manager-5784578c99-9wvcj\" (UID: \"ca0c1d44-613d-4619-9a29-9cfeb3b57bc3\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-9wvcj" Mar 20 09:02:51.911177 master-0 kubenswrapper[18707]: I0320 09:02:51.902603 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t875d" Mar 20 09:02:51.922865 master-0 kubenswrapper[18707]: I0320 09:02:51.922292 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67tqr\" (UniqueName: \"kubernetes.io/projected/98570795-1969-459b-9703-aca836adab58-kube-api-access-67tqr\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:51.922865 master-0 kubenswrapper[18707]: I0320 09:02:51.922366 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wklrh\" (UniqueName: \"kubernetes.io/projected/ff74d988-3101-4477-b7fa-af7f53925853-kube-api-access-wklrh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-s2dqt\" (UID: \"ff74d988-3101-4477-b7fa-af7f53925853\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s2dqt" Mar 20 09:02:51.923118 master-0 kubenswrapper[18707]: I0320 09:02:51.922951 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpxhd\" (UniqueName: \"kubernetes.io/projected/54ea9e6a-4a15-4797-9a0d-110409a277f1-kube-api-access-rpxhd\") pod \"telemetry-operator-controller-manager-d6b694c5-lbvvg\" (UID: \"54ea9e6a-4a15-4797-9a0d-110409a277f1\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lbvvg" Mar 20 09:02:51.923118 master-0 kubenswrapper[18707]: I0320 09:02:51.923085 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92mkv\" (UniqueName: \"kubernetes.io/projected/ee78381d-189c-4734-b105-5a47fd7af734-kube-api-access-92mkv\") pod \"watcher-operator-controller-manager-6c4d75f7f9-kp7cg\" (UID: \"ee78381d-189c-4734-b105-5a47fd7af734\") " 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kp7cg" Mar 20 09:02:51.923205 master-0 kubenswrapper[18707]: I0320 09:02:51.923169 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:51.924153 master-0 kubenswrapper[18707]: I0320 09:02:51.923284 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2zw2\" (UniqueName: \"kubernetes.io/projected/24bbc2db-37fd-4bae-a9e5-9edb02f2c783-kube-api-access-f2zw2\") pod \"test-operator-controller-manager-5c5cb9c4d7-nf9vn\" (UID: \"24bbc2db-37fd-4bae-a9e5-9edb02f2c783\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" Mar 20 09:02:51.924153 master-0 kubenswrapper[18707]: I0320 09:02:51.923333 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:51.924153 master-0 kubenswrapper[18707]: I0320 09:02:51.923353 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9wvcj" Mar 20 09:02:51.938364 master-0 kubenswrapper[18707]: I0320 09:02:51.936758 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5rvds" Mar 20 09:02:51.985286 master-0 kubenswrapper[18707]: I0320 09:02:51.984931 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92mkv\" (UniqueName: \"kubernetes.io/projected/ee78381d-189c-4734-b105-5a47fd7af734-kube-api-access-92mkv\") pod \"watcher-operator-controller-manager-6c4d75f7f9-kp7cg\" (UID: \"ee78381d-189c-4734-b105-5a47fd7af734\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kp7cg" Mar 20 09:02:51.987522 master-0 kubenswrapper[18707]: I0320 09:02:51.987484 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2zw2\" (UniqueName: \"kubernetes.io/projected/24bbc2db-37fd-4bae-a9e5-9edb02f2c783-kube-api-access-f2zw2\") pod \"test-operator-controller-manager-5c5cb9c4d7-nf9vn\" (UID: \"24bbc2db-37fd-4bae-a9e5-9edb02f2c783\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" Mar 20 09:02:51.987829 master-0 kubenswrapper[18707]: I0320 09:02:51.987773 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpxhd\" (UniqueName: \"kubernetes.io/projected/54ea9e6a-4a15-4797-9a0d-110409a277f1-kube-api-access-rpxhd\") pod \"telemetry-operator-controller-manager-d6b694c5-lbvvg\" (UID: \"54ea9e6a-4a15-4797-9a0d-110409a277f1\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lbvvg" Mar 20 09:02:52.001510 master-0 kubenswrapper[18707]: I0320 09:02:51.998271 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-sfshp"] Mar 20 09:02:52.029726 master-0 kubenswrapper[18707]: I0320 09:02:52.029644 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs\") pod 
\"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:52.029797 master-0 kubenswrapper[18707]: I0320 09:02:52.029727 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:52.029832 master-0 kubenswrapper[18707]: I0320 09:02:52.029819 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67tqr\" (UniqueName: \"kubernetes.io/projected/98570795-1969-459b-9703-aca836adab58-kube-api-access-67tqr\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:52.029881 master-0 kubenswrapper[18707]: I0320 09:02:52.029850 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wklrh\" (UniqueName: \"kubernetes.io/projected/ff74d988-3101-4477-b7fa-af7f53925853-kube-api-access-wklrh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-s2dqt\" (UID: \"ff74d988-3101-4477-b7fa-af7f53925853\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s2dqt" Mar 20 09:02:52.030331 master-0 kubenswrapper[18707]: E0320 09:02:52.030293 18707 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 09:02:52.030392 master-0 kubenswrapper[18707]: E0320 09:02:52.030378 18707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs podName:98570795-1969-459b-9703-aca836adab58 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:52.530356144 +0000 UTC m=+1317.686536520 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-cfkmp" (UID: "98570795-1969-459b-9703-aca836adab58") : secret "metrics-server-cert" not found Mar 20 09:02:52.030589 master-0 kubenswrapper[18707]: E0320 09:02:52.030550 18707 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 09:02:52.030642 master-0 kubenswrapper[18707]: E0320 09:02:52.030600 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs podName:98570795-1969-459b-9703-aca836adab58 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:52.53058649 +0000 UTC m=+1317.686766846 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-cfkmp" (UID: "98570795-1969-459b-9703-aca836adab58") : secret "webhook-server-cert" not found Mar 20 09:02:52.070682 master-0 kubenswrapper[18707]: I0320 09:02:52.069919 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wklrh\" (UniqueName: \"kubernetes.io/projected/ff74d988-3101-4477-b7fa-af7f53925853-kube-api-access-wklrh\") pod \"rabbitmq-cluster-operator-manager-668c99d594-s2dqt\" (UID: \"ff74d988-3101-4477-b7fa-af7f53925853\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s2dqt" Mar 20 09:02:52.104608 master-0 kubenswrapper[18707]: I0320 09:02:52.103909 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67tqr\" (UniqueName: \"kubernetes.io/projected/98570795-1969-459b-9703-aca836adab58-kube-api-access-67tqr\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:52.200238 master-0 kubenswrapper[18707]: I0320 09:02:52.200169 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-vds6m"] Mar 20 09:02:52.233887 master-0 kubenswrapper[18707]: I0320 09:02:52.233826 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899s6vn4\" (UID: \"8053f444-54ad-4a79-8bac-8ee78d1d081b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:02:52.234061 master-0 kubenswrapper[18707]: E0320 09:02:52.234035 18707 secret.go:189] 
Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:02:52.234172 master-0 kubenswrapper[18707]: E0320 09:02:52.234105 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert podName:8053f444-54ad-4a79-8bac-8ee78d1d081b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:53.234082745 +0000 UTC m=+1318.390263101 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899s6vn4" (UID: "8053f444-54ad-4a79-8bac-8ee78d1d081b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:02:52.241159 master-0 kubenswrapper[18707]: I0320 09:02:52.240985 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-srzmq" Mar 20 09:02:52.251872 master-0 kubenswrapper[18707]: W0320 09:02:52.251499 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2036498a_bbc9_49bb_9ae7_d3f9fbf5165f.slice/crio-d7af84a64817847520e295b3df5420c564b538e069a1495ea0f73746e7c5473a WatchSource:0}: Error finding container d7af84a64817847520e295b3df5420c564b538e069a1495ea0f73746e7c5473a: Status 404 returned error can't find the container with id d7af84a64817847520e295b3df5420c564b538e069a1495ea0f73746e7c5473a Mar 20 09:02:52.329476 master-0 kubenswrapper[18707]: I0320 09:02:52.326727 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lbvvg" Mar 20 09:02:52.362023 master-0 kubenswrapper[18707]: I0320 09:02:52.361961 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" Mar 20 09:02:52.381878 master-0 kubenswrapper[18707]: I0320 09:02:52.381840 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kp7cg" Mar 20 09:02:52.383816 master-0 kubenswrapper[18707]: I0320 09:02:52.383770 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-bghwc"] Mar 20 09:02:52.408700 master-0 kubenswrapper[18707]: I0320 09:02:52.408653 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-nt245"] Mar 20 09:02:52.430119 master-0 kubenswrapper[18707]: W0320 09:02:52.430061 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3637b14_6e2e_4cdf_b59f_7f13bcdff49d.slice/crio-fbcb1c5174b203bc7284494766f57d4fd2ae8ffa1d6d900d26fd79144174b7cd WatchSource:0}: Error finding container fbcb1c5174b203bc7284494766f57d4fd2ae8ffa1d6d900d26fd79144174b7cd: Status 404 returned error can't find the container with id fbcb1c5174b203bc7284494766f57d4fd2ae8ffa1d6d900d26fd79144174b7cd Mar 20 09:02:52.433824 master-0 kubenswrapper[18707]: I0320 09:02:52.433453 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s2dqt" Mar 20 09:02:52.539751 master-0 kubenswrapper[18707]: I0320 09:02:52.539695 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:52.540026 master-0 kubenswrapper[18707]: I0320 09:02:52.539792 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:52.540026 master-0 kubenswrapper[18707]: E0320 09:02:52.539959 18707 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 09:02:52.540026 master-0 kubenswrapper[18707]: E0320 09:02:52.540017 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs podName:98570795-1969-459b-9703-aca836adab58 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:53.540000056 +0000 UTC m=+1318.696180412 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-cfkmp" (UID: "98570795-1969-459b-9703-aca836adab58") : secret "metrics-server-cert" not found Mar 20 09:02:52.541398 master-0 kubenswrapper[18707]: E0320 09:02:52.541325 18707 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 09:02:52.541398 master-0 kubenswrapper[18707]: E0320 09:02:52.541358 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs podName:98570795-1969-459b-9703-aca836adab58 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:53.541349965 +0000 UTC m=+1318.697530321 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-cfkmp" (UID: "98570795-1969-459b-9703-aca836adab58") : secret "webhook-server-cert" not found Mar 20 09:02:52.744240 master-0 kubenswrapper[18707]: I0320 09:02:52.744150 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-7rp5f\" (UID: \"f0c833ae-7f6c-4ead-b991-040550152e41\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f" Mar 20 09:02:52.744798 master-0 kubenswrapper[18707]: E0320 09:02:52.744352 18707 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 09:02:52.744798 master-0 kubenswrapper[18707]: E0320 09:02:52.744439 18707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert podName:f0c833ae-7f6c-4ead-b991-040550152e41 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:54.744418818 +0000 UTC m=+1319.900599174 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert") pod "infra-operator-controller-manager-7dd6bb94c9-7rp5f" (UID: "f0c833ae-7f6c-4ead-b991-040550152e41") : secret "infra-operator-webhook-server-cert" not found Mar 20 09:02:52.768411 master-0 kubenswrapper[18707]: I0320 09:02:52.768256 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bghwc" event={"ID":"426a6be9-aeba-4865-bea3-0eacca0f445f","Type":"ContainerStarted","Data":"3dd8ba156f08f1f0fe88900a7f90fc124ac0559c280f3ec95145c172b9c18365"} Mar 20 09:02:52.772898 master-0 kubenswrapper[18707]: I0320 09:02:52.772866 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nt245" event={"ID":"d3637b14-6e2e-4cdf-b59f-7f13bcdff49d","Type":"ContainerStarted","Data":"fbcb1c5174b203bc7284494766f57d4fd2ae8ffa1d6d900d26fd79144174b7cd"} Mar 20 09:02:52.778271 master-0 kubenswrapper[18707]: I0320 09:02:52.778212 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sfshp" event={"ID":"0709242a-39f4-46b7-a161-c28c18c272e9","Type":"ContainerStarted","Data":"b5fe7d1c05b7cb9f6242f811a8bb7f975d0ebe55972b147a58184313a64fb8f4"} Mar 20 09:02:52.782672 master-0 kubenswrapper[18707]: I0320 09:02:52.782625 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-vds6m" event={"ID":"2036498a-bbc9-49bb-9ae7-d3f9fbf5165f","Type":"ContainerStarted","Data":"d7af84a64817847520e295b3df5420c564b538e069a1495ea0f73746e7c5473a"} Mar 20 
09:02:52.840507 master-0 kubenswrapper[18707]: W0320 09:02:52.840275 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ca698eb_ae09_4cfd_ac04_7350744653a6.slice/crio-59b85cb463ddde7e65bfdbad9c8c2991fb26bffd7d7a2936f09cb9060c34236a WatchSource:0}: Error finding container 59b85cb463ddde7e65bfdbad9c8c2991fb26bffd7d7a2936f09cb9060c34236a: Status 404 returned error can't find the container with id 59b85cb463ddde7e65bfdbad9c8c2991fb26bffd7d7a2936f09cb9060c34236a Mar 20 09:02:52.846681 master-0 kubenswrapper[18707]: W0320 09:02:52.846649 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd2d905_b71b_408c_b669_1a0a43ce9b2f.slice/crio-ad0e5fd03f46737759581657586577094e6fb8828aead8fc6f75ac007171aed3 WatchSource:0}: Error finding container ad0e5fd03f46737759581657586577094e6fb8828aead8fc6f75ac007171aed3: Status 404 returned error can't find the container with id ad0e5fd03f46737759581657586577094e6fb8828aead8fc6f75ac007171aed3 Mar 20 09:02:52.847605 master-0 kubenswrapper[18707]: I0320 09:02:52.847588 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9"] Mar 20 09:02:52.856012 master-0 kubenswrapper[18707]: I0320 09:02:52.855954 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-2zdk8"] Mar 20 09:02:52.886118 master-0 kubenswrapper[18707]: I0320 09:02:52.886051 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2"] Mar 20 09:02:52.952327 master-0 kubenswrapper[18707]: E0320 09:02:52.951853 18707 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source 
docker://quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900: can't talk to a V1 container registry" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900" Mar 20 09:02:52.952327 master-0 kubenswrapper[18707]: E0320 09:02:52.952061 18707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m42pz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-67dd5f86f5-5h2n9_openstack-operators(fbd2d905-b71b-408c-b669-1a0a43ce9b2f): ErrImagePull: initializing source docker://quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900: can't talk to a V1 container registry" logger="UnhandledError" Mar 20 09:02:52.953332 master-0 kubenswrapper[18707]: E0320 09:02:52.953277 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"initializing source docker://quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900: can't talk to a V1 container registry\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9" podUID="fbd2d905-b71b-408c-b669-1a0a43ce9b2f" Mar 20 09:02:53.264119 master-0 kubenswrapper[18707]: I0320 09:02:53.261290 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899s6vn4\" (UID: 
\"8053f444-54ad-4a79-8bac-8ee78d1d081b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:02:53.264119 master-0 kubenswrapper[18707]: E0320 09:02:53.263015 18707 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:02:53.264119 master-0 kubenswrapper[18707]: E0320 09:02:53.263057 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert podName:8053f444-54ad-4a79-8bac-8ee78d1d081b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:55.263043018 +0000 UTC m=+1320.419223364 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899s6vn4" (UID: "8053f444-54ad-4a79-8bac-8ee78d1d081b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:02:53.399784 master-0 kubenswrapper[18707]: W0320 09:02:53.399699 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf296b061_4bb0_4fb1_9896_3cd75e58e81c.slice/crio-e604967cf9249b17a9e992bdde7620381e360c47a7f05384701b50b84d1c4526 WatchSource:0}: Error finding container e604967cf9249b17a9e992bdde7620381e360c47a7f05384701b50b84d1c4526: Status 404 returned error can't find the container with id e604967cf9249b17a9e992bdde7620381e360c47a7f05384701b50b84d1c4526 Mar 20 09:02:53.406035 master-0 kubenswrapper[18707]: I0320 09:02:53.405773 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-5rvds"] Mar 20 09:02:53.457729 master-0 kubenswrapper[18707]: W0320 09:02:53.457670 18707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e52ec39_7a12_433b_8d5c_2df52aa87657.slice/crio-fb716e95c2973cd0cfc56e8ba9e3fe4ed9adc34fac8f571dad07fde443ba61bd WatchSource:0}: Error finding container fb716e95c2973cd0cfc56e8ba9e3fe4ed9adc34fac8f571dad07fde443ba61bd: Status 404 returned error can't find the container with id fb716e95c2973cd0cfc56e8ba9e3fe4ed9adc34fac8f571dad07fde443ba61bd Mar 20 09:02:53.485071 master-0 kubenswrapper[18707]: I0320 09:02:53.485001 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-d49s9"] Mar 20 09:02:53.492992 master-0 kubenswrapper[18707]: I0320 09:02:53.492929 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vxvcl"] Mar 20 09:02:53.506314 master-0 kubenswrapper[18707]: I0320 09:02:53.506256 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-94tfj"] Mar 20 09:02:53.569525 master-0 kubenswrapper[18707]: I0320 09:02:53.569175 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:53.569525 master-0 kubenswrapper[18707]: I0320 09:02:53.569420 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:53.569833 master-0 
kubenswrapper[18707]: E0320 09:02:53.569423 18707 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 09:02:53.569833 master-0 kubenswrapper[18707]: E0320 09:02:53.569493 18707 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 09:02:53.569833 master-0 kubenswrapper[18707]: E0320 09:02:53.569686 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs podName:98570795-1969-459b-9703-aca836adab58 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:55.56965507 +0000 UTC m=+1320.725835426 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-cfkmp" (UID: "98570795-1969-459b-9703-aca836adab58") : secret "webhook-server-cert" not found Mar 20 09:02:53.569833 master-0 kubenswrapper[18707]: E0320 09:02:53.569757 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs podName:98570795-1969-459b-9703-aca836adab58 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:55.569726982 +0000 UTC m=+1320.725907428 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-cfkmp" (UID: "98570795-1969-459b-9703-aca836adab58") : secret "metrics-server-cert" not found Mar 20 09:02:53.626424 master-0 kubenswrapper[18707]: I0320 09:02:53.621130 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-t875d"] Mar 20 09:02:53.638917 master-0 kubenswrapper[18707]: W0320 09:02:53.638883 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0712ef1_3983_4b74_8985_16e6f0d9ed18.slice/crio-c7ca5985c800c4aede265fdb06f0606d443dd2fcf5559929032fb1f6c1e90658 WatchSource:0}: Error finding container c7ca5985c800c4aede265fdb06f0606d443dd2fcf5559929032fb1f6c1e90658: Status 404 returned error can't find the container with id c7ca5985c800c4aede265fdb06f0606d443dd2fcf5559929032fb1f6c1e90658 Mar 20 09:02:53.644258 master-0 kubenswrapper[18707]: W0320 09:02:53.644177 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01d1e35b_ff25_4eb4_b550_30debdab6fa0.slice/crio-aa453edd100d457f82f9e677bf381e1c90fd5cc60f5a7fcda682b9b010ec161f WatchSource:0}: Error finding container aa453edd100d457f82f9e677bf381e1c90fd5cc60f5a7fcda682b9b010ec161f: Status 404 returned error can't find the container with id aa453edd100d457f82f9e677bf381e1c90fd5cc60f5a7fcda682b9b010ec161f Mar 20 09:02:53.651343 master-0 kubenswrapper[18707]: I0320 09:02:53.651305 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295"] Mar 20 09:02:53.709306 master-0 kubenswrapper[18707]: I0320 09:02:53.701058 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-jbfpt"] Mar 20 09:02:53.863158 master-0 kubenswrapper[18707]: I0320 09:02:53.863033 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-9wvcj"] Mar 20 09:02:53.876025 master-0 kubenswrapper[18707]: I0320 09:02:53.875985 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kp7cg"] Mar 20 09:02:53.878862 master-0 kubenswrapper[18707]: I0320 09:02:53.878823 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jbfpt" event={"ID":"01d1e35b-ff25-4eb4-b550-30debdab6fa0","Type":"ContainerStarted","Data":"aa453edd100d457f82f9e677bf381e1c90fd5cc60f5a7fcda682b9b010ec161f"} Mar 20 09:02:53.881555 master-0 kubenswrapper[18707]: I0320 09:02:53.881329 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t875d" event={"ID":"f0712ef1-3983-4b74-8985-16e6f0d9ed18","Type":"ContainerStarted","Data":"c7ca5985c800c4aede265fdb06f0606d443dd2fcf5559929032fb1f6c1e90658"} Mar 20 09:02:53.886275 master-0 kubenswrapper[18707]: I0320 09:02:53.883745 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9" event={"ID":"fbd2d905-b71b-408c-b669-1a0a43ce9b2f","Type":"ContainerStarted","Data":"ad0e5fd03f46737759581657586577094e6fb8828aead8fc6f75ac007171aed3"} Mar 20 09:02:53.886931 master-0 kubenswrapper[18707]: I0320 09:02:53.886560 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn"] Mar 20 09:02:53.894276 master-0 kubenswrapper[18707]: I0320 09:02:53.894165 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-767865f676-94tfj" event={"ID":"4e66ea7a-42bb-4b3b-bcb0-244bb08e3eba","Type":"ContainerStarted","Data":"547ee95df659f53f5ca79334fc8e0522f7dc61335153fbb1b0dd1c8736d708fb"} Mar 20 09:02:53.902249 master-0 kubenswrapper[18707]: E0320 09:02:53.901362 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9" podUID="fbd2d905-b71b-408c-b669-1a0a43ce9b2f" Mar 20 09:02:53.904464 master-0 kubenswrapper[18707]: I0320 09:02:53.904041 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5rvds" event={"ID":"f296b061-4bb0-4fb1-9896-3cd75e58e81c","Type":"ContainerStarted","Data":"e604967cf9249b17a9e992bdde7620381e360c47a7f05384701b50b84d1c4526"} Mar 20 09:02:53.910055 master-0 kubenswrapper[18707]: I0320 09:02:53.910000 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-srzmq"] Mar 20 09:02:53.934431 master-0 kubenswrapper[18707]: I0320 09:02:53.929618 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2" event={"ID":"b9627b82-4e5a-4ceb-a906-0657397e66e9","Type":"ContainerStarted","Data":"4dc971a4e5374dde3196877272133724714299e0aa2911ff29b22a1b960afae0"} Mar 20 09:02:54.003735 master-0 kubenswrapper[18707]: I0320 09:02:54.003638 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vxvcl" 
event={"ID":"9e52ec39-7a12-433b-8d5c-2df52aa87657","Type":"ContainerStarted","Data":"fb716e95c2973cd0cfc56e8ba9e3fe4ed9adc34fac8f571dad07fde443ba61bd"} Mar 20 09:02:54.006158 master-0 kubenswrapper[18707]: I0320 09:02:54.006087 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-d49s9" event={"ID":"88c892a9-383f-47bf-a78c-7498cfdc49fa","Type":"ContainerStarted","Data":"4250291cfcab755617bc58c395ea950cf14e341ff7d2160d6fae44d74a671f62"} Mar 20 09:02:54.011517 master-0 kubenswrapper[18707]: I0320 09:02:54.011457 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" event={"ID":"109b7903-cd46-4a38-93c2-87253251c130","Type":"ContainerStarted","Data":"6938fa056ea8652f9880c3d0ff493631ba97de0ca99198e6dcdcf6a3b4091124"} Mar 20 09:02:54.013316 master-0 kubenswrapper[18707]: I0320 09:02:54.013092 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-2zdk8" event={"ID":"6ca698eb-ae09-4cfd-ac04-7350744653a6","Type":"ContainerStarted","Data":"59b85cb463ddde7e65bfdbad9c8c2991fb26bffd7d7a2936f09cb9060c34236a"} Mar 20 09:02:54.070096 master-0 kubenswrapper[18707]: I0320 09:02:54.069679 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-lbvvg"] Mar 20 09:02:54.087798 master-0 kubenswrapper[18707]: I0320 09:02:54.087610 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s2dqt"] Mar 20 09:02:54.798337 master-0 kubenswrapper[18707]: I0320 09:02:54.798286 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-7rp5f\" (UID: 
\"f0c833ae-7f6c-4ead-b991-040550152e41\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f" Mar 20 09:02:54.798584 master-0 kubenswrapper[18707]: E0320 09:02:54.798484 18707 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 09:02:54.798584 master-0 kubenswrapper[18707]: E0320 09:02:54.798578 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert podName:f0c833ae-7f6c-4ead-b991-040550152e41 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:58.798555007 +0000 UTC m=+1323.954735413 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert") pod "infra-operator-controller-manager-7dd6bb94c9-7rp5f" (UID: "f0c833ae-7f6c-4ead-b991-040550152e41") : secret "infra-operator-webhook-server-cert" not found Mar 20 09:02:55.053515 master-0 kubenswrapper[18707]: I0320 09:02:55.053266 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-srzmq" event={"ID":"9beedaa0-d8d6-415d-8c7d-ec3492dbcc16","Type":"ContainerStarted","Data":"993a7cfd02d484ef4fecbda7cdd58c8b2be4eab86405b95b8e1ebb8063f8e937"} Mar 20 09:02:55.055762 master-0 kubenswrapper[18707]: I0320 09:02:55.055715 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" event={"ID":"24bbc2db-37fd-4bae-a9e5-9edb02f2c783","Type":"ContainerStarted","Data":"ec75ec670b2b8bdc35a9313167e0458868a43340620d443606536db936b76630"} Mar 20 09:02:55.058650 master-0 kubenswrapper[18707]: I0320 09:02:55.058591 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kp7cg" 
event={"ID":"ee78381d-189c-4734-b105-5a47fd7af734","Type":"ContainerStarted","Data":"49527d901d74dd01661c01d165991269b8f022988adb2f3504b534b382c19b15"} Mar 20 09:02:55.063988 master-0 kubenswrapper[18707]: I0320 09:02:55.063841 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9wvcj" event={"ID":"ca0c1d44-613d-4619-9a29-9cfeb3b57bc3","Type":"ContainerStarted","Data":"a10bafcc9e8585cc26dc0a7695903d25f72e920721fbec4823d850ec57136373"} Mar 20 09:02:55.068973 master-0 kubenswrapper[18707]: E0320 09:02:55.068924 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9" podUID="fbd2d905-b71b-408c-b669-1a0a43ce9b2f" Mar 20 09:02:55.364277 master-0 kubenswrapper[18707]: I0320 09:02:55.357562 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899s6vn4\" (UID: \"8053f444-54ad-4a79-8bac-8ee78d1d081b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:02:55.364277 master-0 kubenswrapper[18707]: E0320 09:02:55.358065 18707 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:02:55.364277 master-0 kubenswrapper[18707]: E0320 09:02:55.358246 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert podName:8053f444-54ad-4a79-8bac-8ee78d1d081b nodeName:}" failed. 
No retries permitted until 2026-03-20 09:02:59.358139848 +0000 UTC m=+1324.514320204 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899s6vn4" (UID: "8053f444-54ad-4a79-8bac-8ee78d1d081b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:02:55.587355 master-0 kubenswrapper[18707]: I0320 09:02:55.587240 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:55.587636 master-0 kubenswrapper[18707]: I0320 09:02:55.587414 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:55.587707 master-0 kubenswrapper[18707]: E0320 09:02:55.587635 18707 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 09:02:55.587707 master-0 kubenswrapper[18707]: E0320 09:02:55.587683 18707 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 09:02:55.587784 master-0 kubenswrapper[18707]: E0320 09:02:55.587739 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs podName:98570795-1969-459b-9703-aca836adab58 
nodeName:}" failed. No retries permitted until 2026-03-20 09:02:59.587715448 +0000 UTC m=+1324.743895864 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-cfkmp" (UID: "98570795-1969-459b-9703-aca836adab58") : secret "webhook-server-cert" not found Mar 20 09:02:55.587784 master-0 kubenswrapper[18707]: E0320 09:02:55.587764 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs podName:98570795-1969-459b-9703-aca836adab58 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:59.5877559 +0000 UTC m=+1324.743936366 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-cfkmp" (UID: "98570795-1969-459b-9703-aca836adab58") : secret "metrics-server-cert" not found Mar 20 09:02:55.874845 master-0 kubenswrapper[18707]: W0320 09:02:55.874788 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff74d988_3101_4477_b7fa_af7f53925853.slice/crio-aaff76747a73bb6ac0b3b92051ff9f310d2f31bcedd2a6cdac02b3c7dc086bc7 WatchSource:0}: Error finding container aaff76747a73bb6ac0b3b92051ff9f310d2f31bcedd2a6cdac02b3c7dc086bc7: Status 404 returned error can't find the container with id aaff76747a73bb6ac0b3b92051ff9f310d2f31bcedd2a6cdac02b3c7dc086bc7 Mar 20 09:02:56.080465 master-0 kubenswrapper[18707]: I0320 09:02:56.080383 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s2dqt" 
event={"ID":"ff74d988-3101-4477-b7fa-af7f53925853","Type":"ContainerStarted","Data":"aaff76747a73bb6ac0b3b92051ff9f310d2f31bcedd2a6cdac02b3c7dc086bc7"} Mar 20 09:02:56.088937 master-0 kubenswrapper[18707]: I0320 09:02:56.088119 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lbvvg" event={"ID":"54ea9e6a-4a15-4797-9a0d-110409a277f1","Type":"ContainerStarted","Data":"af7ca5c4c836d5e2bc8cf6b4e418922a24dc81839e561c8c24386130a3b0c81b"} Mar 20 09:02:58.862455 master-0 kubenswrapper[18707]: I0320 09:02:58.862315 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-7rp5f\" (UID: \"f0c833ae-7f6c-4ead-b991-040550152e41\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f" Mar 20 09:02:58.863222 master-0 kubenswrapper[18707]: E0320 09:02:58.862789 18707 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 09:02:58.863222 master-0 kubenswrapper[18707]: E0320 09:02:58.862842 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert podName:f0c833ae-7f6c-4ead-b991-040550152e41 nodeName:}" failed. No retries permitted until 2026-03-20 09:03:06.862827351 +0000 UTC m=+1332.019007707 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert") pod "infra-operator-controller-manager-7dd6bb94c9-7rp5f" (UID: "f0c833ae-7f6c-4ead-b991-040550152e41") : secret "infra-operator-webhook-server-cert" not found Mar 20 09:02:59.376925 master-0 kubenswrapper[18707]: I0320 09:02:59.376558 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899s6vn4\" (UID: \"8053f444-54ad-4a79-8bac-8ee78d1d081b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:02:59.377530 master-0 kubenswrapper[18707]: E0320 09:02:59.377086 18707 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:02:59.377530 master-0 kubenswrapper[18707]: E0320 09:02:59.377206 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert podName:8053f444-54ad-4a79-8bac-8ee78d1d081b nodeName:}" failed. No retries permitted until 2026-03-20 09:03:07.377166436 +0000 UTC m=+1332.533346792 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899s6vn4" (UID: "8053f444-54ad-4a79-8bac-8ee78d1d081b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:02:59.683478 master-0 kubenswrapper[18707]: I0320 09:02:59.683194 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:59.683478 master-0 kubenswrapper[18707]: I0320 09:02:59.683296 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:02:59.683809 master-0 kubenswrapper[18707]: E0320 09:02:59.683530 18707 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 09:02:59.683809 master-0 kubenswrapper[18707]: E0320 09:02:59.683558 18707 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 09:02:59.683809 master-0 kubenswrapper[18707]: E0320 09:02:59.683605 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs podName:98570795-1969-459b-9703-aca836adab58 nodeName:}" failed. 
No retries permitted until 2026-03-20 09:03:07.683585163 +0000 UTC m=+1332.839765519 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-cfkmp" (UID: "98570795-1969-459b-9703-aca836adab58") : secret "metrics-server-cert" not found Mar 20 09:02:59.683809 master-0 kubenswrapper[18707]: E0320 09:02:59.683683 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs podName:98570795-1969-459b-9703-aca836adab58 nodeName:}" failed. No retries permitted until 2026-03-20 09:03:07.683650894 +0000 UTC m=+1332.839831290 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-cfkmp" (UID: "98570795-1969-459b-9703-aca836adab58") : secret "webhook-server-cert" not found Mar 20 09:03:06.948133 master-0 kubenswrapper[18707]: I0320 09:03:06.947930 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-7rp5f\" (UID: \"f0c833ae-7f6c-4ead-b991-040550152e41\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f" Mar 20 09:03:06.948133 master-0 kubenswrapper[18707]: E0320 09:03:06.948106 18707 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 09:03:06.948746 master-0 kubenswrapper[18707]: E0320 09:03:06.948203 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert podName:f0c833ae-7f6c-4ead-b991-040550152e41 
nodeName:}" failed. No retries permitted until 2026-03-20 09:03:22.948168904 +0000 UTC m=+1348.104349260 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert") pod "infra-operator-controller-manager-7dd6bb94c9-7rp5f" (UID: "f0c833ae-7f6c-4ead-b991-040550152e41") : secret "infra-operator-webhook-server-cert" not found Mar 20 09:03:07.462808 master-0 kubenswrapper[18707]: I0320 09:03:07.462238 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899s6vn4\" (UID: \"8053f444-54ad-4a79-8bac-8ee78d1d081b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:03:07.462808 master-0 kubenswrapper[18707]: E0320 09:03:07.462497 18707 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:03:07.462808 master-0 kubenswrapper[18707]: E0320 09:03:07.462605 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert podName:8053f444-54ad-4a79-8bac-8ee78d1d081b nodeName:}" failed. No retries permitted until 2026-03-20 09:03:23.462580064 +0000 UTC m=+1348.618760440 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899s6vn4" (UID: "8053f444-54ad-4a79-8bac-8ee78d1d081b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:03:07.771160 master-0 kubenswrapper[18707]: I0320 09:03:07.770526 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:03:07.771160 master-0 kubenswrapper[18707]: I0320 09:03:07.770631 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:03:07.771160 master-0 kubenswrapper[18707]: E0320 09:03:07.770846 18707 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 09:03:07.771160 master-0 kubenswrapper[18707]: E0320 09:03:07.770906 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs podName:98570795-1969-459b-9703-aca836adab58 nodeName:}" failed. No retries permitted until 2026-03-20 09:03:23.770888394 +0000 UTC m=+1348.927068750 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-cfkmp" (UID: "98570795-1969-459b-9703-aca836adab58") : secret "metrics-server-cert" not found Mar 20 09:03:07.771668 master-0 kubenswrapper[18707]: E0320 09:03:07.771338 18707 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 09:03:07.771668 master-0 kubenswrapper[18707]: E0320 09:03:07.771376 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs podName:98570795-1969-459b-9703-aca836adab58 nodeName:}" failed. No retries permitted until 2026-03-20 09:03:23.771364348 +0000 UTC m=+1348.927544704 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-cfkmp" (UID: "98570795-1969-459b-9703-aca836adab58") : secret "webhook-server-cert" not found Mar 20 09:03:15.702179 master-0 kubenswrapper[18707]: I0320 09:03:15.701409 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sfshp" event={"ID":"0709242a-39f4-46b7-a161-c28c18c272e9","Type":"ContainerStarted","Data":"409b93d9bbac0db17de7f87900e57c1db32e953aa2361fe478ebd0ea2d0ac46b"} Mar 20 09:03:15.702179 master-0 kubenswrapper[18707]: I0320 09:03:15.701536 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sfshp" Mar 20 09:03:15.703662 master-0 kubenswrapper[18707]: I0320 09:03:15.703592 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-vds6m" 
event={"ID":"2036498a-bbc9-49bb-9ae7-d3f9fbf5165f","Type":"ContainerStarted","Data":"b9c9c948fce33961250794ddb4ff78acc541340a8b8223b706d3488b653e10b2"} Mar 20 09:03:15.703814 master-0 kubenswrapper[18707]: I0320 09:03:15.703777 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-vds6m" Mar 20 09:03:15.748384 master-0 kubenswrapper[18707]: I0320 09:03:15.748284 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sfshp" podStartSLOduration=11.425648823 podStartE2EDuration="25.748258785s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:52.067248818 +0000 UTC m=+1317.223429174" lastFinishedPulling="2026-03-20 09:03:06.38985878 +0000 UTC m=+1331.546039136" observedRunningTime="2026-03-20 09:03:15.723354574 +0000 UTC m=+1340.879534930" watchObservedRunningTime="2026-03-20 09:03:15.748258785 +0000 UTC m=+1340.904439151" Mar 20 09:03:15.777979 master-0 kubenswrapper[18707]: I0320 09:03:15.776713 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-vds6m" podStartSLOduration=11.644402614 podStartE2EDuration="25.776687338s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:52.258959006 +0000 UTC m=+1317.415139362" lastFinishedPulling="2026-03-20 09:03:06.39124373 +0000 UTC m=+1331.547424086" observedRunningTime="2026-03-20 09:03:15.745854597 +0000 UTC m=+1340.902034963" watchObservedRunningTime="2026-03-20 09:03:15.776687338 +0000 UTC m=+1340.932867684" Mar 20 09:03:16.732069 master-0 kubenswrapper[18707]: I0320 09:03:16.730813 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nt245" 
event={"ID":"d3637b14-6e2e-4cdf-b59f-7f13bcdff49d","Type":"ContainerStarted","Data":"8d019c4406d20a69685587608468acc302354c23df278a3f7026067bab15f8aa"} Mar 20 09:03:16.732069 master-0 kubenswrapper[18707]: I0320 09:03:16.731953 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nt245" Mar 20 09:03:16.771074 master-0 kubenswrapper[18707]: I0320 09:03:16.767551 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94tfj" event={"ID":"4e66ea7a-42bb-4b3b-bcb0-244bb08e3eba","Type":"ContainerStarted","Data":"0ecabcfffbec6984848ca997f5c86ffa6f4bd950eb85442fcfdb6d3b61b6eb80"} Mar 20 09:03:16.771074 master-0 kubenswrapper[18707]: I0320 09:03:16.768479 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94tfj" Mar 20 09:03:16.796215 master-0 kubenswrapper[18707]: I0320 09:03:16.792463 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s2dqt" event={"ID":"ff74d988-3101-4477-b7fa-af7f53925853","Type":"ContainerStarted","Data":"3b9c7919d2350281df4b62517789c5473951da9519f0922ed5b6b431170fbfde"} Mar 20 09:03:16.816553 master-0 kubenswrapper[18707]: I0320 09:03:16.816489 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-srzmq" event={"ID":"9beedaa0-d8d6-415d-8c7d-ec3492dbcc16","Type":"ContainerStarted","Data":"d6ef7526376dd0abdcd83a5dacdcc15acf27c4ffecfac2a8f746b84169a102d8"} Mar 20 09:03:16.817218 master-0 kubenswrapper[18707]: I0320 09:03:16.817166 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-srzmq" Mar 20 09:03:16.828708 master-0 kubenswrapper[18707]: I0320 09:03:16.828661 18707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jbfpt" event={"ID":"01d1e35b-ff25-4eb4-b550-30debdab6fa0","Type":"ContainerStarted","Data":"e2829306600fcd953387d33060ea12e37a62452a3ecdf3eb3e6511d6ca03a2c4"} Mar 20 09:03:16.832197 master-0 kubenswrapper[18707]: I0320 09:03:16.829540 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jbfpt" Mar 20 09:03:16.844211 master-0 kubenswrapper[18707]: I0320 09:03:16.842759 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-2zdk8" event={"ID":"6ca698eb-ae09-4cfd-ac04-7350744653a6","Type":"ContainerStarted","Data":"702522559fe6c448e63bd72811a9ff6deccac9683408a47194ec9717a884fa0b"} Mar 20 09:03:16.844211 master-0 kubenswrapper[18707]: I0320 09:03:16.843545 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-2zdk8" Mar 20 09:03:16.856666 master-0 kubenswrapper[18707]: I0320 09:03:16.856579 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5rvds" event={"ID":"f296b061-4bb0-4fb1-9896-3cd75e58e81c","Type":"ContainerStarted","Data":"c5a6ef0fb640d8db943473da5a5f2b493978ed44089a56974789748e018ee100"} Mar 20 09:03:16.860208 master-0 kubenswrapper[18707]: I0320 09:03:16.857424 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5rvds" Mar 20 09:03:16.872609 master-0 kubenswrapper[18707]: I0320 09:03:16.872573 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9wvcj" 
event={"ID":"ca0c1d44-613d-4619-9a29-9cfeb3b57bc3","Type":"ContainerStarted","Data":"7b184f841858ded42e2c9997e2bf8cdf42edc6fa2c3573b68ef948d051a264fa"} Mar 20 09:03:16.873208 master-0 kubenswrapper[18707]: I0320 09:03:16.873161 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9wvcj" Mar 20 09:03:16.887210 master-0 kubenswrapper[18707]: I0320 09:03:16.883249 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2" event={"ID":"b9627b82-4e5a-4ceb-a906-0657397e66e9","Type":"ContainerStarted","Data":"f53c606f4e16a36fa27913165478c4d3fb9a888f7b6c436d260e86fcf335ac1a"} Mar 20 09:03:16.887210 master-0 kubenswrapper[18707]: I0320 09:03:16.883540 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2" Mar 20 09:03:16.898206 master-0 kubenswrapper[18707]: I0320 09:03:16.894411 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lbvvg" event={"ID":"54ea9e6a-4a15-4797-9a0d-110409a277f1","Type":"ContainerStarted","Data":"4a24f684223b42ab6af844ec17afe1f2536942d1bc800748dfcbd7eb62a2b33c"} Mar 20 09:03:16.898206 master-0 kubenswrapper[18707]: I0320 09:03:16.894664 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lbvvg" Mar 20 09:03:16.915220 master-0 kubenswrapper[18707]: I0320 09:03:16.912137 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" event={"ID":"24bbc2db-37fd-4bae-a9e5-9edb02f2c783","Type":"ContainerStarted","Data":"e6482c6356b9a5da416c9d5e6c0c378863aa5a586768f387dea975a4671d40f8"} Mar 20 09:03:16.915220 master-0 kubenswrapper[18707]: I0320 09:03:16.912178 18707 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" Mar 20 09:03:16.934216 master-0 kubenswrapper[18707]: I0320 09:03:16.930340 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kp7cg" event={"ID":"ee78381d-189c-4734-b105-5a47fd7af734","Type":"ContainerStarted","Data":"ab8237bef172602222f4fccceeb363519b63fb292eab7778c371ee88c060cb9e"} Mar 20 09:03:16.934216 master-0 kubenswrapper[18707]: I0320 09:03:16.930935 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kp7cg" Mar 20 09:03:16.941560 master-0 kubenswrapper[18707]: I0320 09:03:16.941502 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vxvcl" event={"ID":"9e52ec39-7a12-433b-8d5c-2df52aa87657","Type":"ContainerStarted","Data":"90f50edc8ed2e0cc65f15c203329680c1e9041338250723b98f21e04767b1051"} Mar 20 09:03:16.942333 master-0 kubenswrapper[18707]: I0320 09:03:16.942267 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vxvcl" Mar 20 09:03:16.954592 master-0 kubenswrapper[18707]: I0320 09:03:16.954532 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9" event={"ID":"fbd2d905-b71b-408c-b669-1a0a43ce9b2f","Type":"ContainerStarted","Data":"1b9453326bd9a7cb96022afcc440bb825bec854f4cbc7038e50291151c9f1f39"} Mar 20 09:03:16.955540 master-0 kubenswrapper[18707]: I0320 09:03:16.955476 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9" Mar 20 09:03:16.967659 master-0 kubenswrapper[18707]: I0320 09:03:16.967581 18707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t875d" event={"ID":"f0712ef1-3983-4b74-8985-16e6f0d9ed18","Type":"ContainerStarted","Data":"817e258d27e43bf081075618ceb7a199fe207a97cc978f75d37640f5c78a5768"} Mar 20 09:03:16.969215 master-0 kubenswrapper[18707]: I0320 09:03:16.968933 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t875d" Mar 20 09:03:16.981645 master-0 kubenswrapper[18707]: I0320 09:03:16.981403 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-d49s9" event={"ID":"88c892a9-383f-47bf-a78c-7498cfdc49fa","Type":"ContainerStarted","Data":"5db100a14de53dd76aa66cc3ce97a13f0bc1abaa9ba6e53dbbb8730f12514493"} Mar 20 09:03:16.984253 master-0 kubenswrapper[18707]: I0320 09:03:16.982552 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-d49s9" Mar 20 09:03:16.999682 master-0 kubenswrapper[18707]: I0320 09:03:16.999406 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bghwc" event={"ID":"426a6be9-aeba-4865-bea3-0eacca0f445f","Type":"ContainerStarted","Data":"859f2225e6a351c1437614f06cd8bc5ac945eec5b22dba832fb549c1c77a3ff1"} Mar 20 09:03:17.003212 master-0 kubenswrapper[18707]: I0320 09:03:17.000272 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bghwc" Mar 20 09:03:17.020398 master-0 kubenswrapper[18707]: I0320 09:03:17.020355 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" 
event={"ID":"109b7903-cd46-4a38-93c2-87253251c130","Type":"ContainerStarted","Data":"86d8f99eccf592938e10dac5be1c4825de6d5200d9e172d031c96eaa477ba53b"} Mar 20 09:03:17.020398 master-0 kubenswrapper[18707]: I0320 09:03:17.020391 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" Mar 20 09:03:17.192207 master-0 kubenswrapper[18707]: I0320 09:03:17.191117 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nt245" podStartSLOduration=6.219817223 podStartE2EDuration="27.191103237s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:52.452005023 +0000 UTC m=+1317.608185379" lastFinishedPulling="2026-03-20 09:03:13.423291007 +0000 UTC m=+1338.579471393" observedRunningTime="2026-03-20 09:03:17.043683864 +0000 UTC m=+1342.199864220" watchObservedRunningTime="2026-03-20 09:03:17.191103237 +0000 UTC m=+1342.347283593" Mar 20 09:03:17.393660 master-0 kubenswrapper[18707]: I0320 09:03:17.393591 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jbfpt" podStartSLOduration=10.923848954 podStartE2EDuration="27.393555802s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:53.64943866 +0000 UTC m=+1318.805619016" lastFinishedPulling="2026-03-20 09:03:10.119145498 +0000 UTC m=+1335.275325864" observedRunningTime="2026-03-20 09:03:17.347534337 +0000 UTC m=+1342.503714693" watchObservedRunningTime="2026-03-20 09:03:17.393555802 +0000 UTC m=+1342.549736158" Mar 20 09:03:17.405847 master-0 kubenswrapper[18707]: I0320 09:03:17.405760 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kp7cg" podStartSLOduration=6.831006103 
podStartE2EDuration="26.40572774s" podCreationTimestamp="2026-03-20 09:02:51 +0000 UTC" firstStartedPulling="2026-03-20 09:02:53.872224386 +0000 UTC m=+1319.028404742" lastFinishedPulling="2026-03-20 09:03:13.446946013 +0000 UTC m=+1338.603126379" observedRunningTime="2026-03-20 09:03:17.393751307 +0000 UTC m=+1342.549931663" watchObservedRunningTime="2026-03-20 09:03:17.40572774 +0000 UTC m=+1342.561908096" Mar 20 09:03:17.520213 master-0 kubenswrapper[18707]: I0320 09:03:17.516490 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s2dqt" podStartSLOduration=6.680106211 podStartE2EDuration="26.516470924s" podCreationTimestamp="2026-03-20 09:02:51 +0000 UTC" firstStartedPulling="2026-03-20 09:02:55.878288052 +0000 UTC m=+1321.034468408" lastFinishedPulling="2026-03-20 09:03:15.714652765 +0000 UTC m=+1340.870833121" observedRunningTime="2026-03-20 09:03:17.513556711 +0000 UTC m=+1342.669737067" watchObservedRunningTime="2026-03-20 09:03:17.516470924 +0000 UTC m=+1342.672651280" Mar 20 09:03:17.580998 master-0 kubenswrapper[18707]: I0320 09:03:17.580919 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5rvds" podStartSLOduration=7.569039057 podStartE2EDuration="27.580902336s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:53.41251761 +0000 UTC m=+1318.568697976" lastFinishedPulling="2026-03-20 09:03:13.424380859 +0000 UTC m=+1338.580561255" observedRunningTime="2026-03-20 09:03:17.574206944 +0000 UTC m=+1342.730387300" watchObservedRunningTime="2026-03-20 09:03:17.580902336 +0000 UTC m=+1342.737082692" Mar 20 09:03:17.643327 master-0 kubenswrapper[18707]: I0320 09:03:17.643249 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-srzmq" 
podStartSLOduration=5.799029603 podStartE2EDuration="26.643228726s" podCreationTimestamp="2026-03-20 09:02:51 +0000 UTC" firstStartedPulling="2026-03-20 09:02:53.905174688 +0000 UTC m=+1319.061355044" lastFinishedPulling="2026-03-20 09:03:14.749373811 +0000 UTC m=+1339.905554167" observedRunningTime="2026-03-20 09:03:17.639489919 +0000 UTC m=+1342.795670275" watchObservedRunningTime="2026-03-20 09:03:17.643228726 +0000 UTC m=+1342.799409082" Mar 20 09:03:17.711564 master-0 kubenswrapper[18707]: I0320 09:03:17.711419 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-2zdk8" podStartSLOduration=14.165397843 podStartE2EDuration="27.711399954s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:52.843772567 +0000 UTC m=+1317.999952923" lastFinishedPulling="2026-03-20 09:03:06.389774678 +0000 UTC m=+1331.545955034" observedRunningTime="2026-03-20 09:03:17.705901417 +0000 UTC m=+1342.862081773" watchObservedRunningTime="2026-03-20 09:03:17.711399954 +0000 UTC m=+1342.867580310" Mar 20 09:03:17.740573 master-0 kubenswrapper[18707]: I0320 09:03:17.740495 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9wvcj" podStartSLOduration=8.194508558 podStartE2EDuration="27.740475434s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:53.900948317 +0000 UTC m=+1319.057128673" lastFinishedPulling="2026-03-20 09:03:13.446915163 +0000 UTC m=+1338.603095549" observedRunningTime="2026-03-20 09:03:17.731748785 +0000 UTC m=+1342.887929141" watchObservedRunningTime="2026-03-20 09:03:17.740475434 +0000 UTC m=+1342.896655791" Mar 20 09:03:17.788619 master-0 kubenswrapper[18707]: I0320 09:03:17.788515 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bghwc" podStartSLOduration=6.781825461 podStartE2EDuration="27.788492667s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:52.416336983 +0000 UTC m=+1317.572517339" lastFinishedPulling="2026-03-20 09:03:13.423004179 +0000 UTC m=+1338.579184545" observedRunningTime="2026-03-20 09:03:17.781280341 +0000 UTC m=+1342.937460697" watchObservedRunningTime="2026-03-20 09:03:17.788492667 +0000 UTC m=+1342.944673023" Mar 20 09:03:17.832050 master-0 kubenswrapper[18707]: I0320 09:03:17.831970 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-d49s9" podStartSLOduration=9.117627548 podStartE2EDuration="27.831954269s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:53.412079837 +0000 UTC m=+1318.568260193" lastFinishedPulling="2026-03-20 09:03:12.126406538 +0000 UTC m=+1337.282586914" observedRunningTime="2026-03-20 09:03:17.829397566 +0000 UTC m=+1342.985577922" watchObservedRunningTime="2026-03-20 09:03:17.831954269 +0000 UTC m=+1342.988134625" Mar 20 09:03:17.860483 master-0 kubenswrapper[18707]: I0320 09:03:17.860396 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94tfj" podStartSLOduration=11.160032192 podStartE2EDuration="27.860373011s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:53.418784999 +0000 UTC m=+1318.574965355" lastFinishedPulling="2026-03-20 09:03:10.119125808 +0000 UTC m=+1335.275306174" observedRunningTime="2026-03-20 09:03:17.856963963 +0000 UTC m=+1343.013144319" watchObservedRunningTime="2026-03-20 09:03:17.860373011 +0000 UTC m=+1343.016553367" Mar 20 09:03:17.896368 master-0 kubenswrapper[18707]: I0320 09:03:17.896302 18707 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vxvcl" podStartSLOduration=7.914812976 podStartE2EDuration="27.896285287s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:53.464268638 +0000 UTC m=+1318.620448994" lastFinishedPulling="2026-03-20 09:03:13.445740949 +0000 UTC m=+1338.601921305" observedRunningTime="2026-03-20 09:03:17.888159755 +0000 UTC m=+1343.044340111" watchObservedRunningTime="2026-03-20 09:03:17.896285287 +0000 UTC m=+1343.052465643" Mar 20 09:03:17.929107 master-0 kubenswrapper[18707]: I0320 09:03:17.929001 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2" podStartSLOduration=10.687178959 podStartE2EDuration="27.928981451s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:52.878705595 +0000 UTC m=+1318.034885941" lastFinishedPulling="2026-03-20 09:03:10.120508067 +0000 UTC m=+1335.276688433" observedRunningTime="2026-03-20 09:03:17.914965581 +0000 UTC m=+1343.071145937" watchObservedRunningTime="2026-03-20 09:03:17.928981451 +0000 UTC m=+1343.085161807" Mar 20 09:03:17.982982 master-0 kubenswrapper[18707]: I0320 09:03:17.979695 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9" podStartSLOduration=5.036257879 podStartE2EDuration="27.97967369s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:52.852101885 +0000 UTC m=+1318.008282241" lastFinishedPulling="2026-03-20 09:03:15.795517696 +0000 UTC m=+1340.951698052" observedRunningTime="2026-03-20 09:03:17.950779684 +0000 UTC m=+1343.106960040" watchObservedRunningTime="2026-03-20 09:03:17.97967369 +0000 UTC m=+1343.135854046" Mar 20 09:03:17.982982 master-0 kubenswrapper[18707]: I0320 09:03:17.981453 18707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" podStartSLOduration=7.435681141 podStartE2EDuration="26.981444621s" podCreationTimestamp="2026-03-20 09:02:51 +0000 UTC" firstStartedPulling="2026-03-20 09:02:53.901199414 +0000 UTC m=+1319.057379770" lastFinishedPulling="2026-03-20 09:03:13.446962884 +0000 UTC m=+1338.603143250" observedRunningTime="2026-03-20 09:03:17.976149419 +0000 UTC m=+1343.132329775" watchObservedRunningTime="2026-03-20 09:03:17.981444621 +0000 UTC m=+1343.137624977" Mar 20 09:03:18.068229 master-0 kubenswrapper[18707]: I0320 09:03:18.068137 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" podStartSLOduration=8.255512132 podStartE2EDuration="28.068112117s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:53.634335008 +0000 UTC m=+1318.790515364" lastFinishedPulling="2026-03-20 09:03:13.446934953 +0000 UTC m=+1338.603115349" observedRunningTime="2026-03-20 09:03:18.055553298 +0000 UTC m=+1343.211733664" watchObservedRunningTime="2026-03-20 09:03:18.068112117 +0000 UTC m=+1343.224292473" Mar 20 09:03:18.078203 master-0 kubenswrapper[18707]: I0320 09:03:18.072553 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lbvvg" podStartSLOduration=9.477916401 podStartE2EDuration="27.072544484s" podCreationTimestamp="2026-03-20 09:02:51 +0000 UTC" firstStartedPulling="2026-03-20 09:02:55.871734635 +0000 UTC m=+1321.027914991" lastFinishedPulling="2026-03-20 09:03:13.466362688 +0000 UTC m=+1338.622543074" observedRunningTime="2026-03-20 09:03:18.02061132 +0000 UTC m=+1343.176791676" watchObservedRunningTime="2026-03-20 09:03:18.072544484 +0000 UTC m=+1343.228724840" Mar 20 09:03:18.108218 master-0 kubenswrapper[18707]: I0320 
09:03:18.106567 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t875d" podStartSLOduration=8.326669065 podStartE2EDuration="28.106542415s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:02:53.644749986 +0000 UTC m=+1318.800930332" lastFinishedPulling="2026-03-20 09:03:13.424623326 +0000 UTC m=+1338.580803682" observedRunningTime="2026-03-20 09:03:18.086788441 +0000 UTC m=+1343.242968797" watchObservedRunningTime="2026-03-20 09:03:18.106542415 +0000 UTC m=+1343.262722771" Mar 20 09:03:21.000724 master-0 kubenswrapper[18707]: I0320 09:03:21.000621 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-vds6m" Mar 20 09:03:21.055852 master-0 kubenswrapper[18707]: I0320 09:03:21.055063 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-sfshp" Mar 20 09:03:21.114237 master-0 kubenswrapper[18707]: I0320 09:03:21.113813 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-nt245" Mar 20 09:03:21.164963 master-0 kubenswrapper[18707]: I0320 09:03:21.164914 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-bghwc" Mar 20 09:03:21.218399 master-0 kubenswrapper[18707]: I0320 09:03:21.218156 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5h2n9" Mar 20 09:03:21.373590 master-0 kubenswrapper[18707]: I0320 09:03:21.373530 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jbfpt" Mar 20 
09:03:21.481365 master-0 kubenswrapper[18707]: I0320 09:03:21.481269 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2" Mar 20 09:03:21.563027 master-0 kubenswrapper[18707]: I0320 09:03:21.562956 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-2zdk8" Mar 20 09:03:21.615950 master-0 kubenswrapper[18707]: I0320 09:03:21.615137 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-d49s9" Mar 20 09:03:21.626570 master-0 kubenswrapper[18707]: I0320 09:03:21.625218 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-vxvcl" Mar 20 09:03:21.668200 master-0 kubenswrapper[18707]: I0320 09:03:21.668110 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94tfj" Mar 20 09:03:21.692941 master-0 kubenswrapper[18707]: I0320 09:03:21.692898 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" Mar 20 09:03:21.906111 master-0 kubenswrapper[18707]: I0320 09:03:21.905844 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-t875d" Mar 20 09:03:21.926546 master-0 kubenswrapper[18707]: I0320 09:03:21.926481 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9wvcj" Mar 20 09:03:21.946266 master-0 kubenswrapper[18707]: I0320 09:03:21.943454 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-5rvds" Mar 20 09:03:22.245705 master-0 kubenswrapper[18707]: I0320 09:03:22.245505 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-srzmq" Mar 20 09:03:22.331637 master-0 kubenswrapper[18707]: I0320 09:03:22.331555 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-lbvvg" Mar 20 09:03:22.365922 master-0 kubenswrapper[18707]: I0320 09:03:22.365838 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" Mar 20 09:03:22.388267 master-0 kubenswrapper[18707]: I0320 09:03:22.386252 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kp7cg" Mar 20 09:03:22.994225 master-0 kubenswrapper[18707]: I0320 09:03:22.993395 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-7rp5f\" (UID: \"f0c833ae-7f6c-4ead-b991-040550152e41\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f" Mar 20 09:03:22.998685 master-0 kubenswrapper[18707]: I0320 09:03:22.998620 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0c833ae-7f6c-4ead-b991-040550152e41-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-7rp5f\" (UID: \"f0c833ae-7f6c-4ead-b991-040550152e41\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f" Mar 20 09:03:23.229899 master-0 kubenswrapper[18707]: I0320 09:03:23.229828 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f" Mar 20 09:03:23.505433 master-0 kubenswrapper[18707]: I0320 09:03:23.504714 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899s6vn4\" (UID: \"8053f444-54ad-4a79-8bac-8ee78d1d081b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:03:23.512790 master-0 kubenswrapper[18707]: I0320 09:03:23.511229 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8053f444-54ad-4a79-8bac-8ee78d1d081b-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899s6vn4\" (UID: \"8053f444-54ad-4a79-8bac-8ee78d1d081b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:03:23.635859 master-0 kubenswrapper[18707]: I0320 09:03:23.635782 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:03:23.729324 master-0 kubenswrapper[18707]: I0320 09:03:23.723404 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f"] Mar 20 09:03:23.734648 master-0 kubenswrapper[18707]: W0320 09:03:23.732588 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0c833ae_7f6c_4ead_b991_040550152e41.slice/crio-0ee0f0d43be489669ffde33959502d30d96f2fbd4daf6940c2c2033b8e4936f9 WatchSource:0}: Error finding container 0ee0f0d43be489669ffde33959502d30d96f2fbd4daf6940c2c2033b8e4936f9: Status 404 returned error can't find the container with id 0ee0f0d43be489669ffde33959502d30d96f2fbd4daf6940c2c2033b8e4936f9 Mar 20 09:03:23.810645 master-0 kubenswrapper[18707]: I0320 09:03:23.810573 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:03:23.810881 master-0 kubenswrapper[18707]: I0320 09:03:23.810849 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:03:23.814587 master-0 kubenswrapper[18707]: I0320 09:03:23.814543 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:03:23.818069 master-0 kubenswrapper[18707]: I0320 09:03:23.818020 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98570795-1969-459b-9703-aca836adab58-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-cfkmp\" (UID: \"98570795-1969-459b-9703-aca836adab58\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:03:23.918211 master-0 kubenswrapper[18707]: I0320 09:03:23.918133 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:03:24.126687 master-0 kubenswrapper[18707]: I0320 09:03:24.125988 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f" event={"ID":"f0c833ae-7f6c-4ead-b991-040550152e41","Type":"ContainerStarted","Data":"0ee0f0d43be489669ffde33959502d30d96f2fbd4daf6940c2c2033b8e4936f9"} Mar 20 09:03:24.162986 master-0 kubenswrapper[18707]: I0320 09:03:24.162916 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4"] Mar 20 09:03:24.403854 master-0 kubenswrapper[18707]: I0320 09:03:24.403795 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp"] Mar 20 09:03:24.405407 master-0 kubenswrapper[18707]: W0320 09:03:24.405369 18707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98570795_1969_459b_9703_aca836adab58.slice/crio-27cd7456bfbbc301f1f75eb30dcbf66f7990480deac8a3ce97faac9fb315bfb9 WatchSource:0}: Error finding container 27cd7456bfbbc301f1f75eb30dcbf66f7990480deac8a3ce97faac9fb315bfb9: Status 404 returned error can't find the container with id 27cd7456bfbbc301f1f75eb30dcbf66f7990480deac8a3ce97faac9fb315bfb9 Mar 20 09:03:25.151159 master-0 kubenswrapper[18707]: I0320 09:03:25.151091 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" event={"ID":"98570795-1969-459b-9703-aca836adab58","Type":"ContainerStarted","Data":"34f762151f014d788e52d819ad7b33e1beb18449981bad9cc8b0de79a448dca7"} Mar 20 09:03:25.151159 master-0 kubenswrapper[18707]: I0320 09:03:25.151151 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" event={"ID":"98570795-1969-459b-9703-aca836adab58","Type":"ContainerStarted","Data":"27cd7456bfbbc301f1f75eb30dcbf66f7990480deac8a3ce97faac9fb315bfb9"} Mar 20 09:03:25.153741 master-0 kubenswrapper[18707]: I0320 09:03:25.151947 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:03:25.153741 master-0 kubenswrapper[18707]: I0320 09:03:25.153434 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" event={"ID":"8053f444-54ad-4a79-8bac-8ee78d1d081b","Type":"ContainerStarted","Data":"4c5da69c7ab4a31e8562290a486861987272084d5c0ecafe9d18b8bcddda918f"} Mar 20 09:03:25.235634 master-0 kubenswrapper[18707]: I0320 09:03:25.235539 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" 
podStartSLOduration=34.235512781 podStartE2EDuration="34.235512781s" podCreationTimestamp="2026-03-20 09:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:03:25.230276402 +0000 UTC m=+1350.386456748" watchObservedRunningTime="2026-03-20 09:03:25.235512781 +0000 UTC m=+1350.391693147" Mar 20 09:03:27.174170 master-0 kubenswrapper[18707]: I0320 09:03:27.174101 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f" event={"ID":"f0c833ae-7f6c-4ead-b991-040550152e41","Type":"ContainerStarted","Data":"68ff2b4e1d49eff05b4e4cd36ce33892d4a6cd4a7b0b5b543adb8c584cb7a473"} Mar 20 09:03:27.174741 master-0 kubenswrapper[18707]: I0320 09:03:27.174243 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f" Mar 20 09:03:27.178905 master-0 kubenswrapper[18707]: I0320 09:03:27.178841 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" event={"ID":"8053f444-54ad-4a79-8bac-8ee78d1d081b","Type":"ContainerStarted","Data":"891dfc58f84db38966e061317bdfe7ab0aa0a40372ac95b82f623fb3383a9447"} Mar 20 09:03:27.179250 master-0 kubenswrapper[18707]: I0320 09:03:27.179219 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:03:27.204973 master-0 kubenswrapper[18707]: I0320 09:03:27.204877 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f" podStartSLOduration=34.186870937 podStartE2EDuration="37.204846638s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:03:23.752603167 +0000 UTC 
m=+1348.908783533" lastFinishedPulling="2026-03-20 09:03:26.770578878 +0000 UTC m=+1351.926759234" observedRunningTime="2026-03-20 09:03:27.191709512 +0000 UTC m=+1352.347889878" watchObservedRunningTime="2026-03-20 09:03:27.204846638 +0000 UTC m=+1352.361026994" Mar 20 09:03:27.244581 master-0 kubenswrapper[18707]: I0320 09:03:27.244502 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" podStartSLOduration=34.611125609 podStartE2EDuration="37.24448319s" podCreationTimestamp="2026-03-20 09:02:50 +0000 UTC" firstStartedPulling="2026-03-20 09:03:24.187848464 +0000 UTC m=+1349.344028840" lastFinishedPulling="2026-03-20 09:03:26.821206065 +0000 UTC m=+1351.977386421" observedRunningTime="2026-03-20 09:03:27.22697676 +0000 UTC m=+1352.383157116" watchObservedRunningTime="2026-03-20 09:03:27.24448319 +0000 UTC m=+1352.400663546" Mar 20 09:03:33.236628 master-0 kubenswrapper[18707]: I0320 09:03:33.235990 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-7rp5f" Mar 20 09:03:33.642943 master-0 kubenswrapper[18707]: I0320 09:03:33.642868 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:03:33.927359 master-0 kubenswrapper[18707]: I0320 09:03:33.925896 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-cfkmp" Mar 20 09:04:18.960013 master-0 kubenswrapper[18707]: I0320 09:04:18.958504 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-qwz94"] Mar 20 09:04:18.965232 master-0 kubenswrapper[18707]: I0320 09:04:18.960664 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-qwz94" Mar 20 09:04:18.965232 master-0 kubenswrapper[18707]: I0320 09:04:18.964691 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 09:04:18.965232 master-0 kubenswrapper[18707]: I0320 09:04:18.964882 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 09:04:18.966288 master-0 kubenswrapper[18707]: I0320 09:04:18.965690 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 09:04:18.970226 master-0 kubenswrapper[18707]: I0320 09:04:18.966738 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-qwz94"] Mar 20 09:04:19.003700 master-0 kubenswrapper[18707]: I0320 09:04:19.003642 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-24d8n"] Mar 20 09:04:19.013748 master-0 kubenswrapper[18707]: I0320 09:04:19.013688 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" Mar 20 09:04:19.017081 master-0 kubenswrapper[18707]: I0320 09:04:19.017037 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 09:04:19.036316 master-0 kubenswrapper[18707]: I0320 09:04:19.036260 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-24d8n"] Mar 20 09:04:19.068653 master-0 kubenswrapper[18707]: I0320 09:04:19.068606 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d69l\" (UniqueName: \"kubernetes.io/projected/5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f-kube-api-access-4d69l\") pod \"dnsmasq-dns-685c76cf85-qwz94\" (UID: \"5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f\") " pod="openstack/dnsmasq-dns-685c76cf85-qwz94" Mar 20 09:04:19.068802 master-0 kubenswrapper[18707]: I0320 09:04:19.068691 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f-config\") pod \"dnsmasq-dns-685c76cf85-qwz94\" (UID: \"5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f\") " pod="openstack/dnsmasq-dns-685c76cf85-qwz94" Mar 20 09:04:19.171272 master-0 kubenswrapper[18707]: I0320 09:04:19.171194 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d69l\" (UniqueName: \"kubernetes.io/projected/5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f-kube-api-access-4d69l\") pod \"dnsmasq-dns-685c76cf85-qwz94\" (UID: \"5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f\") " pod="openstack/dnsmasq-dns-685c76cf85-qwz94" Mar 20 09:04:19.172448 master-0 kubenswrapper[18707]: I0320 09:04:19.171822 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f-config\") pod \"dnsmasq-dns-685c76cf85-qwz94\" (UID: 
\"5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f\") " pod="openstack/dnsmasq-dns-685c76cf85-qwz94" Mar 20 09:04:19.172448 master-0 kubenswrapper[18707]: I0320 09:04:19.172281 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bmhb\" (UniqueName: \"kubernetes.io/projected/ccf0e477-d464-4b68-8116-f067791e0b48-kube-api-access-8bmhb\") pod \"dnsmasq-dns-8476fd89bc-24d8n\" (UID: \"ccf0e477-d464-4b68-8116-f067791e0b48\") " pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" Mar 20 09:04:19.172448 master-0 kubenswrapper[18707]: I0320 09:04:19.172349 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccf0e477-d464-4b68-8116-f067791e0b48-config\") pod \"dnsmasq-dns-8476fd89bc-24d8n\" (UID: \"ccf0e477-d464-4b68-8116-f067791e0b48\") " pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" Mar 20 09:04:19.172448 master-0 kubenswrapper[18707]: I0320 09:04:19.172424 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccf0e477-d464-4b68-8116-f067791e0b48-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-24d8n\" (UID: \"ccf0e477-d464-4b68-8116-f067791e0b48\") " pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" Mar 20 09:04:19.172905 master-0 kubenswrapper[18707]: I0320 09:04:19.172863 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f-config\") pod \"dnsmasq-dns-685c76cf85-qwz94\" (UID: \"5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f\") " pod="openstack/dnsmasq-dns-685c76cf85-qwz94" Mar 20 09:04:19.188010 master-0 kubenswrapper[18707]: I0320 09:04:19.187957 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d69l\" (UniqueName: \"kubernetes.io/projected/5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f-kube-api-access-4d69l\") pod 
\"dnsmasq-dns-685c76cf85-qwz94\" (UID: \"5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f\") " pod="openstack/dnsmasq-dns-685c76cf85-qwz94" Mar 20 09:04:19.275231 master-0 kubenswrapper[18707]: I0320 09:04:19.275039 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bmhb\" (UniqueName: \"kubernetes.io/projected/ccf0e477-d464-4b68-8116-f067791e0b48-kube-api-access-8bmhb\") pod \"dnsmasq-dns-8476fd89bc-24d8n\" (UID: \"ccf0e477-d464-4b68-8116-f067791e0b48\") " pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" Mar 20 09:04:19.275615 master-0 kubenswrapper[18707]: I0320 09:04:19.275581 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccf0e477-d464-4b68-8116-f067791e0b48-config\") pod \"dnsmasq-dns-8476fd89bc-24d8n\" (UID: \"ccf0e477-d464-4b68-8116-f067791e0b48\") " pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" Mar 20 09:04:19.275805 master-0 kubenswrapper[18707]: I0320 09:04:19.275779 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccf0e477-d464-4b68-8116-f067791e0b48-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-24d8n\" (UID: \"ccf0e477-d464-4b68-8116-f067791e0b48\") " pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" Mar 20 09:04:19.278261 master-0 kubenswrapper[18707]: I0320 09:04:19.276676 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccf0e477-d464-4b68-8116-f067791e0b48-config\") pod \"dnsmasq-dns-8476fd89bc-24d8n\" (UID: \"ccf0e477-d464-4b68-8116-f067791e0b48\") " pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" Mar 20 09:04:19.278261 master-0 kubenswrapper[18707]: I0320 09:04:19.276676 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccf0e477-d464-4b68-8116-f067791e0b48-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-24d8n\" (UID: 
\"ccf0e477-d464-4b68-8116-f067791e0b48\") " pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" Mar 20 09:04:19.299033 master-0 kubenswrapper[18707]: I0320 09:04:19.298974 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bmhb\" (UniqueName: \"kubernetes.io/projected/ccf0e477-d464-4b68-8116-f067791e0b48-kube-api-access-8bmhb\") pod \"dnsmasq-dns-8476fd89bc-24d8n\" (UID: \"ccf0e477-d464-4b68-8116-f067791e0b48\") " pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" Mar 20 09:04:19.349201 master-0 kubenswrapper[18707]: I0320 09:04:19.349118 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-qwz94" Mar 20 09:04:19.361813 master-0 kubenswrapper[18707]: I0320 09:04:19.361759 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" Mar 20 09:04:19.865543 master-0 kubenswrapper[18707]: I0320 09:04:19.865393 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-24d8n"] Mar 20 09:04:19.877325 master-0 kubenswrapper[18707]: W0320 09:04:19.876659 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccf0e477_d464_4b68_8116_f067791e0b48.slice/crio-472bfcf4169aaac8d89260648a997394cad89285f067162897ebbbcc8fd9e23d WatchSource:0}: Error finding container 472bfcf4169aaac8d89260648a997394cad89285f067162897ebbbcc8fd9e23d: Status 404 returned error can't find the container with id 472bfcf4169aaac8d89260648a997394cad89285f067162897ebbbcc8fd9e23d Mar 20 09:04:19.903009 master-0 kubenswrapper[18707]: I0320 09:04:19.902964 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" event={"ID":"ccf0e477-d464-4b68-8116-f067791e0b48","Type":"ContainerStarted","Data":"472bfcf4169aaac8d89260648a997394cad89285f067162897ebbbcc8fd9e23d"} Mar 20 09:04:19.943612 master-0 
kubenswrapper[18707]: W0320 09:04:19.943576 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f8e4b58_6209_4dde_94fd_4b8ddcf13a9f.slice/crio-650edb47d722e326c258d0a7578e69c6c7204da6a81edd628fa75a28d616f9d8 WatchSource:0}: Error finding container 650edb47d722e326c258d0a7578e69c6c7204da6a81edd628fa75a28d616f9d8: Status 404 returned error can't find the container with id 650edb47d722e326c258d0a7578e69c6c7204da6a81edd628fa75a28d616f9d8 Mar 20 09:04:19.948750 master-0 kubenswrapper[18707]: I0320 09:04:19.948684 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-qwz94"] Mar 20 09:04:20.917121 master-0 kubenswrapper[18707]: I0320 09:04:20.917046 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-qwz94" event={"ID":"5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f","Type":"ContainerStarted","Data":"650edb47d722e326c258d0a7578e69c6c7204da6a81edd628fa75a28d616f9d8"} Mar 20 09:04:21.528798 master-0 kubenswrapper[18707]: I0320 09:04:21.528695 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-qwz94"] Mar 20 09:04:21.546595 master-0 kubenswrapper[18707]: I0320 09:04:21.546533 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586dbdbb8c-28nq4"] Mar 20 09:04:21.554331 master-0 kubenswrapper[18707]: I0320 09:04:21.552041 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:21.576549 master-0 kubenswrapper[18707]: I0320 09:04:21.573816 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586dbdbb8c-28nq4"] Mar 20 09:04:21.633521 master-0 kubenswrapper[18707]: I0320 09:04:21.632618 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65x7g\" (UniqueName: \"kubernetes.io/projected/cbb818c8-8429-4748-a924-2b7f77d812da-kube-api-access-65x7g\") pod \"dnsmasq-dns-586dbdbb8c-28nq4\" (UID: \"cbb818c8-8429-4748-a924-2b7f77d812da\") " pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:21.633521 master-0 kubenswrapper[18707]: I0320 09:04:21.632732 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbb818c8-8429-4748-a924-2b7f77d812da-config\") pod \"dnsmasq-dns-586dbdbb8c-28nq4\" (UID: \"cbb818c8-8429-4748-a924-2b7f77d812da\") " pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:21.633521 master-0 kubenswrapper[18707]: I0320 09:04:21.632779 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbb818c8-8429-4748-a924-2b7f77d812da-dns-svc\") pod \"dnsmasq-dns-586dbdbb8c-28nq4\" (UID: \"cbb818c8-8429-4748-a924-2b7f77d812da\") " pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:21.734738 master-0 kubenswrapper[18707]: I0320 09:04:21.734680 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbb818c8-8429-4748-a924-2b7f77d812da-config\") pod \"dnsmasq-dns-586dbdbb8c-28nq4\" (UID: \"cbb818c8-8429-4748-a924-2b7f77d812da\") " pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:21.734969 master-0 kubenswrapper[18707]: I0320 09:04:21.734764 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbb818c8-8429-4748-a924-2b7f77d812da-dns-svc\") pod \"dnsmasq-dns-586dbdbb8c-28nq4\" (UID: \"cbb818c8-8429-4748-a924-2b7f77d812da\") " pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:21.734969 master-0 kubenswrapper[18707]: I0320 09:04:21.734899 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65x7g\" (UniqueName: \"kubernetes.io/projected/cbb818c8-8429-4748-a924-2b7f77d812da-kube-api-access-65x7g\") pod \"dnsmasq-dns-586dbdbb8c-28nq4\" (UID: \"cbb818c8-8429-4748-a924-2b7f77d812da\") " pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:21.735964 master-0 kubenswrapper[18707]: I0320 09:04:21.735911 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbb818c8-8429-4748-a924-2b7f77d812da-config\") pod \"dnsmasq-dns-586dbdbb8c-28nq4\" (UID: \"cbb818c8-8429-4748-a924-2b7f77d812da\") " pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:21.736046 master-0 kubenswrapper[18707]: I0320 09:04:21.735991 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbb818c8-8429-4748-a924-2b7f77d812da-dns-svc\") pod \"dnsmasq-dns-586dbdbb8c-28nq4\" (UID: \"cbb818c8-8429-4748-a924-2b7f77d812da\") " pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:21.775631 master-0 kubenswrapper[18707]: I0320 09:04:21.775580 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65x7g\" (UniqueName: \"kubernetes.io/projected/cbb818c8-8429-4748-a924-2b7f77d812da-kube-api-access-65x7g\") pod \"dnsmasq-dns-586dbdbb8c-28nq4\" (UID: \"cbb818c8-8429-4748-a924-2b7f77d812da\") " pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:21.923306 master-0 kubenswrapper[18707]: I0320 09:04:21.920944 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:21.954280 master-0 kubenswrapper[18707]: I0320 09:04:21.954215 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-24d8n"] Mar 20 09:04:22.015262 master-0 kubenswrapper[18707]: I0320 09:04:22.008455 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-47t6r"] Mar 20 09:04:22.028422 master-0 kubenswrapper[18707]: I0320 09:04:22.028366 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:04:22.037506 master-0 kubenswrapper[18707]: I0320 09:04:22.032035 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-47t6r"] Mar 20 09:04:22.049241 master-0 kubenswrapper[18707]: I0320 09:04:22.048436 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-47t6r\" (UID: \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:04:22.054546 master-0 kubenswrapper[18707]: I0320 09:04:22.054408 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-config\") pod \"dnsmasq-dns-6ff8fd9d5c-47t6r\" (UID: \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:04:22.054728 master-0 kubenswrapper[18707]: I0320 09:04:22.054703 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nvfv\" (UniqueName: \"kubernetes.io/projected/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-kube-api-access-4nvfv\") pod \"dnsmasq-dns-6ff8fd9d5c-47t6r\" (UID: 
\"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:04:22.157573 master-0 kubenswrapper[18707]: I0320 09:04:22.157339 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nvfv\" (UniqueName: \"kubernetes.io/projected/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-kube-api-access-4nvfv\") pod \"dnsmasq-dns-6ff8fd9d5c-47t6r\" (UID: \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:04:22.157573 master-0 kubenswrapper[18707]: I0320 09:04:22.157456 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-47t6r\" (UID: \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:04:22.157573 master-0 kubenswrapper[18707]: I0320 09:04:22.157520 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-config\") pod \"dnsmasq-dns-6ff8fd9d5c-47t6r\" (UID: \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:04:22.160697 master-0 kubenswrapper[18707]: I0320 09:04:22.160669 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-47t6r\" (UID: \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:04:22.161364 master-0 kubenswrapper[18707]: I0320 09:04:22.161316 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-config\") pod \"dnsmasq-dns-6ff8fd9d5c-47t6r\" (UID: \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\") " 
pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:04:22.192886 master-0 kubenswrapper[18707]: I0320 09:04:22.183064 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nvfv\" (UniqueName: \"kubernetes.io/projected/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-kube-api-access-4nvfv\") pod \"dnsmasq-dns-6ff8fd9d5c-47t6r\" (UID: \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:04:22.386443 master-0 kubenswrapper[18707]: I0320 09:04:22.382360 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:04:23.375695 master-0 kubenswrapper[18707]: I0320 09:04:23.375620 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586dbdbb8c-28nq4"] Mar 20 09:04:23.381422 master-0 kubenswrapper[18707]: W0320 09:04:23.381376 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbb818c8_8429_4748_a924_2b7f77d812da.slice/crio-17983fb7a58c903ba4d9a1f0ce014948be6414a68f01b83eb4a88a89876d673b WatchSource:0}: Error finding container 17983fb7a58c903ba4d9a1f0ce014948be6414a68f01b83eb4a88a89876d673b: Status 404 returned error can't find the container with id 17983fb7a58c903ba4d9a1f0ce014948be6414a68f01b83eb4a88a89876d673b Mar 20 09:04:23.385230 master-0 kubenswrapper[18707]: I0320 09:04:23.385178 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-47t6r"] Mar 20 09:04:23.980178 master-0 kubenswrapper[18707]: I0320 09:04:23.980104 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" event={"ID":"cbb818c8-8429-4748-a924-2b7f77d812da","Type":"ContainerStarted","Data":"17983fb7a58c903ba4d9a1f0ce014948be6414a68f01b83eb4a88a89876d673b"} Mar 20 09:04:24.028880 master-0 kubenswrapper[18707]: I0320 09:04:24.028802 18707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" event={"ID":"01a3ea60-7016-443c-9fb2-aefe2bd1ee89","Type":"ContainerStarted","Data":"82bee70d4c604fe1302f41dbcc75ae958b2f7d127c50166358639a16e86586d7"} Mar 20 09:04:25.754432 master-0 kubenswrapper[18707]: I0320 09:04:25.754358 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 09:04:25.757441 master-0 kubenswrapper[18707]: I0320 09:04:25.757388 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 09:04:25.766343 master-0 kubenswrapper[18707]: I0320 09:04:25.763731 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 09:04:25.766343 master-0 kubenswrapper[18707]: I0320 09:04:25.763939 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 09:04:25.766343 master-0 kubenswrapper[18707]: I0320 09:04:25.764068 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 09:04:25.766343 master-0 kubenswrapper[18707]: I0320 09:04:25.764259 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 09:04:25.766343 master-0 kubenswrapper[18707]: I0320 09:04:25.764371 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 09:04:25.766343 master-0 kubenswrapper[18707]: I0320 09:04:25.764498 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 09:04:25.768642 master-0 kubenswrapper[18707]: I0320 09:04:25.768558 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 09:04:25.917294 master-0 kubenswrapper[18707]: I0320 09:04:25.917245 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jjm9g\" (UniqueName: \"kubernetes.io/projected/ce336090-2814-4550-a4c2-dd726e9b6ad2-kube-api-access-jjm9g\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:25.918458 master-0 kubenswrapper[18707]: I0320 09:04:25.918220 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ce336090-2814-4550-a4c2-dd726e9b6ad2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:25.918588 master-0 kubenswrapper[18707]: I0320 09:04:25.918573 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ce336090-2814-4550-a4c2-dd726e9b6ad2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:25.919591 master-0 kubenswrapper[18707]: I0320 09:04:25.919572 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ce336090-2814-4550-a4c2-dd726e9b6ad2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:25.920369 master-0 kubenswrapper[18707]: I0320 09:04:25.919897 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ce336090-2814-4550-a4c2-dd726e9b6ad2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:25.920499 master-0 kubenswrapper[18707]: I0320 09:04:25.920482 18707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ce336090-2814-4550-a4c2-dd726e9b6ad2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:25.920983 master-0 kubenswrapper[18707]: I0320 09:04:25.920965 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ce336090-2814-4550-a4c2-dd726e9b6ad2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:25.921275 master-0 kubenswrapper[18707]: I0320 09:04:25.921258 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3a9e7590-481d-455b-97e9-a4c82858f1df\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a69f0ee9-152c-44ca-8f6d-f979a3f16d1d\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:25.921620 master-0 kubenswrapper[18707]: I0320 09:04:25.921600 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ce336090-2814-4550-a4c2-dd726e9b6ad2-config-data\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:25.921715 master-0 kubenswrapper[18707]: I0320 09:04:25.921701 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ce336090-2814-4550-a4c2-dd726e9b6ad2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:25.921835 master-0 kubenswrapper[18707]: I0320 09:04:25.921819 18707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ce336090-2814-4550-a4c2-dd726e9b6ad2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.024719 master-0 kubenswrapper[18707]: I0320 09:04:26.024225 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ce336090-2814-4550-a4c2-dd726e9b6ad2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.024719 master-0 kubenswrapper[18707]: I0320 09:04:26.024299 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ce336090-2814-4550-a4c2-dd726e9b6ad2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.024719 master-0 kubenswrapper[18707]: I0320 09:04:26.024414 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ce336090-2814-4550-a4c2-dd726e9b6ad2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.024719 master-0 kubenswrapper[18707]: I0320 09:04:26.024449 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ce336090-2814-4550-a4c2-dd726e9b6ad2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.024719 master-0 kubenswrapper[18707]: I0320 09:04:26.024476 18707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ce336090-2814-4550-a4c2-dd726e9b6ad2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.024719 master-0 kubenswrapper[18707]: I0320 09:04:26.024511 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ce336090-2814-4550-a4c2-dd726e9b6ad2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.025135 master-0 kubenswrapper[18707]: I0320 09:04:26.024922 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ce336090-2814-4550-a4c2-dd726e9b6ad2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.025135 master-0 kubenswrapper[18707]: I0320 09:04:26.025074 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ce336090-2814-4550-a4c2-dd726e9b6ad2-config-data\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.025135 master-0 kubenswrapper[18707]: I0320 09:04:26.025134 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ce336090-2814-4550-a4c2-dd726e9b6ad2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.025288 master-0 kubenswrapper[18707]: I0320 09:04:26.025257 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ce336090-2814-4550-a4c2-dd726e9b6ad2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.026123 master-0 kubenswrapper[18707]: I0320 09:04:26.025392 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjm9g\" (UniqueName: \"kubernetes.io/projected/ce336090-2814-4550-a4c2-dd726e9b6ad2-kube-api-access-jjm9g\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.026721 master-0 kubenswrapper[18707]: I0320 09:04:26.026370 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ce336090-2814-4550-a4c2-dd726e9b6ad2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.026721 master-0 kubenswrapper[18707]: I0320 09:04:26.026580 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ce336090-2814-4550-a4c2-dd726e9b6ad2-config-data\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.027031 master-0 kubenswrapper[18707]: I0320 09:04:26.026990 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ce336090-2814-4550-a4c2-dd726e9b6ad2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.028409 master-0 kubenswrapper[18707]: I0320 09:04:26.028356 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ce336090-2814-4550-a4c2-dd726e9b6ad2-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.032460 master-0 kubenswrapper[18707]: I0320 09:04:26.032392 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ce336090-2814-4550-a4c2-dd726e9b6ad2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.037428 master-0 kubenswrapper[18707]: I0320 09:04:26.035779 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ce336090-2814-4550-a4c2-dd726e9b6ad2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.037602 master-0 kubenswrapper[18707]: I0320 09:04:26.035789 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ce336090-2814-4550-a4c2-dd726e9b6ad2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.038591 master-0 kubenswrapper[18707]: I0320 09:04:26.038533 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ce336090-2814-4550-a4c2-dd726e9b6ad2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.451897 master-0 kubenswrapper[18707]: I0320 09:04:26.451813 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3a9e7590-481d-455b-97e9-a4c82858f1df\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a69f0ee9-152c-44ca-8f6d-f979a3f16d1d\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.468615 
master-0 kubenswrapper[18707]: I0320 09:04:26.465593 18707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 09:04:26.468615 master-0 kubenswrapper[18707]: I0320 09:04:26.465656 18707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3a9e7590-481d-455b-97e9-a4c82858f1df\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a69f0ee9-152c-44ca-8f6d-f979a3f16d1d\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/f0677222ec8cbdd67d2ca39419310a51dd6bf476075ca0fb8c82f74692304c7c/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.476870 master-0 kubenswrapper[18707]: I0320 09:04:26.476821 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjm9g\" (UniqueName: \"kubernetes.io/projected/ce336090-2814-4550-a4c2-dd726e9b6ad2-kube-api-access-jjm9g\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:26.619670 master-0 kubenswrapper[18707]: I0320 09:04:26.614366 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 09:04:26.619670 master-0 kubenswrapper[18707]: I0320 09:04:26.615808 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 20 09:04:26.628152 master-0 kubenswrapper[18707]: I0320 09:04:26.627579 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 09:04:26.644197 master-0 kubenswrapper[18707]: I0320 09:04:26.631620 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 09:04:26.644197 master-0 kubenswrapper[18707]: I0320 09:04:26.635062 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 09:04:26.646877 master-0 kubenswrapper[18707]: I0320 09:04:26.646749 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 09:04:26.779261 master-0 kubenswrapper[18707]: I0320 09:04:26.779118 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e517ef1-932a-41e6-85ac-e7004261a504-config-data\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.779261 master-0 kubenswrapper[18707]: I0320 09:04:26.779176 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7rxr\" (UniqueName: \"kubernetes.io/projected/8e517ef1-932a-41e6-85ac-e7004261a504-kube-api-access-v7rxr\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.779261 master-0 kubenswrapper[18707]: I0320 09:04:26.779232 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e517ef1-932a-41e6-85ac-e7004261a504-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.779991 master-0 kubenswrapper[18707]: I0320 
09:04:26.779313 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8e517ef1-932a-41e6-85ac-e7004261a504-kolla-config\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.779991 master-0 kubenswrapper[18707]: I0320 09:04:26.779445 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e517ef1-932a-41e6-85ac-e7004261a504-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.882096 master-0 kubenswrapper[18707]: I0320 09:04:26.880941 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8e517ef1-932a-41e6-85ac-e7004261a504-kolla-config\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.882096 master-0 kubenswrapper[18707]: I0320 09:04:26.881071 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e517ef1-932a-41e6-85ac-e7004261a504-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.882096 master-0 kubenswrapper[18707]: I0320 09:04:26.881174 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e517ef1-932a-41e6-85ac-e7004261a504-config-data\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.882096 master-0 kubenswrapper[18707]: I0320 09:04:26.881234 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v7rxr\" (UniqueName: \"kubernetes.io/projected/8e517ef1-932a-41e6-85ac-e7004261a504-kube-api-access-v7rxr\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.882096 master-0 kubenswrapper[18707]: I0320 09:04:26.881276 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e517ef1-932a-41e6-85ac-e7004261a504-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.883964 master-0 kubenswrapper[18707]: I0320 09:04:26.882777 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e517ef1-932a-41e6-85ac-e7004261a504-config-data\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.883964 master-0 kubenswrapper[18707]: I0320 09:04:26.883777 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8e517ef1-932a-41e6-85ac-e7004261a504-kolla-config\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.887832 master-0 kubenswrapper[18707]: I0320 09:04:26.887776 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e517ef1-932a-41e6-85ac-e7004261a504-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.891260 master-0 kubenswrapper[18707]: I0320 09:04:26.891211 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e517ef1-932a-41e6-85ac-e7004261a504-combined-ca-bundle\") pod \"memcached-0\" (UID: 
\"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.899662 master-0 kubenswrapper[18707]: I0320 09:04:26.899619 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7rxr\" (UniqueName: \"kubernetes.io/projected/8e517ef1-932a-41e6-85ac-e7004261a504-kube-api-access-v7rxr\") pod \"memcached-0\" (UID: \"8e517ef1-932a-41e6-85ac-e7004261a504\") " pod="openstack/memcached-0" Mar 20 09:04:26.978087 master-0 kubenswrapper[18707]: I0320 09:04:26.977931 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 09:04:27.207347 master-0 kubenswrapper[18707]: I0320 09:04:27.207244 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 09:04:27.215264 master-0 kubenswrapper[18707]: I0320 09:04:27.211332 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.226709 master-0 kubenswrapper[18707]: I0320 09:04:27.226637 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 09:04:27.226961 master-0 kubenswrapper[18707]: I0320 09:04:27.226916 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 09:04:27.227225 master-0 kubenswrapper[18707]: I0320 09:04:27.227139 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 09:04:27.227351 master-0 kubenswrapper[18707]: I0320 09:04:27.227326 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 09:04:27.227540 master-0 kubenswrapper[18707]: I0320 09:04:27.227509 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 09:04:27.227727 master-0 kubenswrapper[18707]: I0320 
09:04:27.227689 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 09:04:27.268649 master-0 kubenswrapper[18707]: I0320 09:04:27.268584 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 09:04:27.400643 master-0 kubenswrapper[18707]: I0320 09:04:27.398412 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.400643 master-0 kubenswrapper[18707]: I0320 09:04:27.398487 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.400643 master-0 kubenswrapper[18707]: I0320 09:04:27.398518 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.400643 master-0 kubenswrapper[18707]: I0320 09:04:27.398539 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.400643 master-0 kubenswrapper[18707]: 
I0320 09:04:27.398570 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.400643 master-0 kubenswrapper[18707]: I0320 09:04:27.398590 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnw5h\" (UniqueName: \"kubernetes.io/projected/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-kube-api-access-hnw5h\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.400643 master-0 kubenswrapper[18707]: I0320 09:04:27.398616 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.400643 master-0 kubenswrapper[18707]: I0320 09:04:27.398636 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.400643 master-0 kubenswrapper[18707]: I0320 09:04:27.398655 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.400643 master-0 kubenswrapper[18707]: I0320 09:04:27.398676 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.400643 master-0 kubenswrapper[18707]: I0320 09:04:27.398703 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-83923b42-a350-4692-9612-54b29c1c60d2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^283ebff6-332f-4029-94e3-76d7654172d1\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.500053 master-0 kubenswrapper[18707]: I0320 09:04:27.499917 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.500053 master-0 kubenswrapper[18707]: I0320 09:04:27.499997 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.500053 master-0 kubenswrapper[18707]: I0320 09:04:27.500017 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.500053 master-0 kubenswrapper[18707]: I0320 09:04:27.500052 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.500444 master-0 kubenswrapper[18707]: I0320 09:04:27.500069 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnw5h\" (UniqueName: \"kubernetes.io/projected/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-kube-api-access-hnw5h\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.500444 master-0 kubenswrapper[18707]: I0320 09:04:27.500096 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.500444 master-0 kubenswrapper[18707]: I0320 09:04:27.500115 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.500444 master-0 kubenswrapper[18707]: I0320 09:04:27.500137 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.500444 master-0 kubenswrapper[18707]: I0320 09:04:27.500155 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.500444 master-0 kubenswrapper[18707]: I0320 09:04:27.500198 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-83923b42-a350-4692-9612-54b29c1c60d2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^283ebff6-332f-4029-94e3-76d7654172d1\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.500444 master-0 kubenswrapper[18707]: I0320 09:04:27.500279 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.501452 master-0 kubenswrapper[18707]: I0320 09:04:27.501232 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.504677 master-0 kubenswrapper[18707]: I0320 09:04:27.504636 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.510412 master-0 kubenswrapper[18707]: I0320 09:04:27.509808 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.510659 master-0 kubenswrapper[18707]: I0320 09:04:27.510482 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.513302 master-0 kubenswrapper[18707]: I0320 09:04:27.513024 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.516264 master-0 kubenswrapper[18707]: I0320 09:04:27.515095 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.519997 master-0 kubenswrapper[18707]: I0320 09:04:27.519874 18707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 09:04:27.519997 master-0 kubenswrapper[18707]: I0320 09:04:27.519923 18707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-83923b42-a350-4692-9612-54b29c1c60d2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^283ebff6-332f-4029-94e3-76d7654172d1\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c09a53762ee9113d78df2216a3b69fe91751392601ad7f72489185d9cd3988ec/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.521699 master-0 kubenswrapper[18707]: I0320 09:04:27.521359 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.524469 master-0 kubenswrapper[18707]: I0320 09:04:27.523960 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.534880 master-0 kubenswrapper[18707]: I0320 09:04:27.526563 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:27.535848 master-0 kubenswrapper[18707]: I0320 09:04:27.535811 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnw5h\" (UniqueName: \"kubernetes.io/projected/211c6c3f-43f8-4ae9-86a1-ca7d393db4e7-kube-api-access-hnw5h\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:28.129925 master-0 kubenswrapper[18707]: I0320 09:04:28.129852 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 09:04:28.139212 master-0 kubenswrapper[18707]: I0320 09:04:28.137646 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 09:04:28.169108 master-0 kubenswrapper[18707]: I0320 09:04:28.161839 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 09:04:28.169108 master-0 kubenswrapper[18707]: I0320 09:04:28.162202 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 09:04:28.169108 master-0 kubenswrapper[18707]: I0320 09:04:28.163664 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 09:04:28.175987 master-0 kubenswrapper[18707]: I0320 09:04:28.175366 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3a9e7590-481d-455b-97e9-a4c82858f1df\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a69f0ee9-152c-44ca-8f6d-f979a3f16d1d\") pod \"rabbitmq-server-0\" (UID: \"ce336090-2814-4550-a4c2-dd726e9b6ad2\") " pod="openstack/rabbitmq-server-0" Mar 20 09:04:28.200669 master-0 kubenswrapper[18707]: I0320 09:04:28.200611 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 09:04:28.240542 master-0 kubenswrapper[18707]: I0320 09:04:28.240265 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 09:04:28.241103 master-0 kubenswrapper[18707]: I0320 09:04:28.241011 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7f5b2eea-768b-474c-90fe-c7bb50fdf552\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f2afd0d9-bfd5-439f-9870-4251d96903b4\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.241103 master-0 kubenswrapper[18707]: I0320 09:04:28.241088 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ff08b89-230c-4277-8558-42f92623f53f-kolla-config\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.241369 master-0 kubenswrapper[18707]: I0320 09:04:28.241339 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8ff08b89-230c-4277-8558-42f92623f53f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.241452 master-0 kubenswrapper[18707]: I0320 09:04:28.241427 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ff08b89-230c-4277-8558-42f92623f53f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.241724 master-0 kubenswrapper[18707]: I0320 09:04:28.241611 18707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8ff08b89-230c-4277-8558-42f92623f53f-config-data-default\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.242659 master-0 kubenswrapper[18707]: I0320 09:04:28.241750 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwbvn\" (UniqueName: \"kubernetes.io/projected/8ff08b89-230c-4277-8558-42f92623f53f-kube-api-access-rwbvn\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.242659 master-0 kubenswrapper[18707]: I0320 09:04:28.241859 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ff08b89-230c-4277-8558-42f92623f53f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.242659 master-0 kubenswrapper[18707]: I0320 09:04:28.241906 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff08b89-230c-4277-8558-42f92623f53f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.344397 master-0 kubenswrapper[18707]: I0320 09:04:28.343989 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8ff08b89-230c-4277-8558-42f92623f53f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.344397 master-0 kubenswrapper[18707]: I0320 
09:04:28.344050 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ff08b89-230c-4277-8558-42f92623f53f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.344844 master-0 kubenswrapper[18707]: I0320 09:04:28.344488 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8ff08b89-230c-4277-8558-42f92623f53f-config-data-default\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.344844 master-0 kubenswrapper[18707]: I0320 09:04:28.344709 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwbvn\" (UniqueName: \"kubernetes.io/projected/8ff08b89-230c-4277-8558-42f92623f53f-kube-api-access-rwbvn\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.344844 master-0 kubenswrapper[18707]: I0320 09:04:28.344767 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8ff08b89-230c-4277-8558-42f92623f53f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.345347 master-0 kubenswrapper[18707]: I0320 09:04:28.344899 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ff08b89-230c-4277-8558-42f92623f53f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.345347 master-0 kubenswrapper[18707]: I0320 09:04:28.344950 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff08b89-230c-4277-8558-42f92623f53f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.345668 master-0 kubenswrapper[18707]: I0320 09:04:28.345403 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7f5b2eea-768b-474c-90fe-c7bb50fdf552\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f2afd0d9-bfd5-439f-9870-4251d96903b4\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.345668 master-0 kubenswrapper[18707]: I0320 09:04:28.345433 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ff08b89-230c-4277-8558-42f92623f53f-kolla-config\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.346542 master-0 kubenswrapper[18707]: I0320 09:04:28.346319 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8ff08b89-230c-4277-8558-42f92623f53f-config-data-default\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.346667 master-0 kubenswrapper[18707]: I0320 09:04:28.346643 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ff08b89-230c-4277-8558-42f92623f53f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.351035 master-0 kubenswrapper[18707]: I0320 09:04:28.350979 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/8ff08b89-230c-4277-8558-42f92623f53f-kolla-config\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.351477 master-0 kubenswrapper[18707]: I0320 09:04:28.351436 18707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 09:04:28.351576 master-0 kubenswrapper[18707]: I0320 09:04:28.351500 18707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7f5b2eea-768b-474c-90fe-c7bb50fdf552\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f2afd0d9-bfd5-439f-9870-4251d96903b4\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/2ddfc0c80fa3c7364b6a78dd843e305cb6793b50f71ead0d1b27257b86beb850/globalmount\"" pod="openstack/openstack-galera-0" Mar 20 09:04:28.358638 master-0 kubenswrapper[18707]: I0320 09:04:28.353076 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ff08b89-230c-4277-8558-42f92623f53f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.374320 master-0 kubenswrapper[18707]: I0320 09:04:28.360583 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff08b89-230c-4277-8558-42f92623f53f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:28.374320 master-0 kubenswrapper[18707]: I0320 09:04:28.370026 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwbvn\" (UniqueName: \"kubernetes.io/projected/8ff08b89-230c-4277-8558-42f92623f53f-kube-api-access-rwbvn\") pod 
\"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:29.020204 master-0 kubenswrapper[18707]: I0320 09:04:29.013659 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 09:04:29.020204 master-0 kubenswrapper[18707]: I0320 09:04:29.015658 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.020204 master-0 kubenswrapper[18707]: I0320 09:04:29.020008 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 09:04:29.020539 master-0 kubenswrapper[18707]: I0320 09:04:29.020449 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 09:04:29.020576 master-0 kubenswrapper[18707]: I0320 09:04:29.020568 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 09:04:29.033060 master-0 kubenswrapper[18707]: I0320 09:04:29.032999 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 09:04:29.067042 master-0 kubenswrapper[18707]: I0320 09:04:29.066978 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccecdff5-9216-446b-bfcc-0406fc64996b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.067042 master-0 kubenswrapper[18707]: I0320 09:04:29.067039 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-759b0178-3d51-437c-811b-ccaeafcbe2c4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^23be835f-0701-4d21-8226-3f72c49cfcf0\") pod \"openstack-cell1-galera-0\" (UID: 
\"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.067398 master-0 kubenswrapper[18707]: I0320 09:04:29.067077 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ccecdff5-9216-446b-bfcc-0406fc64996b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.067398 master-0 kubenswrapper[18707]: I0320 09:04:29.067109 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccecdff5-9216-446b-bfcc-0406fc64996b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.067398 master-0 kubenswrapper[18707]: I0320 09:04:29.067156 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ccecdff5-9216-446b-bfcc-0406fc64996b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.067398 master-0 kubenswrapper[18707]: I0320 09:04:29.067241 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ccecdff5-9216-446b-bfcc-0406fc64996b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.067398 master-0 kubenswrapper[18707]: I0320 09:04:29.067260 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ccecdff5-9216-446b-bfcc-0406fc64996b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.067398 master-0 kubenswrapper[18707]: I0320 09:04:29.067285 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwsl2\" (UniqueName: \"kubernetes.io/projected/ccecdff5-9216-446b-bfcc-0406fc64996b-kube-api-access-gwsl2\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.169505 master-0 kubenswrapper[18707]: I0320 09:04:29.169169 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ccecdff5-9216-446b-bfcc-0406fc64996b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.169505 master-0 kubenswrapper[18707]: I0320 09:04:29.169258 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccecdff5-9216-446b-bfcc-0406fc64996b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.173557 master-0 kubenswrapper[18707]: I0320 09:04:29.169515 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ccecdff5-9216-446b-bfcc-0406fc64996b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.173557 master-0 kubenswrapper[18707]: I0320 09:04:29.169935 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ccecdff5-9216-446b-bfcc-0406fc64996b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.173557 master-0 kubenswrapper[18707]: I0320 09:04:29.170336 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ccecdff5-9216-446b-bfcc-0406fc64996b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.173557 master-0 kubenswrapper[18707]: I0320 09:04:29.170682 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ccecdff5-9216-446b-bfcc-0406fc64996b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.173557 master-0 kubenswrapper[18707]: I0320 09:04:29.170740 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccecdff5-9216-446b-bfcc-0406fc64996b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.173557 master-0 kubenswrapper[18707]: I0320 09:04:29.170820 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwsl2\" (UniqueName: \"kubernetes.io/projected/ccecdff5-9216-446b-bfcc-0406fc64996b-kube-api-access-gwsl2\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.173557 master-0 kubenswrapper[18707]: I0320 09:04:29.171646 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccecdff5-9216-446b-bfcc-0406fc64996b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.173557 master-0 kubenswrapper[18707]: I0320 09:04:29.171711 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-759b0178-3d51-437c-811b-ccaeafcbe2c4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^23be835f-0701-4d21-8226-3f72c49cfcf0\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.173557 master-0 kubenswrapper[18707]: I0320 09:04:29.172288 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccecdff5-9216-446b-bfcc-0406fc64996b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.174432 master-0 kubenswrapper[18707]: I0320 09:04:29.174354 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ccecdff5-9216-446b-bfcc-0406fc64996b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.179178 master-0 kubenswrapper[18707]: I0320 09:04:29.179131 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccecdff5-9216-446b-bfcc-0406fc64996b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.179398 master-0 kubenswrapper[18707]: I0320 09:04:29.179359 18707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME 
capability not set. Skipping MountDevice... Mar 20 09:04:29.179470 master-0 kubenswrapper[18707]: I0320 09:04:29.179418 18707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-759b0178-3d51-437c-811b-ccaeafcbe2c4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^23be835f-0701-4d21-8226-3f72c49cfcf0\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/0d576015a8e90c82c785a652a2e9851c837d26ca445f5aac58bf4faf53d919d0/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.187696 master-0 kubenswrapper[18707]: I0320 09:04:29.187642 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccecdff5-9216-446b-bfcc-0406fc64996b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.193534 master-0 kubenswrapper[18707]: I0320 09:04:29.193473 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwsl2\" (UniqueName: \"kubernetes.io/projected/ccecdff5-9216-446b-bfcc-0406fc64996b-kube-api-access-gwsl2\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:29.541245 master-0 kubenswrapper[18707]: I0320 09:04:29.541172 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-83923b42-a350-4692-9612-54b29c1c60d2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^283ebff6-332f-4029-94e3-76d7654172d1\") pod \"rabbitmq-cell1-server-0\" (UID: \"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:29.662046 master-0 kubenswrapper[18707]: I0320 09:04:29.659757 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:04:30.612254 master-0 kubenswrapper[18707]: I0320 09:04:30.612151 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7f5b2eea-768b-474c-90fe-c7bb50fdf552\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f2afd0d9-bfd5-439f-9870-4251d96903b4\") pod \"openstack-galera-0\" (UID: \"8ff08b89-230c-4277-8558-42f92623f53f\") " pod="openstack/openstack-galera-0" Mar 20 09:04:30.905411 master-0 kubenswrapper[18707]: I0320 09:04:30.905038 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 09:04:31.651823 master-0 kubenswrapper[18707]: I0320 09:04:31.651752 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-759b0178-3d51-437c-811b-ccaeafcbe2c4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^23be835f-0701-4d21-8226-3f72c49cfcf0\") pod \"openstack-cell1-galera-0\" (UID: \"ccecdff5-9216-446b-bfcc-0406fc64996b\") " pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:31.687811 master-0 kubenswrapper[18707]: I0320 09:04:31.687744 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-q5t48"] Mar 20 09:04:31.689155 master-0 kubenswrapper[18707]: I0320 09:04:31.689113 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.692891 master-0 kubenswrapper[18707]: I0320 09:04:31.692849 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 09:04:31.693275 master-0 kubenswrapper[18707]: I0320 09:04:31.693020 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 09:04:31.730540 master-0 kubenswrapper[18707]: I0320 09:04:31.729782 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q5t48"] Mar 20 09:04:31.741474 master-0 kubenswrapper[18707]: I0320 09:04:31.737880 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b75e9b7b-6504-4904-96af-66385e6649e4-scripts\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.741474 master-0 kubenswrapper[18707]: I0320 09:04:31.737999 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b75e9b7b-6504-4904-96af-66385e6649e4-var-log-ovn\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.741474 master-0 kubenswrapper[18707]: I0320 09:04:31.738043 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75e9b7b-6504-4904-96af-66385e6649e4-combined-ca-bundle\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.741474 master-0 kubenswrapper[18707]: I0320 09:04:31.738083 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rr7pt\" (UniqueName: \"kubernetes.io/projected/b75e9b7b-6504-4904-96af-66385e6649e4-kube-api-access-rr7pt\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.741474 master-0 kubenswrapper[18707]: I0320 09:04:31.738106 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b75e9b7b-6504-4904-96af-66385e6649e4-ovn-controller-tls-certs\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.741474 master-0 kubenswrapper[18707]: I0320 09:04:31.738127 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b75e9b7b-6504-4904-96af-66385e6649e4-var-run-ovn\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.741474 master-0 kubenswrapper[18707]: I0320 09:04:31.738231 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b75e9b7b-6504-4904-96af-66385e6649e4-var-run\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.773149 master-0 kubenswrapper[18707]: I0320 09:04:31.773018 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 09:04:31.776490 master-0 kubenswrapper[18707]: I0320 09:04:31.774124 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-hz4j4"] Mar 20 09:04:31.778303 master-0 kubenswrapper[18707]: I0320 09:04:31.778254 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.805808 master-0 kubenswrapper[18707]: I0320 09:04:31.805755 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hz4j4"] Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840091 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b75e9b7b-6504-4904-96af-66385e6649e4-var-log-ovn\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840148 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75e9b7b-6504-4904-96af-66385e6649e4-combined-ca-bundle\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840174 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/14a23f75-fa11-462d-8277-c7611103bf48-etc-ovs\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840213 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr7pt\" (UniqueName: \"kubernetes.io/projected/b75e9b7b-6504-4904-96af-66385e6649e4-kube-api-access-rr7pt\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840234 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b75e9b7b-6504-4904-96af-66385e6649e4-ovn-controller-tls-certs\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840252 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b75e9b7b-6504-4904-96af-66385e6649e4-var-run-ovn\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840287 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj7ld\" (UniqueName: \"kubernetes.io/projected/14a23f75-fa11-462d-8277-c7611103bf48-kube-api-access-pj7ld\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840330 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b75e9b7b-6504-4904-96af-66385e6649e4-var-run\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840349 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/14a23f75-fa11-462d-8277-c7611103bf48-var-lib\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840399 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14a23f75-fa11-462d-8277-c7611103bf48-var-run\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840416 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b75e9b7b-6504-4904-96af-66385e6649e4-scripts\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840438 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a23f75-fa11-462d-8277-c7611103bf48-scripts\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840456 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/14a23f75-fa11-462d-8277-c7611103bf48-var-log\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.840915 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b75e9b7b-6504-4904-96af-66385e6649e4-var-log-ovn\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.845689 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/b75e9b7b-6504-4904-96af-66385e6649e4-var-run\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.845858 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b75e9b7b-6504-4904-96af-66385e6649e4-var-run-ovn\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.846355 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75e9b7b-6504-4904-96af-66385e6649e4-combined-ca-bundle\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.847238 master-0 kubenswrapper[18707]: I0320 09:04:31.846416 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b75e9b7b-6504-4904-96af-66385e6649e4-ovn-controller-tls-certs\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.848157 master-0 kubenswrapper[18707]: I0320 09:04:31.847539 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b75e9b7b-6504-4904-96af-66385e6649e4-scripts\") pod \"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.868589 master-0 kubenswrapper[18707]: I0320 09:04:31.862902 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr7pt\" (UniqueName: \"kubernetes.io/projected/b75e9b7b-6504-4904-96af-66385e6649e4-kube-api-access-rr7pt\") pod 
\"ovn-controller-q5t48\" (UID: \"b75e9b7b-6504-4904-96af-66385e6649e4\") " pod="openstack/ovn-controller-q5t48" Mar 20 09:04:31.942133 master-0 kubenswrapper[18707]: I0320 09:04:31.942078 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/14a23f75-fa11-462d-8277-c7611103bf48-etc-ovs\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.942377 master-0 kubenswrapper[18707]: I0320 09:04:31.942170 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj7ld\" (UniqueName: \"kubernetes.io/projected/14a23f75-fa11-462d-8277-c7611103bf48-kube-api-access-pj7ld\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.942377 master-0 kubenswrapper[18707]: I0320 09:04:31.942235 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/14a23f75-fa11-462d-8277-c7611103bf48-var-lib\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.942377 master-0 kubenswrapper[18707]: I0320 09:04:31.942288 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14a23f75-fa11-462d-8277-c7611103bf48-var-run\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.942377 master-0 kubenswrapper[18707]: I0320 09:04:31.942312 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a23f75-fa11-462d-8277-c7611103bf48-scripts\") pod \"ovn-controller-ovs-hz4j4\" (UID: 
\"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.942377 master-0 kubenswrapper[18707]: I0320 09:04:31.942329 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/14a23f75-fa11-462d-8277-c7611103bf48-var-log\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.942695 master-0 kubenswrapper[18707]: I0320 09:04:31.942620 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/14a23f75-fa11-462d-8277-c7611103bf48-var-log\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.942780 master-0 kubenswrapper[18707]: I0320 09:04:31.942695 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14a23f75-fa11-462d-8277-c7611103bf48-var-run\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.942780 master-0 kubenswrapper[18707]: I0320 09:04:31.942689 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/14a23f75-fa11-462d-8277-c7611103bf48-var-lib\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.942858 master-0 kubenswrapper[18707]: I0320 09:04:31.942830 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/14a23f75-fa11-462d-8277-c7611103bf48-etc-ovs\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.945546 master-0 
kubenswrapper[18707]: I0320 09:04:31.945279 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a23f75-fa11-462d-8277-c7611103bf48-scripts\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:31.973514 master-0 kubenswrapper[18707]: I0320 09:04:31.960236 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj7ld\" (UniqueName: \"kubernetes.io/projected/14a23f75-fa11-462d-8277-c7611103bf48-kube-api-access-pj7ld\") pod \"ovn-controller-ovs-hz4j4\" (UID: \"14a23f75-fa11-462d-8277-c7611103bf48\") " pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:32.029397 master-0 kubenswrapper[18707]: I0320 09:04:32.029342 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5t48" Mar 20 09:04:32.140822 master-0 kubenswrapper[18707]: I0320 09:04:32.140689 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:33.286801 master-0 kubenswrapper[18707]: I0320 09:04:33.286720 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 09:04:33.291824 master-0 kubenswrapper[18707]: I0320 09:04:33.291764 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.295751 master-0 kubenswrapper[18707]: I0320 09:04:33.295697 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 09:04:33.295970 master-0 kubenswrapper[18707]: I0320 09:04:33.295942 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 09:04:33.296301 master-0 kubenswrapper[18707]: I0320 09:04:33.296178 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 09:04:33.296698 master-0 kubenswrapper[18707]: I0320 09:04:33.296667 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 09:04:33.321309 master-0 kubenswrapper[18707]: I0320 09:04:33.312298 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 09:04:33.384035 master-0 kubenswrapper[18707]: I0320 09:04:33.383966 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c9f7f5f2-b30e-4ea6-89f6-a25b5651e9cf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a3918185-01a0-401e-a028-45f645215a4e\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.384281 master-0 kubenswrapper[18707]: I0320 09:04:33.384046 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2783ea79-a8c6-46e1-b089-957a5276b691-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.384281 master-0 kubenswrapper[18707]: I0320 09:04:33.384082 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2783ea79-a8c6-46e1-b089-957a5276b691-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.384281 master-0 kubenswrapper[18707]: I0320 09:04:33.384118 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2783ea79-a8c6-46e1-b089-957a5276b691-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.384281 master-0 kubenswrapper[18707]: I0320 09:04:33.384191 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2783ea79-a8c6-46e1-b089-957a5276b691-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.384281 master-0 kubenswrapper[18707]: I0320 09:04:33.384250 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2783ea79-a8c6-46e1-b089-957a5276b691-config\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.384281 master-0 kubenswrapper[18707]: I0320 09:04:33.384281 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2783ea79-a8c6-46e1-b089-957a5276b691-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.384619 master-0 kubenswrapper[18707]: I0320 09:04:33.384329 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2nkn\" (UniqueName: 
\"kubernetes.io/projected/2783ea79-a8c6-46e1-b089-957a5276b691-kube-api-access-k2nkn\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.488438 master-0 kubenswrapper[18707]: I0320 09:04:33.487943 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2nkn\" (UniqueName: \"kubernetes.io/projected/2783ea79-a8c6-46e1-b089-957a5276b691-kube-api-access-k2nkn\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.488438 master-0 kubenswrapper[18707]: I0320 09:04:33.488075 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c9f7f5f2-b30e-4ea6-89f6-a25b5651e9cf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a3918185-01a0-401e-a028-45f645215a4e\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.488438 master-0 kubenswrapper[18707]: I0320 09:04:33.488107 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2783ea79-a8c6-46e1-b089-957a5276b691-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.488438 master-0 kubenswrapper[18707]: I0320 09:04:33.488166 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2783ea79-a8c6-46e1-b089-957a5276b691-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.488438 master-0 kubenswrapper[18707]: I0320 09:04:33.488226 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2783ea79-a8c6-46e1-b089-957a5276b691-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.488438 master-0 kubenswrapper[18707]: I0320 09:04:33.488295 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2783ea79-a8c6-46e1-b089-957a5276b691-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.488438 master-0 kubenswrapper[18707]: I0320 09:04:33.488388 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2783ea79-a8c6-46e1-b089-957a5276b691-config\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.488438 master-0 kubenswrapper[18707]: I0320 09:04:33.488421 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2783ea79-a8c6-46e1-b089-957a5276b691-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.489886 master-0 kubenswrapper[18707]: I0320 09:04:33.489805 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2783ea79-a8c6-46e1-b089-957a5276b691-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.491302 master-0 kubenswrapper[18707]: I0320 09:04:33.491066 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2783ea79-a8c6-46e1-b089-957a5276b691-config\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " 
pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.491302 master-0 kubenswrapper[18707]: I0320 09:04:33.491164 18707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 09:04:33.491302 master-0 kubenswrapper[18707]: I0320 09:04:33.491205 18707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c9f7f5f2-b30e-4ea6-89f6-a25b5651e9cf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a3918185-01a0-401e-a028-45f645215a4e\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/f86fd77120f49c1f5401a20b5164ca154d1d69b503f7f4e74696c21659f751c5/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.493304 master-0 kubenswrapper[18707]: I0320 09:04:33.493247 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2783ea79-a8c6-46e1-b089-957a5276b691-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.496796 master-0 kubenswrapper[18707]: I0320 09:04:33.496704 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2783ea79-a8c6-46e1-b089-957a5276b691-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.500863 master-0 kubenswrapper[18707]: I0320 09:04:33.500801 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2783ea79-a8c6-46e1-b089-957a5276b691-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.503824 master-0 kubenswrapper[18707]: I0320 09:04:33.503787 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2783ea79-a8c6-46e1-b089-957a5276b691-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:33.549812 master-0 kubenswrapper[18707]: I0320 09:04:33.549764 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2nkn\" (UniqueName: \"kubernetes.io/projected/2783ea79-a8c6-46e1-b089-957a5276b691-kube-api-access-k2nkn\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:36.722565 master-0 kubenswrapper[18707]: I0320 09:04:36.720445 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c9f7f5f2-b30e-4ea6-89f6-a25b5651e9cf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a3918185-01a0-401e-a028-45f645215a4e\") pod \"ovsdbserver-nb-0\" (UID: \"2783ea79-a8c6-46e1-b089-957a5276b691\") " pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:36.934799 master-0 kubenswrapper[18707]: I0320 09:04:36.934647 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 09:04:38.752638 master-0 kubenswrapper[18707]: I0320 09:04:38.752557 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 09:04:38.754986 master-0 kubenswrapper[18707]: I0320 09:04:38.754929 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.760968 master-0 kubenswrapper[18707]: I0320 09:04:38.760909 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 09:04:38.761790 master-0 kubenswrapper[18707]: I0320 09:04:38.761749 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 09:04:38.762846 master-0 kubenswrapper[18707]: I0320 09:04:38.762822 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 09:04:38.776371 master-0 kubenswrapper[18707]: I0320 09:04:38.776314 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 09:04:38.833475 master-0 kubenswrapper[18707]: I0320 09:04:38.833389 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-config\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.833652 master-0 kubenswrapper[18707]: I0320 09:04:38.833518 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.833652 master-0 kubenswrapper[18707]: I0320 09:04:38.833599 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.833731 
master-0 kubenswrapper[18707]: I0320 09:04:38.833684 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.833731 master-0 kubenswrapper[18707]: I0320 09:04:38.833719 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.833838 master-0 kubenswrapper[18707]: I0320 09:04:38.833810 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.833889 master-0 kubenswrapper[18707]: I0320 09:04:38.833861 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5b494783-e9ed-438b-9ae2-64b525749c17\" (UniqueName: \"kubernetes.io/csi/topolvm.io^2cd01b01-a706-4c8e-8e8f-8699a146d4e9\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.833925 master-0 kubenswrapper[18707]: I0320 09:04:38.833897 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5f4j\" (UniqueName: \"kubernetes.io/projected/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-kube-api-access-d5f4j\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 
20 09:04:38.936150 master-0 kubenswrapper[18707]: I0320 09:04:38.936077 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.936368 master-0 kubenswrapper[18707]: I0320 09:04:38.936210 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.936368 master-0 kubenswrapper[18707]: I0320 09:04:38.936235 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.936368 master-0 kubenswrapper[18707]: I0320 09:04:38.936313 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.936368 master-0 kubenswrapper[18707]: I0320 09:04:38.936338 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5b494783-e9ed-438b-9ae2-64b525749c17\" (UniqueName: \"kubernetes.io/csi/topolvm.io^2cd01b01-a706-4c8e-8e8f-8699a146d4e9\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.936368 master-0 kubenswrapper[18707]: I0320 09:04:38.936362 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5f4j\" (UniqueName: \"kubernetes.io/projected/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-kube-api-access-d5f4j\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.936538 master-0 kubenswrapper[18707]: I0320 09:04:38.936410 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-config\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.936538 master-0 kubenswrapper[18707]: I0320 09:04:38.936450 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.938045 master-0 kubenswrapper[18707]: I0320 09:04:38.937795 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.938953 master-0 kubenswrapper[18707]: I0320 09:04:38.938924 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.939347 master-0 kubenswrapper[18707]: I0320 09:04:38.939310 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-config\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.939669 master-0 kubenswrapper[18707]: I0320 09:04:38.939591 18707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 09:04:38.940637 master-0 kubenswrapper[18707]: I0320 09:04:38.940597 18707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5b494783-e9ed-438b-9ae2-64b525749c17\" (UniqueName: \"kubernetes.io/csi/topolvm.io^2cd01b01-a706-4c8e-8e8f-8699a146d4e9\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/2c06a6e4432dc2a204228c08622fd7f7f360ab58ac6ab5479881babd635934ea/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.940706 master-0 kubenswrapper[18707]: I0320 09:04:38.940245 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.942014 master-0 kubenswrapper[18707]: I0320 09:04:38.941951 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.946685 master-0 kubenswrapper[18707]: I0320 09:04:38.946429 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-ovsdbserver-sb-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:38.955440 master-0 kubenswrapper[18707]: I0320 09:04:38.955395 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5f4j\" (UniqueName: \"kubernetes.io/projected/0efe880d-c00f-4de4-bf1b-26958fb5ac9b-kube-api-access-d5f4j\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:40.425268 master-0 kubenswrapper[18707]: I0320 09:04:40.425191 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 09:04:40.529789 master-0 kubenswrapper[18707]: I0320 09:04:40.523391 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5b494783-e9ed-438b-9ae2-64b525749c17\" (UniqueName: \"kubernetes.io/csi/topolvm.io^2cd01b01-a706-4c8e-8e8f-8699a146d4e9\") pod \"ovsdbserver-sb-0\" (UID: \"0efe880d-c00f-4de4-bf1b-26958fb5ac9b\") " pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:40.587232 master-0 kubenswrapper[18707]: I0320 09:04:40.587135 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 09:04:40.722415 master-0 kubenswrapper[18707]: I0320 09:04:40.722309 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 09:04:40.736956 master-0 kubenswrapper[18707]: I0320 09:04:40.736867 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 09:04:40.790739 master-0 kubenswrapper[18707]: W0320 09:04:40.790647 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod211c6c3f_43f8_4ae9_86a1_ca7d393db4e7.slice/crio-3e569e082c8e68a02bbf4d1ca770654808174732eb05d4d37138708a500dfa9b WatchSource:0}: Error finding container 3e569e082c8e68a02bbf4d1ca770654808174732eb05d4d37138708a500dfa9b: Status 404 returned error can't find the container with id 3e569e082c8e68a02bbf4d1ca770654808174732eb05d4d37138708a500dfa9b Mar 20 09:04:40.972015 master-0 kubenswrapper[18707]: I0320 09:04:40.970820 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 09:04:40.984807 master-0 kubenswrapper[18707]: W0320 09:04:40.984723 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2783ea79_a8c6_46e1_b089_957a5276b691.slice/crio-6a237153608f8cd12483f1d3da08e319e78e9ed2a05caa56ca2e4cd792c7036e WatchSource:0}: Error finding container 6a237153608f8cd12483f1d3da08e319e78e9ed2a05caa56ca2e4cd792c7036e: Status 404 returned error can't find the container with id 6a237153608f8cd12483f1d3da08e319e78e9ed2a05caa56ca2e4cd792c7036e Mar 20 09:04:41.032415 master-0 kubenswrapper[18707]: W0320 09:04:41.032350 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e517ef1_932a_41e6_85ac_e7004261a504.slice/crio-7a93fe074f818c34b5e5b04ae5e74e0b937ab7c3a3e3d7725dde0207c90b298f 
WatchSource:0}: Error finding container 7a93fe074f818c34b5e5b04ae5e74e0b937ab7c3a3e3d7725dde0207c90b298f: Status 404 returned error can't find the container with id 7a93fe074f818c34b5e5b04ae5e74e0b937ab7c3a3e3d7725dde0207c90b298f Mar 20 09:04:41.040807 master-0 kubenswrapper[18707]: I0320 09:04:41.040745 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 09:04:41.049452 master-0 kubenswrapper[18707]: I0320 09:04:41.049406 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q5t48"] Mar 20 09:04:41.062687 master-0 kubenswrapper[18707]: I0320 09:04:41.062630 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 09:04:41.134872 master-0 kubenswrapper[18707]: I0320 09:04:41.134623 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hz4j4"] Mar 20 09:04:41.258274 master-0 kubenswrapper[18707]: I0320 09:04:41.258209 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 09:04:41.259882 master-0 kubenswrapper[18707]: W0320 09:04:41.259841 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0efe880d_c00f_4de4_bf1b_26958fb5ac9b.slice/crio-53132dcf7937c067c8524848be17804b37698063f20063a186dec35ddc7d3963 WatchSource:0}: Error finding container 53132dcf7937c067c8524848be17804b37698063f20063a186dec35ddc7d3963: Status 404 returned error can't find the container with id 53132dcf7937c067c8524848be17804b37698063f20063a186dec35ddc7d3963 Mar 20 09:04:41.284883 master-0 kubenswrapper[18707]: I0320 09:04:41.284843 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ce336090-2814-4550-a4c2-dd726e9b6ad2","Type":"ContainerStarted","Data":"59e67ad5773c5601a3dc566e13238f3809333fb229d17cf3959a34641da6d328"} Mar 20 09:04:41.286178 master-0 
kubenswrapper[18707]: I0320 09:04:41.286136 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5t48" event={"ID":"b75e9b7b-6504-4904-96af-66385e6649e4","Type":"ContainerStarted","Data":"de43fee98a263f1bb2a82868788505d09bc029cd8f20966eacf619884f5d10ec"} Mar 20 09:04:41.288307 master-0 kubenswrapper[18707]: I0320 09:04:41.288274 18707 generic.go:334] "Generic (PLEG): container finished" podID="cbb818c8-8429-4748-a924-2b7f77d812da" containerID="0f7693c5a948e8096407c340e462caf42636f6a9506019017fa6f16affcf147a" exitCode=0 Mar 20 09:04:41.288683 master-0 kubenswrapper[18707]: I0320 09:04:41.288664 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" event={"ID":"cbb818c8-8429-4748-a924-2b7f77d812da","Type":"ContainerDied","Data":"0f7693c5a948e8096407c340e462caf42636f6a9506019017fa6f16affcf147a"} Mar 20 09:04:41.290022 master-0 kubenswrapper[18707]: I0320 09:04:41.289997 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7","Type":"ContainerStarted","Data":"3e569e082c8e68a02bbf4d1ca770654808174732eb05d4d37138708a500dfa9b"} Mar 20 09:04:41.292299 master-0 kubenswrapper[18707]: I0320 09:04:41.292265 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hz4j4" event={"ID":"14a23f75-fa11-462d-8277-c7611103bf48","Type":"ContainerStarted","Data":"b82ed51714af39c1576980df91103e3c9276c414d6747beaa9fe4e301f50eb3b"} Mar 20 09:04:41.295775 master-0 kubenswrapper[18707]: I0320 09:04:41.295727 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ccecdff5-9216-446b-bfcc-0406fc64996b","Type":"ContainerStarted","Data":"984eaa419ab8344c483e759a2a4a64b6000806fe64b3327647875984e29592f8"} Mar 20 09:04:41.297416 master-0 kubenswrapper[18707]: I0320 09:04:41.297355 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-nb-0" event={"ID":"2783ea79-a8c6-46e1-b089-957a5276b691","Type":"ContainerStarted","Data":"6a237153608f8cd12483f1d3da08e319e78e9ed2a05caa56ca2e4cd792c7036e"} Mar 20 09:04:41.301228 master-0 kubenswrapper[18707]: I0320 09:04:41.301075 18707 generic.go:334] "Generic (PLEG): container finished" podID="01a3ea60-7016-443c-9fb2-aefe2bd1ee89" containerID="20e1fb47458174b393c284c7a68a9ab748298f53e07a47be1792318c06503872" exitCode=0 Mar 20 09:04:41.301228 master-0 kubenswrapper[18707]: I0320 09:04:41.301147 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" event={"ID":"01a3ea60-7016-443c-9fb2-aefe2bd1ee89","Type":"ContainerDied","Data":"20e1fb47458174b393c284c7a68a9ab748298f53e07a47be1792318c06503872"} Mar 20 09:04:41.304276 master-0 kubenswrapper[18707]: I0320 09:04:41.304098 18707 generic.go:334] "Generic (PLEG): container finished" podID="5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f" containerID="a780de2adb2d97b70bb6d278df777093845f8f03cf60be9083754d2854bebe5b" exitCode=0 Mar 20 09:04:41.304276 master-0 kubenswrapper[18707]: I0320 09:04:41.304205 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-qwz94" event={"ID":"5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f","Type":"ContainerDied","Data":"a780de2adb2d97b70bb6d278df777093845f8f03cf60be9083754d2854bebe5b"} Mar 20 09:04:41.318777 master-0 kubenswrapper[18707]: I0320 09:04:41.318722 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0efe880d-c00f-4de4-bf1b-26958fb5ac9b","Type":"ContainerStarted","Data":"53132dcf7937c067c8524848be17804b37698063f20063a186dec35ddc7d3963"} Mar 20 09:04:41.323588 master-0 kubenswrapper[18707]: I0320 09:04:41.323541 18707 generic.go:334] "Generic (PLEG): container finished" podID="ccf0e477-d464-4b68-8116-f067791e0b48" containerID="b1c27633889ab8ca2c9f4bf4862b62f21fdef8e75ce59e6db5bb48fa212a7c6e" exitCode=0 Mar 20 09:04:41.323733 master-0 
kubenswrapper[18707]: I0320 09:04:41.323650 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" event={"ID":"ccf0e477-d464-4b68-8116-f067791e0b48","Type":"ContainerDied","Data":"b1c27633889ab8ca2c9f4bf4862b62f21fdef8e75ce59e6db5bb48fa212a7c6e"} Mar 20 09:04:41.328143 master-0 kubenswrapper[18707]: I0320 09:04:41.327530 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8e517ef1-932a-41e6-85ac-e7004261a504","Type":"ContainerStarted","Data":"7a93fe074f818c34b5e5b04ae5e74e0b937ab7c3a3e3d7725dde0207c90b298f"} Mar 20 09:04:41.329762 master-0 kubenswrapper[18707]: I0320 09:04:41.329723 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8ff08b89-230c-4277-8558-42f92623f53f","Type":"ContainerStarted","Data":"492c49f394ba430417d59ea48dad11a928e50106c850e0cc732426151238c49b"} Mar 20 09:04:41.557241 master-0 kubenswrapper[18707]: E0320 09:04:41.557157 18707 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 20 09:04:41.557241 master-0 kubenswrapper[18707]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/cbb818c8-8429-4748-a924-2b7f77d812da/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 09:04:41.557241 master-0 kubenswrapper[18707]: > podSandboxID="17983fb7a58c903ba4d9a1f0ce014948be6414a68f01b83eb4a88a89876d673b" Mar 20 09:04:41.557678 master-0 kubenswrapper[18707]: E0320 09:04:41.557476 18707 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 09:04:41.557678 master-0 kubenswrapper[18707]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug 
--bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbchf8h696h5ffh5cdh585hc5hbfh597h58dhfh554h67bh9bh5c9hfch7dh5fbhbbh567h78h669hf8h65dh55dh588h5ddh88h694h669h95h8q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-65x7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000800000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-586dbdbb8c-28nq4_openstack(cbb818c8-8429-4748-a924-2b7f77d812da): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/cbb818c8-8429-4748-a924-2b7f77d812da/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 09:04:41.557678 master-0 kubenswrapper[18707]: > logger="UnhandledError" Mar 20 09:04:41.558690 master-0 kubenswrapper[18707]: E0320 09:04:41.558628 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/cbb818c8-8429-4748-a924-2b7f77d812da/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" podUID="cbb818c8-8429-4748-a924-2b7f77d812da" Mar 20 09:04:42.568926 master-0 kubenswrapper[18707]: I0320 09:04:42.349658 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" 
event={"ID":"ccf0e477-d464-4b68-8116-f067791e0b48","Type":"ContainerDied","Data":"472bfcf4169aaac8d89260648a997394cad89285f067162897ebbbcc8fd9e23d"} Mar 20 09:04:42.568926 master-0 kubenswrapper[18707]: I0320 09:04:42.349980 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="472bfcf4169aaac8d89260648a997394cad89285f067162897ebbbcc8fd9e23d" Mar 20 09:04:42.568926 master-0 kubenswrapper[18707]: I0320 09:04:42.353175 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-qwz94" event={"ID":"5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f","Type":"ContainerDied","Data":"650edb47d722e326c258d0a7578e69c6c7204da6a81edd628fa75a28d616f9d8"} Mar 20 09:04:42.568926 master-0 kubenswrapper[18707]: I0320 09:04:42.353267 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="650edb47d722e326c258d0a7578e69c6c7204da6a81edd628fa75a28d616f9d8" Mar 20 09:04:42.568926 master-0 kubenswrapper[18707]: I0320 09:04:42.356878 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" event={"ID":"01a3ea60-7016-443c-9fb2-aefe2bd1ee89","Type":"ContainerStarted","Data":"6ad23b2a7e70eee9f718200f4669404d729789cbf65e935f164c88a96c881295"} Mar 20 09:04:42.568926 master-0 kubenswrapper[18707]: I0320 09:04:42.356922 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:04:42.640731 master-0 kubenswrapper[18707]: I0320 09:04:42.640371 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" podStartSLOduration=5.286082682 podStartE2EDuration="21.640343472s" podCreationTimestamp="2026-03-20 09:04:21 +0000 UTC" firstStartedPulling="2026-03-20 09:04:23.380783911 +0000 UTC m=+1408.536964267" lastFinishedPulling="2026-03-20 09:04:39.735044701 +0000 UTC m=+1424.891225057" observedRunningTime="2026-03-20 09:04:42.588974804 +0000 UTC 
m=+1427.745155160" watchObservedRunningTime="2026-03-20 09:04:42.640343472 +0000 UTC m=+1427.796523848" Mar 20 09:04:42.713635 master-0 kubenswrapper[18707]: I0320 09:04:42.713593 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" Mar 20 09:04:42.724549 master-0 kubenswrapper[18707]: I0320 09:04:42.724491 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-qwz94" Mar 20 09:04:42.839969 master-0 kubenswrapper[18707]: I0320 09:04:42.839924 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bmhb\" (UniqueName: \"kubernetes.io/projected/ccf0e477-d464-4b68-8116-f067791e0b48-kube-api-access-8bmhb\") pod \"ccf0e477-d464-4b68-8116-f067791e0b48\" (UID: \"ccf0e477-d464-4b68-8116-f067791e0b48\") " Mar 20 09:04:42.840243 master-0 kubenswrapper[18707]: I0320 09:04:42.840176 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccf0e477-d464-4b68-8116-f067791e0b48-dns-svc\") pod \"ccf0e477-d464-4b68-8116-f067791e0b48\" (UID: \"ccf0e477-d464-4b68-8116-f067791e0b48\") " Mar 20 09:04:42.840375 master-0 kubenswrapper[18707]: I0320 09:04:42.840273 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccf0e477-d464-4b68-8116-f067791e0b48-config\") pod \"ccf0e477-d464-4b68-8116-f067791e0b48\" (UID: \"ccf0e477-d464-4b68-8116-f067791e0b48\") " Mar 20 09:04:42.840781 master-0 kubenswrapper[18707]: I0320 09:04:42.840759 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d69l\" (UniqueName: \"kubernetes.io/projected/5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f-kube-api-access-4d69l\") pod \"5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f\" (UID: \"5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f\") " Mar 20 
09:04:42.840843 master-0 kubenswrapper[18707]: I0320 09:04:42.840803 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f-config\") pod \"5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f\" (UID: \"5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f\") " Mar 20 09:04:42.845261 master-0 kubenswrapper[18707]: I0320 09:04:42.844888 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf0e477-d464-4b68-8116-f067791e0b48-kube-api-access-8bmhb" (OuterVolumeSpecName: "kube-api-access-8bmhb") pod "ccf0e477-d464-4b68-8116-f067791e0b48" (UID: "ccf0e477-d464-4b68-8116-f067791e0b48"). InnerVolumeSpecName "kube-api-access-8bmhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:04:42.869937 master-0 kubenswrapper[18707]: I0320 09:04:42.869879 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f-kube-api-access-4d69l" (OuterVolumeSpecName: "kube-api-access-4d69l") pod "5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f" (UID: "5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f"). InnerVolumeSpecName "kube-api-access-4d69l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:04:42.890665 master-0 kubenswrapper[18707]: I0320 09:04:42.890582 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccf0e477-d464-4b68-8116-f067791e0b48-config" (OuterVolumeSpecName: "config") pod "ccf0e477-d464-4b68-8116-f067791e0b48" (UID: "ccf0e477-d464-4b68-8116-f067791e0b48"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:42.900407 master-0 kubenswrapper[18707]: I0320 09:04:42.900203 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccf0e477-d464-4b68-8116-f067791e0b48-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ccf0e477-d464-4b68-8116-f067791e0b48" (UID: "ccf0e477-d464-4b68-8116-f067791e0b48"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:42.901909 master-0 kubenswrapper[18707]: I0320 09:04:42.901856 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f-config" (OuterVolumeSpecName: "config") pod "5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f" (UID: "5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:42.944083 master-0 kubenswrapper[18707]: I0320 09:04:42.943975 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bmhb\" (UniqueName: \"kubernetes.io/projected/ccf0e477-d464-4b68-8116-f067791e0b48-kube-api-access-8bmhb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:04:42.944083 master-0 kubenswrapper[18707]: I0320 09:04:42.944009 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccf0e477-d464-4b68-8116-f067791e0b48-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:04:42.944083 master-0 kubenswrapper[18707]: I0320 09:04:42.944020 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccf0e477-d464-4b68-8116-f067791e0b48-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:04:42.944083 master-0 kubenswrapper[18707]: I0320 09:04:42.944029 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d69l\" (UniqueName: 
\"kubernetes.io/projected/5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f-kube-api-access-4d69l\") on node \"master-0\" DevicePath \"\"" Mar 20 09:04:42.944083 master-0 kubenswrapper[18707]: I0320 09:04:42.944048 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:04:43.371250 master-0 kubenswrapper[18707]: I0320 09:04:43.371093 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" event={"ID":"cbb818c8-8429-4748-a924-2b7f77d812da","Type":"ContainerStarted","Data":"1f68623babcbfef2f95ec412480e0a967b13466e2b4b15865af2d3afea813af4"} Mar 20 09:04:43.371476 master-0 kubenswrapper[18707]: I0320 09:04:43.371366 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-qwz94" Mar 20 09:04:43.371476 master-0 kubenswrapper[18707]: I0320 09:04:43.371361 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-24d8n" Mar 20 09:04:43.409710 master-0 kubenswrapper[18707]: I0320 09:04:43.409598 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" podStartSLOduration=6.052371941 podStartE2EDuration="22.409577724s" podCreationTimestamp="2026-03-20 09:04:21 +0000 UTC" firstStartedPulling="2026-03-20 09:04:23.384050325 +0000 UTC m=+1408.540230681" lastFinishedPulling="2026-03-20 09:04:39.741256108 +0000 UTC m=+1424.897436464" observedRunningTime="2026-03-20 09:04:43.396087809 +0000 UTC m=+1428.552268175" watchObservedRunningTime="2026-03-20 09:04:43.409577724 +0000 UTC m=+1428.565758080" Mar 20 09:04:43.514787 master-0 kubenswrapper[18707]: I0320 09:04:43.514652 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-24d8n"] Mar 20 09:04:43.531511 master-0 kubenswrapper[18707]: I0320 09:04:43.531418 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-24d8n"] Mar 20 09:04:43.549769 master-0 kubenswrapper[18707]: I0320 09:04:43.549722 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-qwz94"] Mar 20 09:04:43.559076 master-0 kubenswrapper[18707]: I0320 09:04:43.559030 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-qwz94"] Mar 20 09:04:45.107369 master-0 kubenswrapper[18707]: I0320 09:04:45.107288 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f" path="/var/lib/kubelet/pods/5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f/volumes" Mar 20 09:04:45.135338 master-0 kubenswrapper[18707]: I0320 09:04:45.135285 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccf0e477-d464-4b68-8116-f067791e0b48" path="/var/lib/kubelet/pods/ccf0e477-d464-4b68-8116-f067791e0b48/volumes" Mar 20 09:04:46.922909 master-0 
kubenswrapper[18707]: I0320 09:04:46.922828 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:47.385612 master-0 kubenswrapper[18707]: I0320 09:04:47.385367 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:04:47.488602 master-0 kubenswrapper[18707]: I0320 09:04:47.488536 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586dbdbb8c-28nq4"] Mar 20 09:04:47.488910 master-0 kubenswrapper[18707]: I0320 09:04:47.488864 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" podUID="cbb818c8-8429-4748-a924-2b7f77d812da" containerName="dnsmasq-dns" containerID="cri-o://1f68623babcbfef2f95ec412480e0a967b13466e2b4b15865af2d3afea813af4" gracePeriod=10 Mar 20 09:04:47.496242 master-0 kubenswrapper[18707]: I0320 09:04:47.495244 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:48.454661 master-0 kubenswrapper[18707]: I0320 09:04:48.454600 18707 generic.go:334] "Generic (PLEG): container finished" podID="cbb818c8-8429-4748-a924-2b7f77d812da" containerID="1f68623babcbfef2f95ec412480e0a967b13466e2b4b15865af2d3afea813af4" exitCode=0 Mar 20 09:04:48.454661 master-0 kubenswrapper[18707]: I0320 09:04:48.454661 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" event={"ID":"cbb818c8-8429-4748-a924-2b7f77d812da","Type":"ContainerDied","Data":"1f68623babcbfef2f95ec412480e0a967b13466e2b4b15865af2d3afea813af4"} Mar 20 09:04:50.854635 master-0 kubenswrapper[18707]: I0320 09:04:50.854558 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:50.913038 master-0 kubenswrapper[18707]: I0320 09:04:50.912970 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbb818c8-8429-4748-a924-2b7f77d812da-config\") pod \"cbb818c8-8429-4748-a924-2b7f77d812da\" (UID: \"cbb818c8-8429-4748-a924-2b7f77d812da\") " Mar 20 09:04:50.913313 master-0 kubenswrapper[18707]: I0320 09:04:50.913264 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65x7g\" (UniqueName: \"kubernetes.io/projected/cbb818c8-8429-4748-a924-2b7f77d812da-kube-api-access-65x7g\") pod \"cbb818c8-8429-4748-a924-2b7f77d812da\" (UID: \"cbb818c8-8429-4748-a924-2b7f77d812da\") " Mar 20 09:04:50.913313 master-0 kubenswrapper[18707]: I0320 09:04:50.913293 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbb818c8-8429-4748-a924-2b7f77d812da-dns-svc\") pod \"cbb818c8-8429-4748-a924-2b7f77d812da\" (UID: \"cbb818c8-8429-4748-a924-2b7f77d812da\") " Mar 20 09:04:50.922224 master-0 kubenswrapper[18707]: I0320 09:04:50.922129 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb818c8-8429-4748-a924-2b7f77d812da-kube-api-access-65x7g" (OuterVolumeSpecName: "kube-api-access-65x7g") pod "cbb818c8-8429-4748-a924-2b7f77d812da" (UID: "cbb818c8-8429-4748-a924-2b7f77d812da"). InnerVolumeSpecName "kube-api-access-65x7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:04:50.979685 master-0 kubenswrapper[18707]: I0320 09:04:50.979613 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbb818c8-8429-4748-a924-2b7f77d812da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cbb818c8-8429-4748-a924-2b7f77d812da" (UID: "cbb818c8-8429-4748-a924-2b7f77d812da"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:50.984978 master-0 kubenswrapper[18707]: I0320 09:04:50.984862 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbb818c8-8429-4748-a924-2b7f77d812da-config" (OuterVolumeSpecName: "config") pod "cbb818c8-8429-4748-a924-2b7f77d812da" (UID: "cbb818c8-8429-4748-a924-2b7f77d812da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:51.018376 master-0 kubenswrapper[18707]: I0320 09:04:51.018291 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65x7g\" (UniqueName: \"kubernetes.io/projected/cbb818c8-8429-4748-a924-2b7f77d812da-kube-api-access-65x7g\") on node \"master-0\" DevicePath \"\"" Mar 20 09:04:51.018376 master-0 kubenswrapper[18707]: I0320 09:04:51.018351 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbb818c8-8429-4748-a924-2b7f77d812da-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:04:51.018376 master-0 kubenswrapper[18707]: I0320 09:04:51.018366 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbb818c8-8429-4748-a924-2b7f77d812da-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:04:51.494106 master-0 kubenswrapper[18707]: I0320 09:04:51.493946 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" event={"ID":"cbb818c8-8429-4748-a924-2b7f77d812da","Type":"ContainerDied","Data":"17983fb7a58c903ba4d9a1f0ce014948be6414a68f01b83eb4a88a89876d673b"} Mar 20 09:04:51.494375 master-0 kubenswrapper[18707]: I0320 09:04:51.493967 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586dbdbb8c-28nq4" Mar 20 09:04:51.494748 master-0 kubenswrapper[18707]: I0320 09:04:51.494723 18707 scope.go:117] "RemoveContainer" containerID="1f68623babcbfef2f95ec412480e0a967b13466e2b4b15865af2d3afea813af4" Mar 20 09:04:51.709057 master-0 kubenswrapper[18707]: I0320 09:04:51.705899 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586dbdbb8c-28nq4"] Mar 20 09:04:51.826535 master-0 kubenswrapper[18707]: I0320 09:04:51.826465 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586dbdbb8c-28nq4"] Mar 20 09:04:53.105688 master-0 kubenswrapper[18707]: I0320 09:04:53.105624 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbb818c8-8429-4748-a924-2b7f77d812da" path="/var/lib/kubelet/pods/cbb818c8-8429-4748-a924-2b7f77d812da/volumes" Mar 20 09:04:53.293390 master-0 kubenswrapper[18707]: I0320 09:04:53.292668 18707 scope.go:117] "RemoveContainer" containerID="0f7693c5a948e8096407c340e462caf42636f6a9506019017fa6f16affcf147a" Mar 20 09:04:54.538837 master-0 kubenswrapper[18707]: I0320 09:04:54.538613 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8e517ef1-932a-41e6-85ac-e7004261a504","Type":"ContainerStarted","Data":"a16901619e2ef6b3b8dacf15ede6caa3e12a81fbf49823d8236b0d00dce3273d"} Mar 20 09:04:54.538837 master-0 kubenswrapper[18707]: I0320 09:04:54.538803 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 09:04:54.545471 master-0 kubenswrapper[18707]: I0320 09:04:54.545375 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8ff08b89-230c-4277-8558-42f92623f53f","Type":"ContainerStarted","Data":"aa4e4f786f3e46cbe6e71623c87dfb4f46270ebfb0c9e2d8cd1f45c71a0edb43"} Mar 20 09:04:54.549421 master-0 kubenswrapper[18707]: I0320 09:04:54.549367 18707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0efe880d-c00f-4de4-bf1b-26958fb5ac9b","Type":"ContainerStarted","Data":"2605f879aebdace91035a03a4e8a45c18fe73e96c91f071cb73260e1b3ede8c1"} Mar 20 09:04:54.553171 master-0 kubenswrapper[18707]: I0320 09:04:54.553124 18707 generic.go:334] "Generic (PLEG): container finished" podID="14a23f75-fa11-462d-8277-c7611103bf48" containerID="186e5660aa655ef6016f961bb191cfafae60b6580f77f4af70f242c70d214a4c" exitCode=0 Mar 20 09:04:54.553566 master-0 kubenswrapper[18707]: I0320 09:04:54.553234 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hz4j4" event={"ID":"14a23f75-fa11-462d-8277-c7611103bf48","Type":"ContainerDied","Data":"186e5660aa655ef6016f961bb191cfafae60b6580f77f4af70f242c70d214a4c"} Mar 20 09:04:54.556867 master-0 kubenswrapper[18707]: I0320 09:04:54.556704 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5t48" event={"ID":"b75e9b7b-6504-4904-96af-66385e6649e4","Type":"ContainerStarted","Data":"f695ee0889d56d2520b7e8131c9389a5afe19128f4c02cdf3602260d0411b61b"} Mar 20 09:04:54.557151 master-0 kubenswrapper[18707]: I0320 09:04:54.556893 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-q5t48" Mar 20 09:04:54.565346 master-0 kubenswrapper[18707]: I0320 09:04:54.565283 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ccecdff5-9216-446b-bfcc-0406fc64996b","Type":"ContainerStarted","Data":"fdffc491ef12ffc3cc0ab15b579bb2e97c8ffa87d4fcb37ec288898f67f5e7f3"} Mar 20 09:04:54.571308 master-0 kubenswrapper[18707]: I0320 09:04:54.571240 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.411086602 podStartE2EDuration="28.571216858s" podCreationTimestamp="2026-03-20 09:04:26 +0000 UTC" firstStartedPulling="2026-03-20 09:04:41.039040273 +0000 UTC 
m=+1426.195220629" lastFinishedPulling="2026-03-20 09:04:51.199170529 +0000 UTC m=+1436.355350885" observedRunningTime="2026-03-20 09:04:54.558064412 +0000 UTC m=+1439.714244788" watchObservedRunningTime="2026-03-20 09:04:54.571216858 +0000 UTC m=+1439.727397214" Mar 20 09:04:54.571961 master-0 kubenswrapper[18707]: I0320 09:04:54.571738 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2783ea79-a8c6-46e1-b089-957a5276b691","Type":"ContainerStarted","Data":"5018c1e91679fe35e069b86a60a78b5e226755e42ecdfbd2fd364e71f2f757a1"} Mar 20 09:04:54.670102 master-0 kubenswrapper[18707]: I0320 09:04:54.669975 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-q5t48" podStartSLOduration=11.397515562 podStartE2EDuration="23.66995179s" podCreationTimestamp="2026-03-20 09:04:31 +0000 UTC" firstStartedPulling="2026-03-20 09:04:41.064693146 +0000 UTC m=+1426.220873502" lastFinishedPulling="2026-03-20 09:04:53.337129374 +0000 UTC m=+1438.493309730" observedRunningTime="2026-03-20 09:04:54.653629853 +0000 UTC m=+1439.809810209" watchObservedRunningTime="2026-03-20 09:04:54.66995179 +0000 UTC m=+1439.826132136" Mar 20 09:04:55.589764 master-0 kubenswrapper[18707]: I0320 09:04:55.589637 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ce336090-2814-4550-a4c2-dd726e9b6ad2","Type":"ContainerStarted","Data":"99b1b565bb48f20f483733cfb3b051761fc87701e5eeb684b66b85bb100e5cbd"} Mar 20 09:04:55.591771 master-0 kubenswrapper[18707]: I0320 09:04:55.591724 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7","Type":"ContainerStarted","Data":"07dca153b771d2fd4ce2f7a87ea4777a2092378eaa23ae39894a407bc0ca0115"} Mar 20 09:04:55.594679 master-0 kubenswrapper[18707]: I0320 09:04:55.594638 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-hz4j4" event={"ID":"14a23f75-fa11-462d-8277-c7611103bf48","Type":"ContainerStarted","Data":"21c71df904e181d87869b2a9fccc37cefe86c7bff55a46c24aa482d97ab177b9"} Mar 20 09:04:55.594679 master-0 kubenswrapper[18707]: I0320 09:04:55.594671 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hz4j4" event={"ID":"14a23f75-fa11-462d-8277-c7611103bf48","Type":"ContainerStarted","Data":"6577a3559be9112c4f32f408bdf1a229a18a01c6c64dc31223fd69988662ac5a"} Mar 20 09:04:55.595655 master-0 kubenswrapper[18707]: I0320 09:04:55.595609 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:55.595655 master-0 kubenswrapper[18707]: I0320 09:04:55.595647 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:04:55.716224 master-0 kubenswrapper[18707]: I0320 09:04:55.714091 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-hz4j4" podStartSLOduration=12.538368424 podStartE2EDuration="24.714072107s" podCreationTimestamp="2026-03-20 09:04:31 +0000 UTC" firstStartedPulling="2026-03-20 09:04:41.129632602 +0000 UTC m=+1426.285812958" lastFinishedPulling="2026-03-20 09:04:53.305336285 +0000 UTC m=+1438.461516641" observedRunningTime="2026-03-20 09:04:55.703082063 +0000 UTC m=+1440.859262439" watchObservedRunningTime="2026-03-20 09:04:55.714072107 +0000 UTC m=+1440.870252463" Mar 20 09:04:59.652341 master-0 kubenswrapper[18707]: I0320 09:04:59.652286 18707 generic.go:334] "Generic (PLEG): container finished" podID="ccecdff5-9216-446b-bfcc-0406fc64996b" containerID="fdffc491ef12ffc3cc0ab15b579bb2e97c8ffa87d4fcb37ec288898f67f5e7f3" exitCode=0 Mar 20 09:04:59.653292 master-0 kubenswrapper[18707]: I0320 09:04:59.652372 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"ccecdff5-9216-446b-bfcc-0406fc64996b","Type":"ContainerDied","Data":"fdffc491ef12ffc3cc0ab15b579bb2e97c8ffa87d4fcb37ec288898f67f5e7f3"} Mar 20 09:04:59.658833 master-0 kubenswrapper[18707]: I0320 09:04:59.658787 18707 generic.go:334] "Generic (PLEG): container finished" podID="8ff08b89-230c-4277-8558-42f92623f53f" containerID="aa4e4f786f3e46cbe6e71623c87dfb4f46270ebfb0c9e2d8cd1f45c71a0edb43" exitCode=0 Mar 20 09:04:59.658949 master-0 kubenswrapper[18707]: I0320 09:04:59.658841 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8ff08b89-230c-4277-8558-42f92623f53f","Type":"ContainerDied","Data":"aa4e4f786f3e46cbe6e71623c87dfb4f46270ebfb0c9e2d8cd1f45c71a0edb43"} Mar 20 09:05:00.678708 master-0 kubenswrapper[18707]: I0320 09:05:00.678329 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ccecdff5-9216-446b-bfcc-0406fc64996b","Type":"ContainerStarted","Data":"3f826e2fa2fb8df85d2013192154e32c1dea35855606b6ba87521efc7e3c326c"} Mar 20 09:05:00.683580 master-0 kubenswrapper[18707]: I0320 09:05:00.683490 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2783ea79-a8c6-46e1-b089-957a5276b691","Type":"ContainerStarted","Data":"517dedb99129579cc2b1a4e4302b6bf2b426daf509b2a05fa74490e8fbd3a69b"} Mar 20 09:05:00.687977 master-0 kubenswrapper[18707]: I0320 09:05:00.687917 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8ff08b89-230c-4277-8558-42f92623f53f","Type":"ContainerStarted","Data":"44fe262b9f0ff65ae2b8dec03612e1ff4d2cb0db079ee92001943b84c07be429"} Mar 20 09:05:00.700681 master-0 kubenswrapper[18707]: I0320 09:05:00.700583 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"0efe880d-c00f-4de4-bf1b-26958fb5ac9b","Type":"ContainerStarted","Data":"dbdf6f2754ed632f6327a4667126c2912d25df31662792b6bba27595d4b4ab25"} Mar 20 09:05:00.726731 master-0 kubenswrapper[18707]: I0320 09:05:00.726642 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.374423119 podStartE2EDuration="36.726624686s" podCreationTimestamp="2026-03-20 09:04:24 +0000 UTC" firstStartedPulling="2026-03-20 09:04:41.030067777 +0000 UTC m=+1426.186248133" lastFinishedPulling="2026-03-20 09:04:53.382269344 +0000 UTC m=+1438.538449700" observedRunningTime="2026-03-20 09:05:00.720358547 +0000 UTC m=+1445.876538993" watchObservedRunningTime="2026-03-20 09:05:00.726624686 +0000 UTC m=+1445.882805042" Mar 20 09:05:00.750845 master-0 kubenswrapper[18707]: I0320 09:05:00.750628 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=24.243954831 podStartE2EDuration="36.750606201s" podCreationTimestamp="2026-03-20 09:04:24 +0000 UTC" firstStartedPulling="2026-03-20 09:04:40.811667916 +0000 UTC m=+1425.967848272" lastFinishedPulling="2026-03-20 09:04:53.318319286 +0000 UTC m=+1438.474499642" observedRunningTime="2026-03-20 09:05:00.748278034 +0000 UTC m=+1445.904458400" watchObservedRunningTime="2026-03-20 09:05:00.750606201 +0000 UTC m=+1445.906786567" Mar 20 09:05:00.775122 master-0 kubenswrapper[18707]: I0320 09:05:00.775013 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.222560184 podStartE2EDuration="29.774994708s" podCreationTimestamp="2026-03-20 09:04:31 +0000 UTC" firstStartedPulling="2026-03-20 09:04:40.988622302 +0000 UTC m=+1426.144802678" lastFinishedPulling="2026-03-20 09:04:59.541056826 +0000 UTC m=+1444.697237202" observedRunningTime="2026-03-20 09:05:00.771903039 +0000 UTC m=+1445.928083405" 
watchObservedRunningTime="2026-03-20 09:05:00.774994708 +0000 UTC m=+1445.931175074" Mar 20 09:05:00.805702 master-0 kubenswrapper[18707]: I0320 09:05:00.805572 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.51462019 podStartE2EDuration="26.805548761s" podCreationTimestamp="2026-03-20 09:04:34 +0000 UTC" firstStartedPulling="2026-03-20 09:04:41.262739906 +0000 UTC m=+1426.418920262" lastFinishedPulling="2026-03-20 09:04:59.553668457 +0000 UTC m=+1444.709848833" observedRunningTime="2026-03-20 09:05:00.801481515 +0000 UTC m=+1445.957661891" watchObservedRunningTime="2026-03-20 09:05:00.805548761 +0000 UTC m=+1445.961729127" Mar 20 09:05:00.905409 master-0 kubenswrapper[18707]: I0320 09:05:00.905327 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 09:05:00.905409 master-0 kubenswrapper[18707]: I0320 09:05:00.905409 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 09:05:00.935258 master-0 kubenswrapper[18707]: I0320 09:05:00.934884 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 09:05:00.990704 master-0 kubenswrapper[18707]: I0320 09:05:00.990624 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 09:05:01.589478 master-0 kubenswrapper[18707]: I0320 09:05:01.588832 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 09:05:01.646195 master-0 kubenswrapper[18707]: I0320 09:05:01.646095 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 09:05:01.712778 master-0 kubenswrapper[18707]: I0320 09:05:01.712728 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovsdbserver-nb-0" Mar 20 09:05:01.712778 master-0 kubenswrapper[18707]: I0320 09:05:01.712772 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 09:05:01.764182 master-0 kubenswrapper[18707]: I0320 09:05:01.764103 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 09:05:01.771540 master-0 kubenswrapper[18707]: I0320 09:05:01.771483 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 09:05:01.774105 master-0 kubenswrapper[18707]: I0320 09:05:01.774056 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 09:05:01.774488 master-0 kubenswrapper[18707]: I0320 09:05:01.774407 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 09:05:01.982505 master-0 kubenswrapper[18707]: I0320 09:05:01.982363 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 09:05:02.090099 master-0 kubenswrapper[18707]: I0320 09:05:02.088212 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65db7fd8ff-r6h25"] Mar 20 09:05:02.090099 master-0 kubenswrapper[18707]: E0320 09:05:02.088705 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb818c8-8429-4748-a924-2b7f77d812da" containerName="init" Mar 20 09:05:02.090099 master-0 kubenswrapper[18707]: I0320 09:05:02.088720 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb818c8-8429-4748-a924-2b7f77d812da" containerName="init" Mar 20 09:05:02.090099 master-0 kubenswrapper[18707]: E0320 09:05:02.089967 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f" containerName="init" Mar 20 09:05:02.090099 master-0 kubenswrapper[18707]: I0320 
09:05:02.089984 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f" containerName="init" Mar 20 09:05:02.090099 master-0 kubenswrapper[18707]: E0320 09:05:02.090015 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccf0e477-d464-4b68-8116-f067791e0b48" containerName="init" Mar 20 09:05:02.090099 master-0 kubenswrapper[18707]: I0320 09:05:02.090024 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf0e477-d464-4b68-8116-f067791e0b48" containerName="init" Mar 20 09:05:02.090099 master-0 kubenswrapper[18707]: E0320 09:05:02.090036 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb818c8-8429-4748-a924-2b7f77d812da" containerName="dnsmasq-dns" Mar 20 09:05:02.090099 master-0 kubenswrapper[18707]: I0320 09:05:02.090042 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb818c8-8429-4748-a924-2b7f77d812da" containerName="dnsmasq-dns" Mar 20 09:05:02.090772 master-0 kubenswrapper[18707]: I0320 09:05:02.090749 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccf0e477-d464-4b68-8116-f067791e0b48" containerName="init" Mar 20 09:05:02.090826 master-0 kubenswrapper[18707]: I0320 09:05:02.090791 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbb818c8-8429-4748-a924-2b7f77d812da" containerName="dnsmasq-dns" Mar 20 09:05:02.090826 master-0 kubenswrapper[18707]: I0320 09:05:02.090814 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8e4b58-6209-4dde-94fd-4b8ddcf13a9f" containerName="init" Mar 20 09:05:02.098327 master-0 kubenswrapper[18707]: I0320 09:05:02.098288 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.104981 master-0 kubenswrapper[18707]: I0320 09:05:02.104863 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 09:05:02.124370 master-0 kubenswrapper[18707]: I0320 09:05:02.120953 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65db7fd8ff-r6h25"] Mar 20 09:05:02.218787 master-0 kubenswrapper[18707]: I0320 09:05:02.218734 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-config\") pod \"dnsmasq-dns-65db7fd8ff-r6h25\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.218980 master-0 kubenswrapper[18707]: I0320 09:05:02.218808 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-ovsdbserver-sb\") pod \"dnsmasq-dns-65db7fd8ff-r6h25\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.219222 master-0 kubenswrapper[18707]: I0320 09:05:02.219174 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-dns-svc\") pod \"dnsmasq-dns-65db7fd8ff-r6h25\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.219406 master-0 kubenswrapper[18707]: I0320 09:05:02.219357 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlp8k\" (UniqueName: \"kubernetes.io/projected/beeeac58-ff43-450e-bf75-6687d339672c-kube-api-access-xlp8k\") pod 
\"dnsmasq-dns-65db7fd8ff-r6h25\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.321088 master-0 kubenswrapper[18707]: I0320 09:05:02.321025 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-config\") pod \"dnsmasq-dns-65db7fd8ff-r6h25\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.321374 master-0 kubenswrapper[18707]: I0320 09:05:02.321119 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-ovsdbserver-sb\") pod \"dnsmasq-dns-65db7fd8ff-r6h25\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.321374 master-0 kubenswrapper[18707]: I0320 09:05:02.321246 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-dns-svc\") pod \"dnsmasq-dns-65db7fd8ff-r6h25\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.321374 master-0 kubenswrapper[18707]: I0320 09:05:02.321319 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlp8k\" (UniqueName: \"kubernetes.io/projected/beeeac58-ff43-450e-bf75-6687d339672c-kube-api-access-xlp8k\") pod \"dnsmasq-dns-65db7fd8ff-r6h25\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.323285 master-0 kubenswrapper[18707]: I0320 09:05:02.322839 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-config\") pod 
\"dnsmasq-dns-65db7fd8ff-r6h25\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.323422 master-0 kubenswrapper[18707]: I0320 09:05:02.323387 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-ovsdbserver-sb\") pod \"dnsmasq-dns-65db7fd8ff-r6h25\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.323664 master-0 kubenswrapper[18707]: I0320 09:05:02.323630 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-dns-svc\") pod \"dnsmasq-dns-65db7fd8ff-r6h25\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.329034 master-0 kubenswrapper[18707]: I0320 09:05:02.328986 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-p4sw4"] Mar 20 09:05:02.334862 master-0 kubenswrapper[18707]: I0320 09:05:02.330789 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.334862 master-0 kubenswrapper[18707]: I0320 09:05:02.333418 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 09:05:02.342482 master-0 kubenswrapper[18707]: I0320 09:05:02.339779 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-p4sw4"] Mar 20 09:05:02.343020 master-0 kubenswrapper[18707]: I0320 09:05:02.342988 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlp8k\" (UniqueName: \"kubernetes.io/projected/beeeac58-ff43-450e-bf75-6687d339672c-kube-api-access-xlp8k\") pod \"dnsmasq-dns-65db7fd8ff-r6h25\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.406724 master-0 kubenswrapper[18707]: I0320 09:05:02.406666 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65db7fd8ff-r6h25"] Mar 20 09:05:02.407932 master-0 kubenswrapper[18707]: I0320 09:05:02.407651 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:02.425644 master-0 kubenswrapper[18707]: I0320 09:05:02.425291 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-combined-ca-bundle\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.425644 master-0 kubenswrapper[18707]: I0320 09:05:02.425380 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-config\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.425644 master-0 kubenswrapper[18707]: I0320 09:05:02.425485 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-ovn-rundir\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.425644 master-0 kubenswrapper[18707]: I0320 09:05:02.425532 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.425644 master-0 kubenswrapper[18707]: I0320 09:05:02.425564 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-ovs-rundir\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.426028 master-0 kubenswrapper[18707]: I0320 09:05:02.425925 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmc22\" (UniqueName: \"kubernetes.io/projected/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-kube-api-access-dmc22\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.452460 master-0 kubenswrapper[18707]: I0320 09:05:02.445616 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76f498f559-52t76"] Mar 20 09:05:02.452460 master-0 kubenswrapper[18707]: I0320 09:05:02.447367 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.452460 master-0 kubenswrapper[18707]: I0320 09:05:02.449718 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 09:05:02.472232 master-0 kubenswrapper[18707]: I0320 09:05:02.471697 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f498f559-52t76"] Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.517643 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.519741 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.529579 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4kqs\" (UniqueName: \"kubernetes.io/projected/8cdeb72a-9d14-480c-8d70-3a574f19755b-kube-api-access-g4kqs\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.529644 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-config\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.529719 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-ovn-rundir\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.529792 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.529828 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-ovs-rundir\") pod \"ovn-controller-metrics-p4sw4\" (UID: 
\"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.529862 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-ovsdbserver-sb\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.529895 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-config\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.529976 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-ovn-rundir\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.529993 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-ovs-rundir\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.530029 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-dns-svc\") pod 
\"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.530129 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmc22\" (UniqueName: \"kubernetes.io/projected/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-kube-api-access-dmc22\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.530266 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-combined-ca-bundle\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.530293 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-ovsdbserver-nb\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.530801 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-config\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.531725 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 
09:05:02.531945 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 09:05:02.535228 master-0 kubenswrapper[18707]: I0320 09:05:02.532061 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 09:05:02.546130 master-0 kubenswrapper[18707]: I0320 09:05:02.542314 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-combined-ca-bundle\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.546130 master-0 kubenswrapper[18707]: I0320 09:05:02.543363 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.558341 master-0 kubenswrapper[18707]: I0320 09:05:02.558005 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmc22\" (UniqueName: \"kubernetes.io/projected/a7ba2e77-de53-49c0-a55f-c7c4efc75c09-kube-api-access-dmc22\") pod \"ovn-controller-metrics-p4sw4\" (UID: \"a7ba2e77-de53-49c0-a55f-c7c4efc75c09\") " pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.571938 master-0 kubenswrapper[18707]: I0320 09:05:02.568811 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 09:05:02.632160 master-0 kubenswrapper[18707]: I0320 09:05:02.632115 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rmvp\" (UniqueName: \"kubernetes.io/projected/d46010f3-17ac-485b-a618-8d9bab33146e-kube-api-access-6rmvp\") pod 
\"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.632816 master-0 kubenswrapper[18707]: I0320 09:05:02.632333 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-ovsdbserver-sb\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.632816 master-0 kubenswrapper[18707]: I0320 09:05:02.632740 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-config\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.632816 master-0 kubenswrapper[18707]: I0320 09:05:02.632778 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-dns-svc\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.633037 master-0 kubenswrapper[18707]: I0320 09:05:02.632956 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d46010f3-17ac-485b-a618-8d9bab33146e-scripts\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.633095 master-0 kubenswrapper[18707]: I0320 09:05:02.633073 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d46010f3-17ac-485b-a618-8d9bab33146e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.633180 master-0 kubenswrapper[18707]: I0320 09:05:02.633156 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46010f3-17ac-485b-a618-8d9bab33146e-config\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.633234 master-0 kubenswrapper[18707]: I0320 09:05:02.633222 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-ovsdbserver-nb\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.633322 master-0 kubenswrapper[18707]: I0320 09:05:02.633294 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4kqs\" (UniqueName: \"kubernetes.io/projected/8cdeb72a-9d14-480c-8d70-3a574f19755b-kube-api-access-g4kqs\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.633433 master-0 kubenswrapper[18707]: I0320 09:05:02.633403 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46010f3-17ac-485b-a618-8d9bab33146e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.633466 master-0 kubenswrapper[18707]: I0320 09:05:02.633448 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d46010f3-17ac-485b-a618-8d9bab33146e-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.633933 master-0 kubenswrapper[18707]: I0320 09:05:02.633554 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46010f3-17ac-485b-a618-8d9bab33146e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.635045 master-0 kubenswrapper[18707]: I0320 09:05:02.635010 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-ovsdbserver-sb\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.635454 master-0 kubenswrapper[18707]: I0320 09:05:02.635417 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-config\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.635499 master-0 kubenswrapper[18707]: I0320 09:05:02.635462 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-ovsdbserver-nb\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.644119 master-0 kubenswrapper[18707]: I0320 09:05:02.644073 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-dns-svc\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " 
pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.658720 master-0 kubenswrapper[18707]: I0320 09:05:02.657232 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4kqs\" (UniqueName: \"kubernetes.io/projected/8cdeb72a-9d14-480c-8d70-3a574f19755b-kube-api-access-g4kqs\") pod \"dnsmasq-dns-76f498f559-52t76\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.707342 master-0 kubenswrapper[18707]: I0320 09:05:02.707279 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-p4sw4" Mar 20 09:05:02.740745 master-0 kubenswrapper[18707]: I0320 09:05:02.740385 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46010f3-17ac-485b-a618-8d9bab33146e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.740745 master-0 kubenswrapper[18707]: I0320 09:05:02.740464 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d46010f3-17ac-485b-a618-8d9bab33146e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.740745 master-0 kubenswrapper[18707]: I0320 09:05:02.740754 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46010f3-17ac-485b-a618-8d9bab33146e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.740745 master-0 kubenswrapper[18707]: I0320 09:05:02.740892 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rmvp\" (UniqueName: 
\"kubernetes.io/projected/d46010f3-17ac-485b-a618-8d9bab33146e-kube-api-access-6rmvp\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.740745 master-0 kubenswrapper[18707]: I0320 09:05:02.741040 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d46010f3-17ac-485b-a618-8d9bab33146e-scripts\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.740745 master-0 kubenswrapper[18707]: I0320 09:05:02.741075 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d46010f3-17ac-485b-a618-8d9bab33146e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.740745 master-0 kubenswrapper[18707]: I0320 09:05:02.741128 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46010f3-17ac-485b-a618-8d9bab33146e-config\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.743007 master-0 kubenswrapper[18707]: I0320 09:05:02.742975 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d46010f3-17ac-485b-a618-8d9bab33146e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.753631 master-0 kubenswrapper[18707]: I0320 09:05:02.744645 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d46010f3-17ac-485b-a618-8d9bab33146e-scripts\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 
09:05:02.753631 master-0 kubenswrapper[18707]: I0320 09:05:02.744993 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46010f3-17ac-485b-a618-8d9bab33146e-config\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.753631 master-0 kubenswrapper[18707]: I0320 09:05:02.745853 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46010f3-17ac-485b-a618-8d9bab33146e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.769666 master-0 kubenswrapper[18707]: I0320 09:05:02.769618 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d46010f3-17ac-485b-a618-8d9bab33146e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.770047 master-0 kubenswrapper[18707]: I0320 09:05:02.769972 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rmvp\" (UniqueName: \"kubernetes.io/projected/d46010f3-17ac-485b-a618-8d9bab33146e-kube-api-access-6rmvp\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.770464 master-0 kubenswrapper[18707]: I0320 09:05:02.770438 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d46010f3-17ac-485b-a618-8d9bab33146e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d46010f3-17ac-485b-a618-8d9bab33146e\") " pod="openstack/ovn-northd-0" Mar 20 09:05:02.927996 master-0 kubenswrapper[18707]: I0320 09:05:02.927873 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:02.947487 master-0 kubenswrapper[18707]: I0320 09:05:02.947434 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 09:05:02.968849 master-0 kubenswrapper[18707]: I0320 09:05:02.968792 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65db7fd8ff-r6h25"] Mar 20 09:05:03.302134 master-0 kubenswrapper[18707]: I0320 09:05:03.299889 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-p4sw4"] Mar 20 09:05:03.579772 master-0 kubenswrapper[18707]: I0320 09:05:03.578384 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 09:05:03.580487 master-0 kubenswrapper[18707]: W0320 09:05:03.580261 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd46010f3_17ac_485b_a618_8d9bab33146e.slice/crio-40424835bbb2be75770d211da70364ec3942dc8a08d4814f0a11255010cc0fe7 WatchSource:0}: Error finding container 40424835bbb2be75770d211da70364ec3942dc8a08d4814f0a11255010cc0fe7: Status 404 returned error can't find the container with id 40424835bbb2be75770d211da70364ec3942dc8a08d4814f0a11255010cc0fe7 Mar 20 09:05:03.667211 master-0 kubenswrapper[18707]: I0320 09:05:03.666790 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f498f559-52t76"] Mar 20 09:05:03.745566 master-0 kubenswrapper[18707]: I0320 09:05:03.743898 18707 generic.go:334] "Generic (PLEG): container finished" podID="beeeac58-ff43-450e-bf75-6687d339672c" containerID="c2a70fee556b5d139d87b3a3507da46c67d8cae1ef11ead044cf64afcbb69183" exitCode=0 Mar 20 09:05:03.745566 master-0 kubenswrapper[18707]: I0320 09:05:03.743989 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" 
event={"ID":"beeeac58-ff43-450e-bf75-6687d339672c","Type":"ContainerDied","Data":"c2a70fee556b5d139d87b3a3507da46c67d8cae1ef11ead044cf64afcbb69183"} Mar 20 09:05:03.745566 master-0 kubenswrapper[18707]: I0320 09:05:03.744025 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" event={"ID":"beeeac58-ff43-450e-bf75-6687d339672c","Type":"ContainerStarted","Data":"c2e0d98691506996aae5c6821d4ad6a3353ffb804c6518434c8202a3bd5e11ca"} Mar 20 09:05:03.748400 master-0 kubenswrapper[18707]: I0320 09:05:03.747917 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p4sw4" event={"ID":"a7ba2e77-de53-49c0-a55f-c7c4efc75c09","Type":"ContainerStarted","Data":"84585b13165601b6497f4f0cba4eae13e140e077d870cb26a1359d83bc10be3d"} Mar 20 09:05:03.748400 master-0 kubenswrapper[18707]: I0320 09:05:03.747981 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-p4sw4" event={"ID":"a7ba2e77-de53-49c0-a55f-c7c4efc75c09","Type":"ContainerStarted","Data":"1830b845c8f98e0e84171096d9236a74ec9b7de61caebb9bc0f7052a951739f5"} Mar 20 09:05:03.749627 master-0 kubenswrapper[18707]: I0320 09:05:03.749584 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f498f559-52t76" event={"ID":"8cdeb72a-9d14-480c-8d70-3a574f19755b","Type":"ContainerStarted","Data":"8b58582e19b02d67a6dd6a01d38592e4151bfd351a3cb1fc4825dfc5e0d4df2b"} Mar 20 09:05:03.751765 master-0 kubenswrapper[18707]: I0320 09:05:03.751009 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d46010f3-17ac-485b-a618-8d9bab33146e","Type":"ContainerStarted","Data":"40424835bbb2be75770d211da70364ec3942dc8a08d4814f0a11255010cc0fe7"} Mar 20 09:05:03.792871 master-0 kubenswrapper[18707]: I0320 09:05:03.790789 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-p4sw4" 
podStartSLOduration=1.790765356 podStartE2EDuration="1.790765356s" podCreationTimestamp="2026-03-20 09:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:03.789312275 +0000 UTC m=+1448.945492641" watchObservedRunningTime="2026-03-20 09:05:03.790765356 +0000 UTC m=+1448.946945712" Mar 20 09:05:04.133854 master-0 kubenswrapper[18707]: E0320 09:05:04.133789 18707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:34750->192.168.32.10:37773: write tcp 192.168.32.10:34750->192.168.32.10:37773: write: broken pipe Mar 20 09:05:04.174472 master-0 kubenswrapper[18707]: I0320 09:05:04.174366 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:04.250307 master-0 kubenswrapper[18707]: I0320 09:05:04.250228 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-ovsdbserver-sb\") pod \"beeeac58-ff43-450e-bf75-6687d339672c\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " Mar 20 09:05:04.250642 master-0 kubenswrapper[18707]: I0320 09:05:04.250351 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlp8k\" (UniqueName: \"kubernetes.io/projected/beeeac58-ff43-450e-bf75-6687d339672c-kube-api-access-xlp8k\") pod \"beeeac58-ff43-450e-bf75-6687d339672c\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " Mar 20 09:05:04.250642 master-0 kubenswrapper[18707]: I0320 09:05:04.250463 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-config\") pod \"beeeac58-ff43-450e-bf75-6687d339672c\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " Mar 20 09:05:04.250642 master-0 
kubenswrapper[18707]: I0320 09:05:04.250556 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-dns-svc\") pod \"beeeac58-ff43-450e-bf75-6687d339672c\" (UID: \"beeeac58-ff43-450e-bf75-6687d339672c\") " Mar 20 09:05:04.254309 master-0 kubenswrapper[18707]: I0320 09:05:04.254240 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beeeac58-ff43-450e-bf75-6687d339672c-kube-api-access-xlp8k" (OuterVolumeSpecName: "kube-api-access-xlp8k") pod "beeeac58-ff43-450e-bf75-6687d339672c" (UID: "beeeac58-ff43-450e-bf75-6687d339672c"). InnerVolumeSpecName "kube-api-access-xlp8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:04.271809 master-0 kubenswrapper[18707]: I0320 09:05:04.271697 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-config" (OuterVolumeSpecName: "config") pod "beeeac58-ff43-450e-bf75-6687d339672c" (UID: "beeeac58-ff43-450e-bf75-6687d339672c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:04.272121 master-0 kubenswrapper[18707]: I0320 09:05:04.272070 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "beeeac58-ff43-450e-bf75-6687d339672c" (UID: "beeeac58-ff43-450e-bf75-6687d339672c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:04.275015 master-0 kubenswrapper[18707]: I0320 09:05:04.274976 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "beeeac58-ff43-450e-bf75-6687d339672c" (UID: "beeeac58-ff43-450e-bf75-6687d339672c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:04.352908 master-0 kubenswrapper[18707]: I0320 09:05:04.352842 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:04.352908 master-0 kubenswrapper[18707]: I0320 09:05:04.352894 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlp8k\" (UniqueName: \"kubernetes.io/projected/beeeac58-ff43-450e-bf75-6687d339672c-kube-api-access-xlp8k\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:04.352908 master-0 kubenswrapper[18707]: I0320 09:05:04.352908 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:04.352908 master-0 kubenswrapper[18707]: I0320 09:05:04.352918 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/beeeac58-ff43-450e-bf75-6687d339672c-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:04.767285 master-0 kubenswrapper[18707]: I0320 09:05:04.766714 18707 generic.go:334] "Generic (PLEG): container finished" podID="8cdeb72a-9d14-480c-8d70-3a574f19755b" containerID="2729da0be72f4c92cb042747477c20e03a677330a2d6103b00969cee06d0efad" exitCode=0 Mar 20 09:05:04.767285 master-0 kubenswrapper[18707]: I0320 09:05:04.766804 18707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f498f559-52t76" event={"ID":"8cdeb72a-9d14-480c-8d70-3a574f19755b","Type":"ContainerDied","Data":"2729da0be72f4c92cb042747477c20e03a677330a2d6103b00969cee06d0efad"} Mar 20 09:05:04.769636 master-0 kubenswrapper[18707]: I0320 09:05:04.769565 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" event={"ID":"beeeac58-ff43-450e-bf75-6687d339672c","Type":"ContainerDied","Data":"c2e0d98691506996aae5c6821d4ad6a3353ffb804c6518434c8202a3bd5e11ca"} Mar 20 09:05:04.769722 master-0 kubenswrapper[18707]: I0320 09:05:04.769667 18707 scope.go:117] "RemoveContainer" containerID="c2a70fee556b5d139d87b3a3507da46c67d8cae1ef11ead044cf64afcbb69183" Mar 20 09:05:04.769768 master-0 kubenswrapper[18707]: I0320 09:05:04.769707 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65db7fd8ff-r6h25" Mar 20 09:05:04.852451 master-0 kubenswrapper[18707]: I0320 09:05:04.852392 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65db7fd8ff-r6h25"] Mar 20 09:05:04.863169 master-0 kubenswrapper[18707]: I0320 09:05:04.862992 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65db7fd8ff-r6h25"] Mar 20 09:05:05.018273 master-0 kubenswrapper[18707]: I0320 09:05:05.018123 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 09:05:05.110528 master-0 kubenswrapper[18707]: I0320 09:05:05.110497 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beeeac58-ff43-450e-bf75-6687d339672c" path="/var/lib/kubelet/pods/beeeac58-ff43-450e-bf75-6687d339672c/volumes" Mar 20 09:05:05.115590 master-0 kubenswrapper[18707]: I0320 09:05:05.115527 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 09:05:05.785762 master-0 
kubenswrapper[18707]: I0320 09:05:05.785699 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f498f559-52t76" event={"ID":"8cdeb72a-9d14-480c-8d70-3a574f19755b","Type":"ContainerStarted","Data":"d7c42445f9bef57cd6f547aa88450e87424608c018c22c496d9aa0850a140d32"} Mar 20 09:05:05.785762 master-0 kubenswrapper[18707]: I0320 09:05:05.785761 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:05.788084 master-0 kubenswrapper[18707]: I0320 09:05:05.788045 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d46010f3-17ac-485b-a618-8d9bab33146e","Type":"ContainerStarted","Data":"652f1db7a5750053e79a57f3f1e195d103c677ca682f9396ef927d83b7c0b5a4"} Mar 20 09:05:05.788256 master-0 kubenswrapper[18707]: I0320 09:05:05.788238 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d46010f3-17ac-485b-a618-8d9bab33146e","Type":"ContainerStarted","Data":"98e2c234c05bfe9a0e85fb502666b46af7dc6c3901b3299db880e6380e18ac3f"} Mar 20 09:05:05.822758 master-0 kubenswrapper[18707]: I0320 09:05:05.822511 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76f498f559-52t76" podStartSLOduration=3.822491184 podStartE2EDuration="3.822491184s" podCreationTimestamp="2026-03-20 09:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:05.80902599 +0000 UTC m=+1450.965206356" watchObservedRunningTime="2026-03-20 09:05:05.822491184 +0000 UTC m=+1450.978671540" Mar 20 09:05:05.845481 master-0 kubenswrapper[18707]: I0320 09:05:05.844061 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.584014653 podStartE2EDuration="3.8440427s" podCreationTimestamp="2026-03-20 09:05:02 +0000 UTC" 
firstStartedPulling="2026-03-20 09:05:03.58371916 +0000 UTC m=+1448.739899516" lastFinishedPulling="2026-03-20 09:05:04.843747217 +0000 UTC m=+1449.999927563" observedRunningTime="2026-03-20 09:05:05.841721284 +0000 UTC m=+1450.997901640" watchObservedRunningTime="2026-03-20 09:05:05.8440427 +0000 UTC m=+1451.000223066" Mar 20 09:05:06.797006 master-0 kubenswrapper[18707]: I0320 09:05:06.796940 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 09:05:07.646747 master-0 kubenswrapper[18707]: I0320 09:05:07.646667 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ec41-account-create-update-wlbg9"] Mar 20 09:05:07.647584 master-0 kubenswrapper[18707]: E0320 09:05:07.647556 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beeeac58-ff43-450e-bf75-6687d339672c" containerName="init" Mar 20 09:05:07.647706 master-0 kubenswrapper[18707]: I0320 09:05:07.647679 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="beeeac58-ff43-450e-bf75-6687d339672c" containerName="init" Mar 20 09:05:07.648204 master-0 kubenswrapper[18707]: I0320 09:05:07.648168 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="beeeac58-ff43-450e-bf75-6687d339672c" containerName="init" Mar 20 09:05:07.649331 master-0 kubenswrapper[18707]: I0320 09:05:07.649306 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ec41-account-create-update-wlbg9" Mar 20 09:05:07.651807 master-0 kubenswrapper[18707]: I0320 09:05:07.651748 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 09:05:07.668533 master-0 kubenswrapper[18707]: I0320 09:05:07.668478 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ec41-account-create-update-wlbg9"] Mar 20 09:05:07.679353 master-0 kubenswrapper[18707]: I0320 09:05:07.679295 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-cslxs"] Mar 20 09:05:07.681119 master-0 kubenswrapper[18707]: I0320 09:05:07.680801 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cslxs" Mar 20 09:05:07.720554 master-0 kubenswrapper[18707]: I0320 09:05:07.720493 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cslxs"] Mar 20 09:05:07.731236 master-0 kubenswrapper[18707]: I0320 09:05:07.728685 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6hgj\" (UniqueName: \"kubernetes.io/projected/2a482121-8e91-4338-94d1-216c258dda1f-kube-api-access-m6hgj\") pod \"keystone-ec41-account-create-update-wlbg9\" (UID: \"2a482121-8e91-4338-94d1-216c258dda1f\") " pod="openstack/keystone-ec41-account-create-update-wlbg9" Mar 20 09:05:07.731236 master-0 kubenswrapper[18707]: I0320 09:05:07.728824 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a482121-8e91-4338-94d1-216c258dda1f-operator-scripts\") pod \"keystone-ec41-account-create-update-wlbg9\" (UID: \"2a482121-8e91-4338-94d1-216c258dda1f\") " pod="openstack/keystone-ec41-account-create-update-wlbg9" Mar 20 09:05:07.834135 master-0 kubenswrapper[18707]: I0320 09:05:07.834052 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a482121-8e91-4338-94d1-216c258dda1f-operator-scripts\") pod \"keystone-ec41-account-create-update-wlbg9\" (UID: \"2a482121-8e91-4338-94d1-216c258dda1f\") " pod="openstack/keystone-ec41-account-create-update-wlbg9" Mar 20 09:05:07.834800 master-0 kubenswrapper[18707]: I0320 09:05:07.834330 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr6qv\" (UniqueName: \"kubernetes.io/projected/c1badf9b-3bad-4267-aecb-cc8bf311cf07-kube-api-access-rr6qv\") pod \"keystone-db-create-cslxs\" (UID: \"c1badf9b-3bad-4267-aecb-cc8bf311cf07\") " pod="openstack/keystone-db-create-cslxs" Mar 20 09:05:07.834800 master-0 kubenswrapper[18707]: I0320 09:05:07.834456 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6hgj\" (UniqueName: \"kubernetes.io/projected/2a482121-8e91-4338-94d1-216c258dda1f-kube-api-access-m6hgj\") pod \"keystone-ec41-account-create-update-wlbg9\" (UID: \"2a482121-8e91-4338-94d1-216c258dda1f\") " pod="openstack/keystone-ec41-account-create-update-wlbg9" Mar 20 09:05:07.834800 master-0 kubenswrapper[18707]: I0320 09:05:07.834595 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1badf9b-3bad-4267-aecb-cc8bf311cf07-operator-scripts\") pod \"keystone-db-create-cslxs\" (UID: \"c1badf9b-3bad-4267-aecb-cc8bf311cf07\") " pod="openstack/keystone-db-create-cslxs" Mar 20 09:05:07.847963 master-0 kubenswrapper[18707]: I0320 09:05:07.838768 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a482121-8e91-4338-94d1-216c258dda1f-operator-scripts\") pod \"keystone-ec41-account-create-update-wlbg9\" (UID: 
\"2a482121-8e91-4338-94d1-216c258dda1f\") " pod="openstack/keystone-ec41-account-create-update-wlbg9" Mar 20 09:05:07.868512 master-0 kubenswrapper[18707]: I0320 09:05:07.868464 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-dqlc9"] Mar 20 09:05:07.869955 master-0 kubenswrapper[18707]: I0320 09:05:07.869929 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dqlc9" Mar 20 09:05:07.883682 master-0 kubenswrapper[18707]: I0320 09:05:07.883542 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6hgj\" (UniqueName: \"kubernetes.io/projected/2a482121-8e91-4338-94d1-216c258dda1f-kube-api-access-m6hgj\") pod \"keystone-ec41-account-create-update-wlbg9\" (UID: \"2a482121-8e91-4338-94d1-216c258dda1f\") " pod="openstack/keystone-ec41-account-create-update-wlbg9" Mar 20 09:05:07.901912 master-0 kubenswrapper[18707]: I0320 09:05:07.901778 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dqlc9"] Mar 20 09:05:07.937000 master-0 kubenswrapper[18707]: I0320 09:05:07.936932 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr6qv\" (UniqueName: \"kubernetes.io/projected/c1badf9b-3bad-4267-aecb-cc8bf311cf07-kube-api-access-rr6qv\") pod \"keystone-db-create-cslxs\" (UID: \"c1badf9b-3bad-4267-aecb-cc8bf311cf07\") " pod="openstack/keystone-db-create-cslxs" Mar 20 09:05:07.937199 master-0 kubenswrapper[18707]: I0320 09:05:07.937157 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1badf9b-3bad-4267-aecb-cc8bf311cf07-operator-scripts\") pod \"keystone-db-create-cslxs\" (UID: \"c1badf9b-3bad-4267-aecb-cc8bf311cf07\") " pod="openstack/keystone-db-create-cslxs" Mar 20 09:05:07.937274 master-0 kubenswrapper[18707]: I0320 09:05:07.937240 18707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jlzf\" (UniqueName: \"kubernetes.io/projected/51e675f7-a40f-4fc7-8e3f-227d95698d5d-kube-api-access-6jlzf\") pod \"placement-db-create-dqlc9\" (UID: \"51e675f7-a40f-4fc7-8e3f-227d95698d5d\") " pod="openstack/placement-db-create-dqlc9" Mar 20 09:05:07.937317 master-0 kubenswrapper[18707]: I0320 09:05:07.937300 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e675f7-a40f-4fc7-8e3f-227d95698d5d-operator-scripts\") pod \"placement-db-create-dqlc9\" (UID: \"51e675f7-a40f-4fc7-8e3f-227d95698d5d\") " pod="openstack/placement-db-create-dqlc9" Mar 20 09:05:07.940954 master-0 kubenswrapper[18707]: I0320 09:05:07.940359 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1badf9b-3bad-4267-aecb-cc8bf311cf07-operator-scripts\") pod \"keystone-db-create-cslxs\" (UID: \"c1badf9b-3bad-4267-aecb-cc8bf311cf07\") " pod="openstack/keystone-db-create-cslxs" Mar 20 09:05:07.944214 master-0 kubenswrapper[18707]: I0320 09:05:07.944143 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 09:05:07.964233 master-0 kubenswrapper[18707]: I0320 09:05:07.964163 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr6qv\" (UniqueName: \"kubernetes.io/projected/c1badf9b-3bad-4267-aecb-cc8bf311cf07-kube-api-access-rr6qv\") pod \"keystone-db-create-cslxs\" (UID: \"c1badf9b-3bad-4267-aecb-cc8bf311cf07\") " pod="openstack/keystone-db-create-cslxs" Mar 20 09:05:07.984472 master-0 kubenswrapper[18707]: I0320 09:05:07.984401 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ffa9-account-create-update-22wx5"] Mar 20 09:05:07.986845 master-0 kubenswrapper[18707]: I0320 09:05:07.986794 18707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ffa9-account-create-update-22wx5" Mar 20 09:05:07.989275 master-0 kubenswrapper[18707]: I0320 09:05:07.989221 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ec41-account-create-update-wlbg9" Mar 20 09:05:07.990707 master-0 kubenswrapper[18707]: I0320 09:05:07.990367 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 09:05:07.998140 master-0 kubenswrapper[18707]: I0320 09:05:07.998099 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ffa9-account-create-update-22wx5"] Mar 20 09:05:08.027797 master-0 kubenswrapper[18707]: I0320 09:05:08.027742 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cslxs" Mar 20 09:05:08.040389 master-0 kubenswrapper[18707]: I0320 09:05:08.039609 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jlzf\" (UniqueName: \"kubernetes.io/projected/51e675f7-a40f-4fc7-8e3f-227d95698d5d-kube-api-access-6jlzf\") pod \"placement-db-create-dqlc9\" (UID: \"51e675f7-a40f-4fc7-8e3f-227d95698d5d\") " pod="openstack/placement-db-create-dqlc9" Mar 20 09:05:08.040389 master-0 kubenswrapper[18707]: I0320 09:05:08.039700 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e675f7-a40f-4fc7-8e3f-227d95698d5d-operator-scripts\") pod \"placement-db-create-dqlc9\" (UID: \"51e675f7-a40f-4fc7-8e3f-227d95698d5d\") " pod="openstack/placement-db-create-dqlc9" Mar 20 09:05:08.041476 master-0 kubenswrapper[18707]: I0320 09:05:08.041158 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e675f7-a40f-4fc7-8e3f-227d95698d5d-operator-scripts\") pod 
\"placement-db-create-dqlc9\" (UID: \"51e675f7-a40f-4fc7-8e3f-227d95698d5d\") " pod="openstack/placement-db-create-dqlc9" Mar 20 09:05:08.053785 master-0 kubenswrapper[18707]: I0320 09:05:08.053668 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 09:05:08.057977 master-0 kubenswrapper[18707]: I0320 09:05:08.057928 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jlzf\" (UniqueName: \"kubernetes.io/projected/51e675f7-a40f-4fc7-8e3f-227d95698d5d-kube-api-access-6jlzf\") pod \"placement-db-create-dqlc9\" (UID: \"51e675f7-a40f-4fc7-8e3f-227d95698d5d\") " pod="openstack/placement-db-create-dqlc9" Mar 20 09:05:08.142524 master-0 kubenswrapper[18707]: I0320 09:05:08.142125 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d63fe82e-6808-4fe3-a55c-681f01ea78da-operator-scripts\") pod \"placement-ffa9-account-create-update-22wx5\" (UID: \"d63fe82e-6808-4fe3-a55c-681f01ea78da\") " pod="openstack/placement-ffa9-account-create-update-22wx5" Mar 20 09:05:08.142524 master-0 kubenswrapper[18707]: I0320 09:05:08.142362 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjv6p\" (UniqueName: \"kubernetes.io/projected/d63fe82e-6808-4fe3-a55c-681f01ea78da-kube-api-access-fjv6p\") pod \"placement-ffa9-account-create-update-22wx5\" (UID: \"d63fe82e-6808-4fe3-a55c-681f01ea78da\") " pod="openstack/placement-ffa9-account-create-update-22wx5" Mar 20 09:05:08.229490 master-0 kubenswrapper[18707]: I0320 09:05:08.228955 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dqlc9" Mar 20 09:05:08.244508 master-0 kubenswrapper[18707]: I0320 09:05:08.244438 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d63fe82e-6808-4fe3-a55c-681f01ea78da-operator-scripts\") pod \"placement-ffa9-account-create-update-22wx5\" (UID: \"d63fe82e-6808-4fe3-a55c-681f01ea78da\") " pod="openstack/placement-ffa9-account-create-update-22wx5" Mar 20 09:05:08.244723 master-0 kubenswrapper[18707]: I0320 09:05:08.244642 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjv6p\" (UniqueName: \"kubernetes.io/projected/d63fe82e-6808-4fe3-a55c-681f01ea78da-kube-api-access-fjv6p\") pod \"placement-ffa9-account-create-update-22wx5\" (UID: \"d63fe82e-6808-4fe3-a55c-681f01ea78da\") " pod="openstack/placement-ffa9-account-create-update-22wx5" Mar 20 09:05:08.246764 master-0 kubenswrapper[18707]: I0320 09:05:08.246709 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d63fe82e-6808-4fe3-a55c-681f01ea78da-operator-scripts\") pod \"placement-ffa9-account-create-update-22wx5\" (UID: \"d63fe82e-6808-4fe3-a55c-681f01ea78da\") " pod="openstack/placement-ffa9-account-create-update-22wx5" Mar 20 09:05:08.268059 master-0 kubenswrapper[18707]: I0320 09:05:08.268002 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjv6p\" (UniqueName: \"kubernetes.io/projected/d63fe82e-6808-4fe3-a55c-681f01ea78da-kube-api-access-fjv6p\") pod \"placement-ffa9-account-create-update-22wx5\" (UID: \"d63fe82e-6808-4fe3-a55c-681f01ea78da\") " pod="openstack/placement-ffa9-account-create-update-22wx5" Mar 20 09:05:08.321301 master-0 kubenswrapper[18707]: I0320 09:05:08.320795 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ffa9-account-create-update-22wx5" Mar 20 09:05:08.628368 master-0 kubenswrapper[18707]: I0320 09:05:08.628204 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f498f559-52t76"] Mar 20 09:05:08.635154 master-0 kubenswrapper[18707]: I0320 09:05:08.628628 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76f498f559-52t76" podUID="8cdeb72a-9d14-480c-8d70-3a574f19755b" containerName="dnsmasq-dns" containerID="cri-o://d7c42445f9bef57cd6f547aa88450e87424608c018c22c496d9aa0850a140d32" gracePeriod=10 Mar 20 09:05:08.774477 master-0 kubenswrapper[18707]: I0320 09:05:08.769884 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ec41-account-create-update-wlbg9"] Mar 20 09:05:08.916778 master-0 kubenswrapper[18707]: W0320 09:05:08.916243 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1badf9b_3bad_4267_aecb_cc8bf311cf07.slice/crio-4ec2d264b5289dde29795bc8e121dffd062b81bfb34e01a86f3acf17313ed79d WatchSource:0}: Error finding container 4ec2d264b5289dde29795bc8e121dffd062b81bfb34e01a86f3acf17313ed79d: Status 404 returned error can't find the container with id 4ec2d264b5289dde29795bc8e121dffd062b81bfb34e01a86f3acf17313ed79d Mar 20 09:05:08.961088 master-0 kubenswrapper[18707]: I0320 09:05:08.958314 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cslxs"] Mar 20 09:05:09.086207 master-0 kubenswrapper[18707]: I0320 09:05:09.073931 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-dhd96"] Mar 20 09:05:09.086207 master-0 kubenswrapper[18707]: I0320 09:05:09.076108 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.114788 master-0 kubenswrapper[18707]: I0320 09:05:09.099834 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-dhd96"] Mar 20 09:05:09.144495 master-0 kubenswrapper[18707]: I0320 09:05:09.144181 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dqlc9"] Mar 20 09:05:09.144495 master-0 kubenswrapper[18707]: I0320 09:05:09.144239 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ffa9-account-create-update-22wx5"] Mar 20 09:05:09.179290 master-0 kubenswrapper[18707]: I0320 09:05:09.178756 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7x8c\" (UniqueName: \"kubernetes.io/projected/5e4940cf-9604-4c97-b847-f928f2dadaa9-kube-api-access-s7x8c\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.183629 master-0 kubenswrapper[18707]: I0320 09:05:09.182785 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-config\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.183629 master-0 kubenswrapper[18707]: I0320 09:05:09.182877 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-dns-svc\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.183629 master-0 kubenswrapper[18707]: I0320 09:05:09.182994 18707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.183629 master-0 kubenswrapper[18707]: I0320 09:05:09.183211 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.294788 master-0 kubenswrapper[18707]: I0320 09:05:09.289707 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.294788 master-0 kubenswrapper[18707]: I0320 09:05:09.289860 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.294788 master-0 kubenswrapper[18707]: I0320 09:05:09.289953 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7x8c\" (UniqueName: \"kubernetes.io/projected/5e4940cf-9604-4c97-b847-f928f2dadaa9-kube-api-access-s7x8c\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.294788 master-0 
kubenswrapper[18707]: I0320 09:05:09.290016 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-config\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.294788 master-0 kubenswrapper[18707]: I0320 09:05:09.290055 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-dns-svc\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.294788 master-0 kubenswrapper[18707]: I0320 09:05:09.291208 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-dns-svc\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.294788 master-0 kubenswrapper[18707]: I0320 09:05:09.291917 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.294788 master-0 kubenswrapper[18707]: I0320 09:05:09.293233 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.294788 master-0 kubenswrapper[18707]: I0320 09:05:09.293306 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-config\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.319589 master-0 kubenswrapper[18707]: I0320 09:05:09.318163 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7x8c\" (UniqueName: \"kubernetes.io/projected/5e4940cf-9604-4c97-b847-f928f2dadaa9-kube-api-access-s7x8c\") pod \"dnsmasq-dns-5bf8b865dc-dhd96\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.521972 master-0 kubenswrapper[18707]: I0320 09:05:09.521094 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:09.531887 master-0 kubenswrapper[18707]: I0320 09:05:09.528917 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:09.607795 master-0 kubenswrapper[18707]: I0320 09:05:09.606275 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-ovsdbserver-nb\") pod \"8cdeb72a-9d14-480c-8d70-3a574f19755b\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " Mar 20 09:05:09.607795 master-0 kubenswrapper[18707]: I0320 09:05:09.606398 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4kqs\" (UniqueName: \"kubernetes.io/projected/8cdeb72a-9d14-480c-8d70-3a574f19755b-kube-api-access-g4kqs\") pod \"8cdeb72a-9d14-480c-8d70-3a574f19755b\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " Mar 20 09:05:09.607795 master-0 kubenswrapper[18707]: I0320 09:05:09.606505 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-dns-svc\") pod \"8cdeb72a-9d14-480c-8d70-3a574f19755b\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " Mar 20 09:05:09.607795 master-0 kubenswrapper[18707]: I0320 09:05:09.606601 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-ovsdbserver-sb\") pod \"8cdeb72a-9d14-480c-8d70-3a574f19755b\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " Mar 20 09:05:09.607795 master-0 kubenswrapper[18707]: I0320 09:05:09.606681 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-config\") pod \"8cdeb72a-9d14-480c-8d70-3a574f19755b\" (UID: \"8cdeb72a-9d14-480c-8d70-3a574f19755b\") " Mar 20 09:05:09.621092 master-0 kubenswrapper[18707]: I0320 09:05:09.621024 18707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cdeb72a-9d14-480c-8d70-3a574f19755b-kube-api-access-g4kqs" (OuterVolumeSpecName: "kube-api-access-g4kqs") pod "8cdeb72a-9d14-480c-8d70-3a574f19755b" (UID: "8cdeb72a-9d14-480c-8d70-3a574f19755b"). InnerVolumeSpecName "kube-api-access-g4kqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:09.701262 master-0 kubenswrapper[18707]: I0320 09:05:09.701009 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8cdeb72a-9d14-480c-8d70-3a574f19755b" (UID: "8cdeb72a-9d14-480c-8d70-3a574f19755b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:09.704141 master-0 kubenswrapper[18707]: I0320 09:05:09.704046 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-config" (OuterVolumeSpecName: "config") pod "8cdeb72a-9d14-480c-8d70-3a574f19755b" (UID: "8cdeb72a-9d14-480c-8d70-3a574f19755b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:09.712845 master-0 kubenswrapper[18707]: I0320 09:05:09.712800 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4kqs\" (UniqueName: \"kubernetes.io/projected/8cdeb72a-9d14-480c-8d70-3a574f19755b-kube-api-access-g4kqs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:09.712845 master-0 kubenswrapper[18707]: I0320 09:05:09.712844 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:09.713069 master-0 kubenswrapper[18707]: I0320 09:05:09.712855 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:09.721734 master-0 kubenswrapper[18707]: I0320 09:05:09.721669 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8cdeb72a-9d14-480c-8d70-3a574f19755b" (UID: "8cdeb72a-9d14-480c-8d70-3a574f19755b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:09.727738 master-0 kubenswrapper[18707]: I0320 09:05:09.727578 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8cdeb72a-9d14-480c-8d70-3a574f19755b" (UID: "8cdeb72a-9d14-480c-8d70-3a574f19755b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:09.814731 master-0 kubenswrapper[18707]: I0320 09:05:09.814651 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:09.814731 master-0 kubenswrapper[18707]: I0320 09:05:09.814695 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8cdeb72a-9d14-480c-8d70-3a574f19755b-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:09.975221 master-0 kubenswrapper[18707]: I0320 09:05:09.973765 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ffa9-account-create-update-22wx5" event={"ID":"d63fe82e-6808-4fe3-a55c-681f01ea78da","Type":"ContainerStarted","Data":"7bdb6c9c4298089409f05a4eee45f5ad939101002f891a0863f338d29754115c"} Mar 20 09:05:09.975221 master-0 kubenswrapper[18707]: I0320 09:05:09.973829 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ffa9-account-create-update-22wx5" event={"ID":"d63fe82e-6808-4fe3-a55c-681f01ea78da","Type":"ContainerStarted","Data":"a123cf2e2fb40011ca897233ddf5dfedeb280442206a033d86342440e84f627d"} Mar 20 09:05:10.008927 master-0 kubenswrapper[18707]: I0320 09:05:09.999985 18707 generic.go:334] "Generic (PLEG): container finished" podID="8cdeb72a-9d14-480c-8d70-3a574f19755b" containerID="d7c42445f9bef57cd6f547aa88450e87424608c018c22c496d9aa0850a140d32" exitCode=0 Mar 20 09:05:10.008927 master-0 kubenswrapper[18707]: I0320 09:05:10.000052 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f498f559-52t76" event={"ID":"8cdeb72a-9d14-480c-8d70-3a574f19755b","Type":"ContainerDied","Data":"d7c42445f9bef57cd6f547aa88450e87424608c018c22c496d9aa0850a140d32"} Mar 20 09:05:10.008927 master-0 kubenswrapper[18707]: I0320 09:05:10.000094 18707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f498f559-52t76" event={"ID":"8cdeb72a-9d14-480c-8d70-3a574f19755b","Type":"ContainerDied","Data":"8b58582e19b02d67a6dd6a01d38592e4151bfd351a3cb1fc4825dfc5e0d4df2b"} Mar 20 09:05:10.008927 master-0 kubenswrapper[18707]: I0320 09:05:10.000115 18707 scope.go:117] "RemoveContainer" containerID="d7c42445f9bef57cd6f547aa88450e87424608c018c22c496d9aa0850a140d32" Mar 20 09:05:10.008927 master-0 kubenswrapper[18707]: I0320 09:05:10.000323 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f498f559-52t76" Mar 20 09:05:10.008927 master-0 kubenswrapper[18707]: I0320 09:05:10.003537 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-ffa9-account-create-update-22wx5" podStartSLOduration=3.003504851 podStartE2EDuration="3.003504851s" podCreationTimestamp="2026-03-20 09:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:09.995009459 +0000 UTC m=+1455.151189815" watchObservedRunningTime="2026-03-20 09:05:10.003504851 +0000 UTC m=+1455.159685207" Mar 20 09:05:10.008927 master-0 kubenswrapper[18707]: I0320 09:05:10.005849 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dqlc9" event={"ID":"51e675f7-a40f-4fc7-8e3f-227d95698d5d","Type":"ContainerStarted","Data":"36a774bdc7bdad7f2095a1a3885ebe4a35dffffc0d6b0d33093bde6063a71735"} Mar 20 09:05:10.008927 master-0 kubenswrapper[18707]: I0320 09:05:10.005885 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dqlc9" event={"ID":"51e675f7-a40f-4fc7-8e3f-227d95698d5d","Type":"ContainerStarted","Data":"cea31dc58756500bc7da203aa6ad8e6cddf22e2b782c27ec2dd377f656c53d55"} Mar 20 09:05:10.009788 master-0 kubenswrapper[18707]: I0320 09:05:10.009165 18707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-ec41-account-create-update-wlbg9" event={"ID":"2a482121-8e91-4338-94d1-216c258dda1f","Type":"ContainerStarted","Data":"ca2b9e4f1c759dd4223cd6cee17729e7fad04b7f2a0a483eab4b178b5ea8e69e"} Mar 20 09:05:10.009788 master-0 kubenswrapper[18707]: I0320 09:05:10.009220 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ec41-account-create-update-wlbg9" event={"ID":"2a482121-8e91-4338-94d1-216c258dda1f","Type":"ContainerStarted","Data":"3f25c4ad3df142ef1aa3479c55195a61fcd4d17cf79f7046b478382c643d6b4a"} Mar 20 09:05:10.017797 master-0 kubenswrapper[18707]: I0320 09:05:10.017436 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cslxs" event={"ID":"c1badf9b-3bad-4267-aecb-cc8bf311cf07","Type":"ContainerStarted","Data":"cd55f7a32d0ec9054689727e6da502d46386a9de188dea7b2368cf48e3cd6ac4"} Mar 20 09:05:10.017797 master-0 kubenswrapper[18707]: I0320 09:05:10.017483 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cslxs" event={"ID":"c1badf9b-3bad-4267-aecb-cc8bf311cf07","Type":"ContainerStarted","Data":"4ec2d264b5289dde29795bc8e121dffd062b81bfb34e01a86f3acf17313ed79d"} Mar 20 09:05:10.046904 master-0 kubenswrapper[18707]: I0320 09:05:10.046210 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-dqlc9" podStartSLOduration=3.046173691 podStartE2EDuration="3.046173691s" podCreationTimestamp="2026-03-20 09:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:10.039452379 +0000 UTC m=+1455.195632735" watchObservedRunningTime="2026-03-20 09:05:10.046173691 +0000 UTC m=+1455.202354047" Mar 20 09:05:10.068420 master-0 kubenswrapper[18707]: I0320 09:05:10.066004 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-cslxs" 
podStartSLOduration=3.065967876 podStartE2EDuration="3.065967876s" podCreationTimestamp="2026-03-20 09:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:10.057306379 +0000 UTC m=+1455.213486725" watchObservedRunningTime="2026-03-20 09:05:10.065967876 +0000 UTC m=+1455.222148232" Mar 20 09:05:10.106118 master-0 kubenswrapper[18707]: I0320 09:05:10.106067 18707 scope.go:117] "RemoveContainer" containerID="2729da0be72f4c92cb042747477c20e03a677330a2d6103b00969cee06d0efad" Mar 20 09:05:10.125168 master-0 kubenswrapper[18707]: I0320 09:05:10.125071 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-ec41-account-create-update-wlbg9" podStartSLOduration=3.1250435149999998 podStartE2EDuration="3.125043515s" podCreationTimestamp="2026-03-20 09:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:10.087623475 +0000 UTC m=+1455.243803831" watchObservedRunningTime="2026-03-20 09:05:10.125043515 +0000 UTC m=+1455.281223871" Mar 20 09:05:10.143248 master-0 kubenswrapper[18707]: I0320 09:05:10.141421 18707 scope.go:117] "RemoveContainer" containerID="d7c42445f9bef57cd6f547aa88450e87424608c018c22c496d9aa0850a140d32" Mar 20 09:05:10.143614 master-0 kubenswrapper[18707]: E0320 09:05:10.143539 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c42445f9bef57cd6f547aa88450e87424608c018c22c496d9aa0850a140d32\": container with ID starting with d7c42445f9bef57cd6f547aa88450e87424608c018c22c496d9aa0850a140d32 not found: ID does not exist" containerID="d7c42445f9bef57cd6f547aa88450e87424608c018c22c496d9aa0850a140d32" Mar 20 09:05:10.143674 master-0 kubenswrapper[18707]: I0320 09:05:10.143621 18707 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"d7c42445f9bef57cd6f547aa88450e87424608c018c22c496d9aa0850a140d32"} err="failed to get container status \"d7c42445f9bef57cd6f547aa88450e87424608c018c22c496d9aa0850a140d32\": rpc error: code = NotFound desc = could not find container \"d7c42445f9bef57cd6f547aa88450e87424608c018c22c496d9aa0850a140d32\": container with ID starting with d7c42445f9bef57cd6f547aa88450e87424608c018c22c496d9aa0850a140d32 not found: ID does not exist" Mar 20 09:05:10.143674 master-0 kubenswrapper[18707]: I0320 09:05:10.143658 18707 scope.go:117] "RemoveContainer" containerID="2729da0be72f4c92cb042747477c20e03a677330a2d6103b00969cee06d0efad" Mar 20 09:05:10.144443 master-0 kubenswrapper[18707]: E0320 09:05:10.144414 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2729da0be72f4c92cb042747477c20e03a677330a2d6103b00969cee06d0efad\": container with ID starting with 2729da0be72f4c92cb042747477c20e03a677330a2d6103b00969cee06d0efad not found: ID does not exist" containerID="2729da0be72f4c92cb042747477c20e03a677330a2d6103b00969cee06d0efad" Mar 20 09:05:10.144495 master-0 kubenswrapper[18707]: I0320 09:05:10.144464 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2729da0be72f4c92cb042747477c20e03a677330a2d6103b00969cee06d0efad"} err="failed to get container status \"2729da0be72f4c92cb042747477c20e03a677330a2d6103b00969cee06d0efad\": rpc error: code = NotFound desc = could not find container \"2729da0be72f4c92cb042747477c20e03a677330a2d6103b00969cee06d0efad\": container with ID starting with 2729da0be72f4c92cb042747477c20e03a677330a2d6103b00969cee06d0efad not found: ID does not exist" Mar 20 09:05:10.170883 master-0 kubenswrapper[18707]: I0320 09:05:10.170719 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f498f559-52t76"] Mar 20 09:05:10.196799 master-0 kubenswrapper[18707]: I0320 
09:05:10.187752 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76f498f559-52t76"] Mar 20 09:05:10.205753 master-0 kubenswrapper[18707]: I0320 09:05:10.205641 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-dhd96"] Mar 20 09:05:11.023030 master-0 kubenswrapper[18707]: I0320 09:05:11.022889 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 09:05:11.023690 master-0 kubenswrapper[18707]: E0320 09:05:11.023498 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cdeb72a-9d14-480c-8d70-3a574f19755b" containerName="dnsmasq-dns" Mar 20 09:05:11.023690 master-0 kubenswrapper[18707]: I0320 09:05:11.023519 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cdeb72a-9d14-480c-8d70-3a574f19755b" containerName="dnsmasq-dns" Mar 20 09:05:11.023690 master-0 kubenswrapper[18707]: E0320 09:05:11.023542 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cdeb72a-9d14-480c-8d70-3a574f19755b" containerName="init" Mar 20 09:05:11.023690 master-0 kubenswrapper[18707]: I0320 09:05:11.023552 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cdeb72a-9d14-480c-8d70-3a574f19755b" containerName="init" Mar 20 09:05:11.023854 master-0 kubenswrapper[18707]: I0320 09:05:11.023838 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cdeb72a-9d14-480c-8d70-3a574f19755b" containerName="dnsmasq-dns" Mar 20 09:05:11.043975 master-0 kubenswrapper[18707]: I0320 09:05:11.043903 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 09:05:11.047082 master-0 kubenswrapper[18707]: I0320 09:05:11.046296 18707 generic.go:334] "Generic (PLEG): container finished" podID="51e675f7-a40f-4fc7-8e3f-227d95698d5d" containerID="36a774bdc7bdad7f2095a1a3885ebe4a35dffffc0d6b0d33093bde6063a71735" exitCode=0 Mar 20 09:05:11.047082 master-0 kubenswrapper[18707]: I0320 09:05:11.046519 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dqlc9" event={"ID":"51e675f7-a40f-4fc7-8e3f-227d95698d5d","Type":"ContainerDied","Data":"36a774bdc7bdad7f2095a1a3885ebe4a35dffffc0d6b0d33093bde6063a71735"} Mar 20 09:05:11.048915 master-0 kubenswrapper[18707]: I0320 09:05:11.048874 18707 generic.go:334] "Generic (PLEG): container finished" podID="2a482121-8e91-4338-94d1-216c258dda1f" containerID="ca2b9e4f1c759dd4223cd6cee17729e7fad04b7f2a0a483eab4b178b5ea8e69e" exitCode=0 Mar 20 09:05:11.049016 master-0 kubenswrapper[18707]: I0320 09:05:11.048928 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ec41-account-create-update-wlbg9" event={"ID":"2a482121-8e91-4338-94d1-216c258dda1f","Type":"ContainerDied","Data":"ca2b9e4f1c759dd4223cd6cee17729e7fad04b7f2a0a483eab4b178b5ea8e69e"} Mar 20 09:05:11.052784 master-0 kubenswrapper[18707]: I0320 09:05:11.052657 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 09:05:11.052985 master-0 kubenswrapper[18707]: I0320 09:05:11.052952 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 09:05:11.053047 master-0 kubenswrapper[18707]: I0320 09:05:11.052984 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 09:05:11.061001 master-0 kubenswrapper[18707]: I0320 09:05:11.060034 18707 generic.go:334] "Generic (PLEG): container finished" podID="c1badf9b-3bad-4267-aecb-cc8bf311cf07" 
containerID="cd55f7a32d0ec9054689727e6da502d46386a9de188dea7b2368cf48e3cd6ac4" exitCode=0 Mar 20 09:05:11.061001 master-0 kubenswrapper[18707]: I0320 09:05:11.060152 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cslxs" event={"ID":"c1badf9b-3bad-4267-aecb-cc8bf311cf07","Type":"ContainerDied","Data":"cd55f7a32d0ec9054689727e6da502d46386a9de188dea7b2368cf48e3cd6ac4"} Mar 20 09:05:11.061001 master-0 kubenswrapper[18707]: I0320 09:05:11.060537 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 09:05:11.063769 master-0 kubenswrapper[18707]: I0320 09:05:11.063712 18707 generic.go:334] "Generic (PLEG): container finished" podID="5e4940cf-9604-4c97-b847-f928f2dadaa9" containerID="1ca630b2bcd6c9372c7740fe93dd9f588683360db12802e43e148f06eb2be378" exitCode=0 Mar 20 09:05:11.063858 master-0 kubenswrapper[18707]: I0320 09:05:11.063814 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" event={"ID":"5e4940cf-9604-4c97-b847-f928f2dadaa9","Type":"ContainerDied","Data":"1ca630b2bcd6c9372c7740fe93dd9f588683360db12802e43e148f06eb2be378"} Mar 20 09:05:11.063911 master-0 kubenswrapper[18707]: I0320 09:05:11.063856 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" event={"ID":"5e4940cf-9604-4c97-b847-f928f2dadaa9","Type":"ContainerStarted","Data":"54219d7425d1e1460191032517f026f3f1cc97cbe8d487a1f9b2c8d326f408a0"} Mar 20 09:05:11.085108 master-0 kubenswrapper[18707]: I0320 09:05:11.085048 18707 generic.go:334] "Generic (PLEG): container finished" podID="d63fe82e-6808-4fe3-a55c-681f01ea78da" containerID="7bdb6c9c4298089409f05a4eee45f5ad939101002f891a0863f338d29754115c" exitCode=0 Mar 20 09:05:11.085290 master-0 kubenswrapper[18707]: I0320 09:05:11.085120 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ffa9-account-create-update-22wx5" 
event={"ID":"d63fe82e-6808-4fe3-a55c-681f01ea78da","Type":"ContainerDied","Data":"7bdb6c9c4298089409f05a4eee45f5ad939101002f891a0863f338d29754115c"} Mar 20 09:05:11.124763 master-0 kubenswrapper[18707]: I0320 09:05:11.123424 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cdeb72a-9d14-480c-8d70-3a574f19755b" path="/var/lib/kubelet/pods/8cdeb72a-9d14-480c-8d70-3a574f19755b/volumes" Mar 20 09:05:11.183258 master-0 kubenswrapper[18707]: I0320 09:05:11.183120 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d537130-a515-4e6e-aedb-89a848cd477a-lock\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.183258 master-0 kubenswrapper[18707]: I0320 09:05:11.183219 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.183258 master-0 kubenswrapper[18707]: I0320 09:05:11.183281 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d537130-a515-4e6e-aedb-89a848cd477a-cache\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.184979 master-0 kubenswrapper[18707]: I0320 09:05:11.183325 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5b0423b7-8d71-4226-b20d-e3df8d969082\" (UniqueName: \"kubernetes.io/csi/topolvm.io^01a19d89-65f1-4a1f-bd25-859cce88819c\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.184979 master-0 
kubenswrapper[18707]: I0320 09:05:11.183374 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m66vg\" (UniqueName: \"kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-kube-api-access-m66vg\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.184979 master-0 kubenswrapper[18707]: I0320 09:05:11.183416 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d537130-a515-4e6e-aedb-89a848cd477a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.286096 master-0 kubenswrapper[18707]: I0320 09:05:11.285975 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d537130-a515-4e6e-aedb-89a848cd477a-lock\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.286324 master-0 kubenswrapper[18707]: I0320 09:05:11.286250 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.286472 master-0 kubenswrapper[18707]: E0320 09:05:11.286423 18707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 09:05:11.286472 master-0 kubenswrapper[18707]: E0320 09:05:11.286472 18707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 09:05:11.286538 master-0 kubenswrapper[18707]: I0320 09:05:11.286445 
18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d537130-a515-4e6e-aedb-89a848cd477a-cache\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.286664 master-0 kubenswrapper[18707]: I0320 09:05:11.286603 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5b0423b7-8d71-4226-b20d-e3df8d969082\" (UniqueName: \"kubernetes.io/csi/topolvm.io^01a19d89-65f1-4a1f-bd25-859cce88819c\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.286711 master-0 kubenswrapper[18707]: I0320 09:05:11.286668 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m66vg\" (UniqueName: \"kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-kube-api-access-m66vg\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.286711 master-0 kubenswrapper[18707]: I0320 09:05:11.286698 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d537130-a515-4e6e-aedb-89a848cd477a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.286792 master-0 kubenswrapper[18707]: I0320 09:05:11.286747 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d537130-a515-4e6e-aedb-89a848cd477a-lock\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.287005 master-0 kubenswrapper[18707]: I0320 09:05:11.286958 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4d537130-a515-4e6e-aedb-89a848cd477a-cache\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.288805 master-0 kubenswrapper[18707]: E0320 09:05:11.288750 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift podName:4d537130-a515-4e6e-aedb-89a848cd477a nodeName:}" failed. No retries permitted until 2026-03-20 09:05:11.788719968 +0000 UTC m=+1456.944900454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift") pod "swift-storage-0" (UID: "4d537130-a515-4e6e-aedb-89a848cd477a") : configmap "swift-ring-files" not found Mar 20 09:05:11.292054 master-0 kubenswrapper[18707]: I0320 09:05:11.291984 18707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 09:05:11.292054 master-0 kubenswrapper[18707]: I0320 09:05:11.292022 18707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5b0423b7-8d71-4226-b20d-e3df8d969082\" (UniqueName: \"kubernetes.io/csi/topolvm.io^01a19d89-65f1-4a1f-bd25-859cce88819c\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/78642f4b74cb980f0d6e319fef0d79de3f36428ff642a3d2cd0c9caee522d817/globalmount\"" pod="openstack/swift-storage-0" Mar 20 09:05:11.292493 master-0 kubenswrapper[18707]: I0320 09:05:11.292383 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d537130-a515-4e6e-aedb-89a848cd477a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.303668 master-0 kubenswrapper[18707]: I0320 09:05:11.303622 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m66vg\" (UniqueName: \"kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-kube-api-access-m66vg\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.797928 master-0 kubenswrapper[18707]: I0320 09:05:11.797824 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:11.798965 master-0 kubenswrapper[18707]: E0320 09:05:11.798919 18707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 09:05:11.798965 master-0 kubenswrapper[18707]: E0320 09:05:11.798950 18707 projected.go:194] Error preparing data for projected volume 
etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 09:05:11.799135 master-0 kubenswrapper[18707]: E0320 09:05:11.799002 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift podName:4d537130-a515-4e6e-aedb-89a848cd477a nodeName:}" failed. No retries permitted until 2026-03-20 09:05:12.79898486 +0000 UTC m=+1457.955165216 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift") pod "swift-storage-0" (UID: "4d537130-a515-4e6e-aedb-89a848cd477a") : configmap "swift-ring-files" not found Mar 20 09:05:12.101103 master-0 kubenswrapper[18707]: I0320 09:05:12.100981 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" event={"ID":"5e4940cf-9604-4c97-b847-f928f2dadaa9","Type":"ContainerStarted","Data":"5c99fd3f403bbd68a04ab23e8e2717779b23a1758a6b5c1b6557a2e7daa2f890"} Mar 20 09:05:12.101631 master-0 kubenswrapper[18707]: I0320 09:05:12.101344 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:12.133828 master-0 kubenswrapper[18707]: I0320 09:05:12.133709 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" podStartSLOduration=4.133664353 podStartE2EDuration="4.133664353s" podCreationTimestamp="2026-03-20 09:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:12.127710862 +0000 UTC m=+1457.283891228" watchObservedRunningTime="2026-03-20 09:05:12.133664353 +0000 UTC m=+1457.289844739" Mar 20 09:05:12.248636 master-0 kubenswrapper[18707]: I0320 09:05:12.248261 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-l87pv"] Mar 
20 09:05:12.252286 master-0 kubenswrapper[18707]: I0320 09:05:12.252244 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-l87pv" Mar 20 09:05:12.272638 master-0 kubenswrapper[18707]: I0320 09:05:12.272489 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-fb00-account-create-update-9vcp7"] Mar 20 09:05:12.276025 master-0 kubenswrapper[18707]: I0320 09:05:12.274641 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fb00-account-create-update-9vcp7" Mar 20 09:05:12.284072 master-0 kubenswrapper[18707]: I0320 09:05:12.279871 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 09:05:12.284072 master-0 kubenswrapper[18707]: I0320 09:05:12.283323 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-l87pv"] Mar 20 09:05:12.299603 master-0 kubenswrapper[18707]: I0320 09:05:12.299554 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fb00-account-create-update-9vcp7"] Mar 20 09:05:12.316459 master-0 kubenswrapper[18707]: I0320 09:05:12.316383 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx6tx\" (UniqueName: \"kubernetes.io/projected/bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a-kube-api-access-bx6tx\") pod \"glance-db-create-l87pv\" (UID: \"bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a\") " pod="openstack/glance-db-create-l87pv" Mar 20 09:05:12.316459 master-0 kubenswrapper[18707]: I0320 09:05:12.316444 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc50577-daf4-4a55-b1aa-2ae0b52e04d3-operator-scripts\") pod \"glance-fb00-account-create-update-9vcp7\" (UID: \"1cc50577-daf4-4a55-b1aa-2ae0b52e04d3\") " pod="openstack/glance-fb00-account-create-update-9vcp7" Mar 20 
09:05:12.316732 master-0 kubenswrapper[18707]: I0320 09:05:12.316519 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a-operator-scripts\") pod \"glance-db-create-l87pv\" (UID: \"bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a\") " pod="openstack/glance-db-create-l87pv" Mar 20 09:05:12.316732 master-0 kubenswrapper[18707]: I0320 09:05:12.316572 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbpbn\" (UniqueName: \"kubernetes.io/projected/1cc50577-daf4-4a55-b1aa-2ae0b52e04d3-kube-api-access-cbpbn\") pod \"glance-fb00-account-create-update-9vcp7\" (UID: \"1cc50577-daf4-4a55-b1aa-2ae0b52e04d3\") " pod="openstack/glance-fb00-account-create-update-9vcp7" Mar 20 09:05:12.420166 master-0 kubenswrapper[18707]: I0320 09:05:12.420001 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbpbn\" (UniqueName: \"kubernetes.io/projected/1cc50577-daf4-4a55-b1aa-2ae0b52e04d3-kube-api-access-cbpbn\") pod \"glance-fb00-account-create-update-9vcp7\" (UID: \"1cc50577-daf4-4a55-b1aa-2ae0b52e04d3\") " pod="openstack/glance-fb00-account-create-update-9vcp7" Mar 20 09:05:12.422608 master-0 kubenswrapper[18707]: I0320 09:05:12.420385 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx6tx\" (UniqueName: \"kubernetes.io/projected/bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a-kube-api-access-bx6tx\") pod \"glance-db-create-l87pv\" (UID: \"bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a\") " pod="openstack/glance-db-create-l87pv" Mar 20 09:05:12.422608 master-0 kubenswrapper[18707]: I0320 09:05:12.420447 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc50577-daf4-4a55-b1aa-2ae0b52e04d3-operator-scripts\") pod 
\"glance-fb00-account-create-update-9vcp7\" (UID: \"1cc50577-daf4-4a55-b1aa-2ae0b52e04d3\") " pod="openstack/glance-fb00-account-create-update-9vcp7" Mar 20 09:05:12.422608 master-0 kubenswrapper[18707]: I0320 09:05:12.420538 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a-operator-scripts\") pod \"glance-db-create-l87pv\" (UID: \"bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a\") " pod="openstack/glance-db-create-l87pv" Mar 20 09:05:12.422608 master-0 kubenswrapper[18707]: I0320 09:05:12.421958 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc50577-daf4-4a55-b1aa-2ae0b52e04d3-operator-scripts\") pod \"glance-fb00-account-create-update-9vcp7\" (UID: \"1cc50577-daf4-4a55-b1aa-2ae0b52e04d3\") " pod="openstack/glance-fb00-account-create-update-9vcp7" Mar 20 09:05:12.423826 master-0 kubenswrapper[18707]: I0320 09:05:12.423756 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a-operator-scripts\") pod \"glance-db-create-l87pv\" (UID: \"bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a\") " pod="openstack/glance-db-create-l87pv" Mar 20 09:05:12.444789 master-0 kubenswrapper[18707]: I0320 09:05:12.444719 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbpbn\" (UniqueName: \"kubernetes.io/projected/1cc50577-daf4-4a55-b1aa-2ae0b52e04d3-kube-api-access-cbpbn\") pod \"glance-fb00-account-create-update-9vcp7\" (UID: \"1cc50577-daf4-4a55-b1aa-2ae0b52e04d3\") " pod="openstack/glance-fb00-account-create-update-9vcp7" Mar 20 09:05:12.457868 master-0 kubenswrapper[18707]: I0320 09:05:12.457828 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx6tx\" (UniqueName: 
\"kubernetes.io/projected/bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a-kube-api-access-bx6tx\") pod \"glance-db-create-l87pv\" (UID: \"bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a\") " pod="openstack/glance-db-create-l87pv" Mar 20 09:05:12.582235 master-0 kubenswrapper[18707]: I0320 09:05:12.582154 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-l87pv" Mar 20 09:05:12.621318 master-0 kubenswrapper[18707]: I0320 09:05:12.620472 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fb00-account-create-update-9vcp7" Mar 20 09:05:12.699398 master-0 kubenswrapper[18707]: I0320 09:05:12.698989 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5b0423b7-8d71-4226-b20d-e3df8d969082\" (UniqueName: \"kubernetes.io/csi/topolvm.io^01a19d89-65f1-4a1f-bd25-859cce88819c\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:12.724338 master-0 kubenswrapper[18707]: I0320 09:05:12.724026 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-cslxs" Mar 20 09:05:12.875455 master-0 kubenswrapper[18707]: I0320 09:05:12.866333 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr6qv\" (UniqueName: \"kubernetes.io/projected/c1badf9b-3bad-4267-aecb-cc8bf311cf07-kube-api-access-rr6qv\") pod \"c1badf9b-3bad-4267-aecb-cc8bf311cf07\" (UID: \"c1badf9b-3bad-4267-aecb-cc8bf311cf07\") " Mar 20 09:05:12.875455 master-0 kubenswrapper[18707]: I0320 09:05:12.866537 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1badf9b-3bad-4267-aecb-cc8bf311cf07-operator-scripts\") pod \"c1badf9b-3bad-4267-aecb-cc8bf311cf07\" (UID: \"c1badf9b-3bad-4267-aecb-cc8bf311cf07\") " Mar 20 09:05:12.875455 master-0 kubenswrapper[18707]: I0320 09:05:12.867234 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:12.875455 master-0 kubenswrapper[18707]: E0320 09:05:12.867497 18707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 09:05:12.875455 master-0 kubenswrapper[18707]: E0320 09:05:12.867515 18707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 09:05:12.875455 master-0 kubenswrapper[18707]: E0320 09:05:12.867570 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift podName:4d537130-a515-4e6e-aedb-89a848cd477a nodeName:}" failed. No retries permitted until 2026-03-20 09:05:14.867552634 +0000 UTC m=+1460.023732990 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift") pod "swift-storage-0" (UID: "4d537130-a515-4e6e-aedb-89a848cd477a") : configmap "swift-ring-files" not found Mar 20 09:05:12.875455 master-0 kubenswrapper[18707]: I0320 09:05:12.871070 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1badf9b-3bad-4267-aecb-cc8bf311cf07-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1badf9b-3bad-4267-aecb-cc8bf311cf07" (UID: "c1badf9b-3bad-4267-aecb-cc8bf311cf07"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:12.876081 master-0 kubenswrapper[18707]: I0320 09:05:12.876032 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1badf9b-3bad-4267-aecb-cc8bf311cf07-kube-api-access-rr6qv" (OuterVolumeSpecName: "kube-api-access-rr6qv") pod "c1badf9b-3bad-4267-aecb-cc8bf311cf07" (UID: "c1badf9b-3bad-4267-aecb-cc8bf311cf07"). InnerVolumeSpecName "kube-api-access-rr6qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:12.969736 master-0 kubenswrapper[18707]: I0320 09:05:12.969672 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr6qv\" (UniqueName: \"kubernetes.io/projected/c1badf9b-3bad-4267-aecb-cc8bf311cf07-kube-api-access-rr6qv\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:12.969736 master-0 kubenswrapper[18707]: I0320 09:05:12.969719 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1badf9b-3bad-4267-aecb-cc8bf311cf07-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:13.116317 master-0 kubenswrapper[18707]: I0320 09:05:13.116268 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dqlc9" Mar 20 09:05:13.117918 master-0 kubenswrapper[18707]: I0320 09:05:13.117871 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cslxs" Mar 20 09:05:13.128634 master-0 kubenswrapper[18707]: I0320 09:05:13.128118 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ec41-account-create-update-wlbg9" event={"ID":"2a482121-8e91-4338-94d1-216c258dda1f","Type":"ContainerDied","Data":"3f25c4ad3df142ef1aa3479c55195a61fcd4d17cf79f7046b478382c643d6b4a"} Mar 20 09:05:13.128634 master-0 kubenswrapper[18707]: I0320 09:05:13.128167 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f25c4ad3df142ef1aa3479c55195a61fcd4d17cf79f7046b478382c643d6b4a" Mar 20 09:05:13.128634 master-0 kubenswrapper[18707]: I0320 09:05:13.128178 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cslxs" event={"ID":"c1badf9b-3bad-4267-aecb-cc8bf311cf07","Type":"ContainerDied","Data":"4ec2d264b5289dde29795bc8e121dffd062b81bfb34e01a86f3acf17313ed79d"} Mar 20 09:05:13.128634 master-0 kubenswrapper[18707]: I0320 09:05:13.128220 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ec2d264b5289dde29795bc8e121dffd062b81bfb34e01a86f3acf17313ed79d" Mar 20 09:05:13.128634 master-0 kubenswrapper[18707]: I0320 09:05:13.128229 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ffa9-account-create-update-22wx5" event={"ID":"d63fe82e-6808-4fe3-a55c-681f01ea78da","Type":"ContainerDied","Data":"a123cf2e2fb40011ca897233ddf5dfedeb280442206a033d86342440e84f627d"} Mar 20 09:05:13.128634 master-0 kubenswrapper[18707]: I0320 09:05:13.128241 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a123cf2e2fb40011ca897233ddf5dfedeb280442206a033d86342440e84f627d" Mar 20 09:05:13.129299 master-0 
kubenswrapper[18707]: I0320 09:05:13.129271 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dqlc9" Mar 20 09:05:13.129387 master-0 kubenswrapper[18707]: I0320 09:05:13.129322 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dqlc9" event={"ID":"51e675f7-a40f-4fc7-8e3f-227d95698d5d","Type":"ContainerDied","Data":"cea31dc58756500bc7da203aa6ad8e6cddf22e2b782c27ec2dd377f656c53d55"} Mar 20 09:05:13.129387 master-0 kubenswrapper[18707]: I0320 09:05:13.129372 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cea31dc58756500bc7da203aa6ad8e6cddf22e2b782c27ec2dd377f656c53d55" Mar 20 09:05:13.129976 master-0 kubenswrapper[18707]: I0320 09:05:13.129936 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ec41-account-create-update-wlbg9" Mar 20 09:05:13.148810 master-0 kubenswrapper[18707]: I0320 09:05:13.148757 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ffa9-account-create-update-22wx5" Mar 20 09:05:13.173456 master-0 kubenswrapper[18707]: I0320 09:05:13.172889 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jlzf\" (UniqueName: \"kubernetes.io/projected/51e675f7-a40f-4fc7-8e3f-227d95698d5d-kube-api-access-6jlzf\") pod \"51e675f7-a40f-4fc7-8e3f-227d95698d5d\" (UID: \"51e675f7-a40f-4fc7-8e3f-227d95698d5d\") " Mar 20 09:05:13.173456 master-0 kubenswrapper[18707]: I0320 09:05:13.173154 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e675f7-a40f-4fc7-8e3f-227d95698d5d-operator-scripts\") pod \"51e675f7-a40f-4fc7-8e3f-227d95698d5d\" (UID: \"51e675f7-a40f-4fc7-8e3f-227d95698d5d\") " Mar 20 09:05:13.173456 master-0 kubenswrapper[18707]: I0320 09:05:13.173382 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6hgj\" (UniqueName: \"kubernetes.io/projected/2a482121-8e91-4338-94d1-216c258dda1f-kube-api-access-m6hgj\") pod \"2a482121-8e91-4338-94d1-216c258dda1f\" (UID: \"2a482121-8e91-4338-94d1-216c258dda1f\") " Mar 20 09:05:13.173456 master-0 kubenswrapper[18707]: I0320 09:05:13.173425 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a482121-8e91-4338-94d1-216c258dda1f-operator-scripts\") pod \"2a482121-8e91-4338-94d1-216c258dda1f\" (UID: \"2a482121-8e91-4338-94d1-216c258dda1f\") " Mar 20 09:05:13.174544 master-0 kubenswrapper[18707]: I0320 09:05:13.173854 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51e675f7-a40f-4fc7-8e3f-227d95698d5d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51e675f7-a40f-4fc7-8e3f-227d95698d5d" (UID: "51e675f7-a40f-4fc7-8e3f-227d95698d5d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:13.174544 master-0 kubenswrapper[18707]: I0320 09:05:13.174198 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e675f7-a40f-4fc7-8e3f-227d95698d5d-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:13.175827 master-0 kubenswrapper[18707]: I0320 09:05:13.175775 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a482121-8e91-4338-94d1-216c258dda1f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a482121-8e91-4338-94d1-216c258dda1f" (UID: "2a482121-8e91-4338-94d1-216c258dda1f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:13.178726 master-0 kubenswrapper[18707]: I0320 09:05:13.178685 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a482121-8e91-4338-94d1-216c258dda1f-kube-api-access-m6hgj" (OuterVolumeSpecName: "kube-api-access-m6hgj") pod "2a482121-8e91-4338-94d1-216c258dda1f" (UID: "2a482121-8e91-4338-94d1-216c258dda1f"). InnerVolumeSpecName "kube-api-access-m6hgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:13.179143 master-0 kubenswrapper[18707]: I0320 09:05:13.179096 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51e675f7-a40f-4fc7-8e3f-227d95698d5d-kube-api-access-6jlzf" (OuterVolumeSpecName: "kube-api-access-6jlzf") pod "51e675f7-a40f-4fc7-8e3f-227d95698d5d" (UID: "51e675f7-a40f-4fc7-8e3f-227d95698d5d"). InnerVolumeSpecName "kube-api-access-6jlzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:13.244753 master-0 kubenswrapper[18707]: I0320 09:05:13.244709 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-l87pv"] Mar 20 09:05:13.276237 master-0 kubenswrapper[18707]: I0320 09:05:13.275221 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjv6p\" (UniqueName: \"kubernetes.io/projected/d63fe82e-6808-4fe3-a55c-681f01ea78da-kube-api-access-fjv6p\") pod \"d63fe82e-6808-4fe3-a55c-681f01ea78da\" (UID: \"d63fe82e-6808-4fe3-a55c-681f01ea78da\") " Mar 20 09:05:13.276237 master-0 kubenswrapper[18707]: I0320 09:05:13.275300 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d63fe82e-6808-4fe3-a55c-681f01ea78da-operator-scripts\") pod \"d63fe82e-6808-4fe3-a55c-681f01ea78da\" (UID: \"d63fe82e-6808-4fe3-a55c-681f01ea78da\") " Mar 20 09:05:13.276237 master-0 kubenswrapper[18707]: I0320 09:05:13.276071 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6hgj\" (UniqueName: \"kubernetes.io/projected/2a482121-8e91-4338-94d1-216c258dda1f-kube-api-access-m6hgj\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:13.276237 master-0 kubenswrapper[18707]: I0320 09:05:13.276087 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a482121-8e91-4338-94d1-216c258dda1f-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:13.276237 master-0 kubenswrapper[18707]: I0320 09:05:13.276098 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jlzf\" (UniqueName: \"kubernetes.io/projected/51e675f7-a40f-4fc7-8e3f-227d95698d5d-kube-api-access-6jlzf\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:13.276594 master-0 kubenswrapper[18707]: I0320 09:05:13.276573 18707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d63fe82e-6808-4fe3-a55c-681f01ea78da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d63fe82e-6808-4fe3-a55c-681f01ea78da" (UID: "d63fe82e-6808-4fe3-a55c-681f01ea78da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:13.278299 master-0 kubenswrapper[18707]: I0320 09:05:13.278124 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63fe82e-6808-4fe3-a55c-681f01ea78da-kube-api-access-fjv6p" (OuterVolumeSpecName: "kube-api-access-fjv6p") pod "d63fe82e-6808-4fe3-a55c-681f01ea78da" (UID: "d63fe82e-6808-4fe3-a55c-681f01ea78da"). InnerVolumeSpecName "kube-api-access-fjv6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:13.353982 master-0 kubenswrapper[18707]: I0320 09:05:13.353831 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fb00-account-create-update-9vcp7"] Mar 20 09:05:13.360883 master-0 kubenswrapper[18707]: W0320 09:05:13.360840 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cc50577_daf4_4a55_b1aa_2ae0b52e04d3.slice/crio-bf979bead3d3262883b5630a9472bdc00bbede632736a332c1f99fe5dbb583c0 WatchSource:0}: Error finding container bf979bead3d3262883b5630a9472bdc00bbede632736a332c1f99fe5dbb583c0: Status 404 returned error can't find the container with id bf979bead3d3262883b5630a9472bdc00bbede632736a332c1f99fe5dbb583c0 Mar 20 09:05:13.378310 master-0 kubenswrapper[18707]: I0320 09:05:13.377683 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjv6p\" (UniqueName: \"kubernetes.io/projected/d63fe82e-6808-4fe3-a55c-681f01ea78da-kube-api-access-fjv6p\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:13.378310 master-0 kubenswrapper[18707]: I0320 09:05:13.377719 18707 reconciler_common.go:293] "Volume detached 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d63fe82e-6808-4fe3-a55c-681f01ea78da-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:13.685351 master-0 kubenswrapper[18707]: I0320 09:05:13.683852 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2zgwx"] Mar 20 09:05:13.685351 master-0 kubenswrapper[18707]: E0320 09:05:13.684351 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1badf9b-3bad-4267-aecb-cc8bf311cf07" containerName="mariadb-database-create" Mar 20 09:05:13.685351 master-0 kubenswrapper[18707]: I0320 09:05:13.684368 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1badf9b-3bad-4267-aecb-cc8bf311cf07" containerName="mariadb-database-create" Mar 20 09:05:13.685351 master-0 kubenswrapper[18707]: E0320 09:05:13.684388 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51e675f7-a40f-4fc7-8e3f-227d95698d5d" containerName="mariadb-database-create" Mar 20 09:05:13.685351 master-0 kubenswrapper[18707]: I0320 09:05:13.684396 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="51e675f7-a40f-4fc7-8e3f-227d95698d5d" containerName="mariadb-database-create" Mar 20 09:05:13.685351 master-0 kubenswrapper[18707]: E0320 09:05:13.684438 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a482121-8e91-4338-94d1-216c258dda1f" containerName="mariadb-account-create-update" Mar 20 09:05:13.685351 master-0 kubenswrapper[18707]: I0320 09:05:13.684448 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a482121-8e91-4338-94d1-216c258dda1f" containerName="mariadb-account-create-update" Mar 20 09:05:13.685351 master-0 kubenswrapper[18707]: E0320 09:05:13.684461 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63fe82e-6808-4fe3-a55c-681f01ea78da" containerName="mariadb-account-create-update" Mar 20 09:05:13.685351 master-0 kubenswrapper[18707]: I0320 09:05:13.684468 18707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d63fe82e-6808-4fe3-a55c-681f01ea78da" containerName="mariadb-account-create-update" Mar 20 09:05:13.685351 master-0 kubenswrapper[18707]: I0320 09:05:13.684682 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d63fe82e-6808-4fe3-a55c-681f01ea78da" containerName="mariadb-account-create-update" Mar 20 09:05:13.685351 master-0 kubenswrapper[18707]: I0320 09:05:13.684700 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="51e675f7-a40f-4fc7-8e3f-227d95698d5d" containerName="mariadb-database-create" Mar 20 09:05:13.685351 master-0 kubenswrapper[18707]: I0320 09:05:13.684727 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1badf9b-3bad-4267-aecb-cc8bf311cf07" containerName="mariadb-database-create" Mar 20 09:05:13.685351 master-0 kubenswrapper[18707]: I0320 09:05:13.684753 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a482121-8e91-4338-94d1-216c258dda1f" containerName="mariadb-account-create-update" Mar 20 09:05:13.686120 master-0 kubenswrapper[18707]: I0320 09:05:13.685445 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.689416 master-0 kubenswrapper[18707]: I0320 09:05:13.689348 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 20 09:05:13.689703 master-0 kubenswrapper[18707]: I0320 09:05:13.689476 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 20 09:05:13.690277 master-0 kubenswrapper[18707]: I0320 09:05:13.690222 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 20 09:05:13.734427 master-0 kubenswrapper[18707]: I0320 09:05:13.733018 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-2zgwx"]
Mar 20 09:05:13.734427 master-0 kubenswrapper[18707]: E0320 09:05:13.734381 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-drxcn ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-drxcn ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-2zgwx" podUID="09f5c2e9-727f-43c8-b1f9-b54babbb5626"
Mar 20 09:05:13.765142 master-0 kubenswrapper[18707]: I0320 09:05:13.765062 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-p9cff"]
Mar 20 09:05:13.766687 master-0 kubenswrapper[18707]: I0320 09:05:13.766649 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.790594 master-0 kubenswrapper[18707]: I0320 09:05:13.790539 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/09f5c2e9-727f-43c8-b1f9-b54babbb5626-etc-swift\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.790594 master-0 kubenswrapper[18707]: I0320 09:05:13.790588 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-combined-ca-bundle\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.790861 master-0 kubenswrapper[18707]: I0320 09:05:13.790638 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/09f5c2e9-727f-43c8-b1f9-b54babbb5626-ring-data-devices\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.790861 master-0 kubenswrapper[18707]: I0320 09:05:13.790744 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-dispersionconf\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.790861 master-0 kubenswrapper[18707]: I0320 09:05:13.790785 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09f5c2e9-727f-43c8-b1f9-b54babbb5626-scripts\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.790861 master-0 kubenswrapper[18707]: I0320 09:05:13.790842 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-swiftconf\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.790861 master-0 kubenswrapper[18707]: I0320 09:05:13.790864 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drxcn\" (UniqueName: \"kubernetes.io/projected/09f5c2e9-727f-43c8-b1f9-b54babbb5626-kube-api-access-drxcn\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.801565 master-0 kubenswrapper[18707]: I0320 09:05:13.800870 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-2zgwx"]
Mar 20 09:05:13.823301 master-0 kubenswrapper[18707]: I0320 09:05:13.823204 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-p9cff"]
Mar 20 09:05:13.894403 master-0 kubenswrapper[18707]: I0320 09:05:13.893368 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-combined-ca-bundle\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.894403 master-0 kubenswrapper[18707]: I0320 09:05:13.893422 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae0d0efd-e9bb-4546-8606-82af8296bab1-etc-swift\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.894403 master-0 kubenswrapper[18707]: I0320 09:05:13.893441 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-combined-ca-bundle\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.894403 master-0 kubenswrapper[18707]: I0320 09:05:13.893470 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae0d0efd-e9bb-4546-8606-82af8296bab1-ring-data-devices\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.894403 master-0 kubenswrapper[18707]: I0320 09:05:13.893512 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/09f5c2e9-727f-43c8-b1f9-b54babbb5626-ring-data-devices\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.894403 master-0 kubenswrapper[18707]: I0320 09:05:13.893632 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae0d0efd-e9bb-4546-8606-82af8296bab1-scripts\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.894403 master-0 kubenswrapper[18707]: I0320 09:05:13.893965 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cgrc\" (UniqueName: \"kubernetes.io/projected/ae0d0efd-e9bb-4546-8606-82af8296bab1-kube-api-access-7cgrc\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.894403 master-0 kubenswrapper[18707]: I0320 09:05:13.894097 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-dispersionconf\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.894403 master-0 kubenswrapper[18707]: I0320 09:05:13.894162 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09f5c2e9-727f-43c8-b1f9-b54babbb5626-scripts\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.894403 master-0 kubenswrapper[18707]: I0320 09:05:13.894296 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-swiftconf\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.894403 master-0 kubenswrapper[18707]: I0320 09:05:13.894315 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/09f5c2e9-727f-43c8-b1f9-b54babbb5626-ring-data-devices\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.896498 master-0 kubenswrapper[18707]: I0320 09:05:13.894470 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drxcn\" (UniqueName: \"kubernetes.io/projected/09f5c2e9-727f-43c8-b1f9-b54babbb5626-kube-api-access-drxcn\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.896498 master-0 kubenswrapper[18707]: I0320 09:05:13.894521 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-dispersionconf\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.896498 master-0 kubenswrapper[18707]: I0320 09:05:13.894998 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09f5c2e9-727f-43c8-b1f9-b54babbb5626-scripts\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.896498 master-0 kubenswrapper[18707]: I0320 09:05:13.895382 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-swiftconf\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.896498 master-0 kubenswrapper[18707]: I0320 09:05:13.895558 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/09f5c2e9-727f-43c8-b1f9-b54babbb5626-etc-swift\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.896498 master-0 kubenswrapper[18707]: I0320 09:05:13.896033 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/09f5c2e9-727f-43c8-b1f9-b54babbb5626-etc-swift\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.899160 master-0 kubenswrapper[18707]: I0320 09:05:13.899094 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-dispersionconf\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.899530 master-0 kubenswrapper[18707]: I0320 09:05:13.899219 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-combined-ca-bundle\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.901970 master-0 kubenswrapper[18707]: I0320 09:05:13.901806 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-swiftconf\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.912858 master-0 kubenswrapper[18707]: I0320 09:05:13.912802 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drxcn\" (UniqueName: \"kubernetes.io/projected/09f5c2e9-727f-43c8-b1f9-b54babbb5626-kube-api-access-drxcn\") pod \"swift-ring-rebalance-2zgwx\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") " pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:13.998010 master-0 kubenswrapper[18707]: I0320 09:05:13.997835 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-swiftconf\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.998010 master-0 kubenswrapper[18707]: I0320 09:05:13.997951 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae0d0efd-e9bb-4546-8606-82af8296bab1-etc-swift\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.998399 master-0 kubenswrapper[18707]: I0320 09:05:13.998133 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-combined-ca-bundle\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.998399 master-0 kubenswrapper[18707]: I0320 09:05:13.998169 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae0d0efd-e9bb-4546-8606-82af8296bab1-ring-data-devices\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.998399 master-0 kubenswrapper[18707]: I0320 09:05:13.998274 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae0d0efd-e9bb-4546-8606-82af8296bab1-scripts\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.998399 master-0 kubenswrapper[18707]: I0320 09:05:13.998367 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cgrc\" (UniqueName: \"kubernetes.io/projected/ae0d0efd-e9bb-4546-8606-82af8296bab1-kube-api-access-7cgrc\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.998578 master-0 kubenswrapper[18707]: I0320 09:05:13.998426 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae0d0efd-e9bb-4546-8606-82af8296bab1-etc-swift\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.998578 master-0 kubenswrapper[18707]: I0320 09:05:13.998482 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-dispersionconf\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.999788 master-0 kubenswrapper[18707]: I0320 09:05:13.999041 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae0d0efd-e9bb-4546-8606-82af8296bab1-scripts\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:13.999788 master-0 kubenswrapper[18707]: I0320 09:05:13.999097 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae0d0efd-e9bb-4546-8606-82af8296bab1-ring-data-devices\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:14.001923 master-0 kubenswrapper[18707]: I0320 09:05:14.001868 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-combined-ca-bundle\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:14.002080 master-0 kubenswrapper[18707]: I0320 09:05:14.002043 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-swiftconf\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:14.002466 master-0 kubenswrapper[18707]: I0320 09:05:14.002432 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-dispersionconf\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:14.017276 master-0 kubenswrapper[18707]: I0320 09:05:14.016853 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cgrc\" (UniqueName: \"kubernetes.io/projected/ae0d0efd-e9bb-4546-8606-82af8296bab1-kube-api-access-7cgrc\") pod \"swift-ring-rebalance-p9cff\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:14.105436 master-0 kubenswrapper[18707]: I0320 09:05:14.096661 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-nc4r5"]
Mar 20 09:05:14.105436 master-0 kubenswrapper[18707]: I0320 09:05:14.098311 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nc4r5"
Mar 20 09:05:14.113937 master-0 kubenswrapper[18707]: I0320 09:05:14.113869 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p9cff"
Mar 20 09:05:14.114900 master-0 kubenswrapper[18707]: I0320 09:05:14.114700 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 20 09:05:14.124352 master-0 kubenswrapper[18707]: I0320 09:05:14.124241 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nc4r5"]
Mar 20 09:05:14.140147 master-0 kubenswrapper[18707]: I0320 09:05:14.140092 18707 generic.go:334] "Generic (PLEG): container finished" podID="1cc50577-daf4-4a55-b1aa-2ae0b52e04d3" containerID="4d9f7b2cd1e6116f3e6dd74bac6715ca5e47aa39a73a16abdd32a9c48b8d1101" exitCode=0
Mar 20 09:05:14.141288 master-0 kubenswrapper[18707]: I0320 09:05:14.140197 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fb00-account-create-update-9vcp7" event={"ID":"1cc50577-daf4-4a55-b1aa-2ae0b52e04d3","Type":"ContainerDied","Data":"4d9f7b2cd1e6116f3e6dd74bac6715ca5e47aa39a73a16abdd32a9c48b8d1101"}
Mar 20 09:05:14.141288 master-0 kubenswrapper[18707]: I0320 09:05:14.140250 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fb00-account-create-update-9vcp7" event={"ID":"1cc50577-daf4-4a55-b1aa-2ae0b52e04d3","Type":"ContainerStarted","Data":"bf979bead3d3262883b5630a9472bdc00bbede632736a332c1f99fe5dbb583c0"}
Mar 20 09:05:14.142263 master-0 kubenswrapper[18707]: I0320 09:05:14.142222 18707 generic.go:334] "Generic (PLEG): container finished" podID="bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a" containerID="53c19949948cfde5dd778064ab56e6214cf9de8fac4728709b7e8b6eee84d38a" exitCode=0
Mar 20 09:05:14.142751 master-0 kubenswrapper[18707]: I0320 09:05:14.142724 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:14.143710 master-0 kubenswrapper[18707]: I0320 09:05:14.143659 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-l87pv" event={"ID":"bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a","Type":"ContainerDied","Data":"53c19949948cfde5dd778064ab56e6214cf9de8fac4728709b7e8b6eee84d38a"}
Mar 20 09:05:14.143710 master-0 kubenswrapper[18707]: I0320 09:05:14.143692 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-l87pv" event={"ID":"bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a","Type":"ContainerStarted","Data":"268e275433a25cc45125f44e1e14e8d52c0a2509641c20845ea0e4a6c0b136fe"}
Mar 20 09:05:14.144571 master-0 kubenswrapper[18707]: I0320 09:05:14.144542 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ec41-account-create-update-wlbg9"
Mar 20 09:05:14.145007 master-0 kubenswrapper[18707]: I0320 09:05:14.144609 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ffa9-account-create-update-22wx5"
Mar 20 09:05:14.201246 master-0 kubenswrapper[18707]: I0320 09:05:14.201130 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:14.202955 master-0 kubenswrapper[18707]: I0320 09:05:14.202523 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v68nc\" (UniqueName: \"kubernetes.io/projected/37b99a3a-0f6d-4c4d-a71a-940816b034d3-kube-api-access-v68nc\") pod \"root-account-create-update-nc4r5\" (UID: \"37b99a3a-0f6d-4c4d-a71a-940816b034d3\") " pod="openstack/root-account-create-update-nc4r5"
Mar 20 09:05:14.203025 master-0 kubenswrapper[18707]: I0320 09:05:14.202948 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b99a3a-0f6d-4c4d-a71a-940816b034d3-operator-scripts\") pod \"root-account-create-update-nc4r5\" (UID: \"37b99a3a-0f6d-4c4d-a71a-940816b034d3\") " pod="openstack/root-account-create-update-nc4r5"
Mar 20 09:05:14.304925 master-0 kubenswrapper[18707]: I0320 09:05:14.304861 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-combined-ca-bundle\") pod \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") "
Mar 20 09:05:14.304925 master-0 kubenswrapper[18707]: I0320 09:05:14.304917 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drxcn\" (UniqueName: \"kubernetes.io/projected/09f5c2e9-727f-43c8-b1f9-b54babbb5626-kube-api-access-drxcn\") pod \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") "
Mar 20 09:05:14.305246 master-0 kubenswrapper[18707]: I0320 09:05:14.305068 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/09f5c2e9-727f-43c8-b1f9-b54babbb5626-etc-swift\") pod \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") "
Mar 20 09:05:14.305246 master-0 kubenswrapper[18707]: I0320 09:05:14.305101 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/09f5c2e9-727f-43c8-b1f9-b54babbb5626-ring-data-devices\") pod \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") "
Mar 20 09:05:14.305246 master-0 kubenswrapper[18707]: I0320 09:05:14.305226 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-dispersionconf\") pod \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") "
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.305319 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09f5c2e9-727f-43c8-b1f9-b54babbb5626-scripts\") pod \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") "
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.305367 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-swiftconf\") pod \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\" (UID: \"09f5c2e9-727f-43c8-b1f9-b54babbb5626\") "
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.305769 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09f5c2e9-727f-43c8-b1f9-b54babbb5626-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "09f5c2e9-727f-43c8-b1f9-b54babbb5626" (UID: "09f5c2e9-727f-43c8-b1f9-b54babbb5626"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.305798 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v68nc\" (UniqueName: \"kubernetes.io/projected/37b99a3a-0f6d-4c4d-a71a-940816b034d3-kube-api-access-v68nc\") pod \"root-account-create-update-nc4r5\" (UID: \"37b99a3a-0f6d-4c4d-a71a-940816b034d3\") " pod="openstack/root-account-create-update-nc4r5"
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.305852 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b99a3a-0f6d-4c4d-a71a-940816b034d3-operator-scripts\") pod \"root-account-create-update-nc4r5\" (UID: \"37b99a3a-0f6d-4c4d-a71a-940816b034d3\") " pod="openstack/root-account-create-update-nc4r5"
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.305936 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f5c2e9-727f-43c8-b1f9-b54babbb5626-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "09f5c2e9-727f-43c8-b1f9-b54babbb5626" (UID: "09f5c2e9-727f-43c8-b1f9-b54babbb5626"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.306068 18707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/09f5c2e9-727f-43c8-b1f9-b54babbb5626-etc-swift\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.306083 18707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/09f5c2e9-727f-43c8-b1f9-b54babbb5626-ring-data-devices\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.306494 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09f5c2e9-727f-43c8-b1f9-b54babbb5626-scripts" (OuterVolumeSpecName: "scripts") pod "09f5c2e9-727f-43c8-b1f9-b54babbb5626" (UID: "09f5c2e9-727f-43c8-b1f9-b54babbb5626"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.306876 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b99a3a-0f6d-4c4d-a71a-940816b034d3-operator-scripts\") pod \"root-account-create-update-nc4r5\" (UID: \"37b99a3a-0f6d-4c4d-a71a-940816b034d3\") " pod="openstack/root-account-create-update-nc4r5"
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.308334 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09f5c2e9-727f-43c8-b1f9-b54babbb5626" (UID: "09f5c2e9-727f-43c8-b1f9-b54babbb5626"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.309064 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "09f5c2e9-727f-43c8-b1f9-b54babbb5626" (UID: "09f5c2e9-727f-43c8-b1f9-b54babbb5626"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.309157 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f5c2e9-727f-43c8-b1f9-b54babbb5626-kube-api-access-drxcn" (OuterVolumeSpecName: "kube-api-access-drxcn") pod "09f5c2e9-727f-43c8-b1f9-b54babbb5626" (UID: "09f5c2e9-727f-43c8-b1f9-b54babbb5626"). InnerVolumeSpecName "kube-api-access-drxcn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:05:14.312906 master-0 kubenswrapper[18707]: I0320 09:05:14.311519 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "09f5c2e9-727f-43c8-b1f9-b54babbb5626" (UID: "09f5c2e9-727f-43c8-b1f9-b54babbb5626"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:05:14.321037 master-0 kubenswrapper[18707]: I0320 09:05:14.320992 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v68nc\" (UniqueName: \"kubernetes.io/projected/37b99a3a-0f6d-4c4d-a71a-940816b034d3-kube-api-access-v68nc\") pod \"root-account-create-update-nc4r5\" (UID: \"37b99a3a-0f6d-4c4d-a71a-940816b034d3\") " pod="openstack/root-account-create-update-nc4r5"
Mar 20 09:05:14.412201 master-0 kubenswrapper[18707]: I0320 09:05:14.409211 18707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-swiftconf\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:14.412201 master-0 kubenswrapper[18707]: I0320 09:05:14.409266 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:14.412201 master-0 kubenswrapper[18707]: I0320 09:05:14.409284 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drxcn\" (UniqueName: \"kubernetes.io/projected/09f5c2e9-727f-43c8-b1f9-b54babbb5626-kube-api-access-drxcn\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:14.412201 master-0 kubenswrapper[18707]: I0320 09:05:14.409295 18707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/09f5c2e9-727f-43c8-b1f9-b54babbb5626-dispersionconf\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:14.412201 master-0 kubenswrapper[18707]: I0320 09:05:14.409306 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09f5c2e9-727f-43c8-b1f9-b54babbb5626-scripts\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:14.489864 master-0 kubenswrapper[18707]: I0320 09:05:14.489810 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nc4r5"
Mar 20 09:05:14.586220 master-0 kubenswrapper[18707]: I0320 09:05:14.583839 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-p9cff"]
Mar 20 09:05:14.591544 master-0 kubenswrapper[18707]: W0320 09:05:14.589355 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae0d0efd_e9bb_4546_8606_82af8296bab1.slice/crio-f62e2073ab6c228a8186e030940fe47e27fe057145f53218711cb98a9a92eea2 WatchSource:0}: Error finding container f62e2073ab6c228a8186e030940fe47e27fe057145f53218711cb98a9a92eea2: Status 404 returned error can't find the container with id f62e2073ab6c228a8186e030940fe47e27fe057145f53218711cb98a9a92eea2
Mar 20 09:05:14.959130 master-0 kubenswrapper[18707]: I0320 09:05:14.923385 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0"
Mar 20 09:05:14.959130 master-0 kubenswrapper[18707]: E0320 09:05:14.923623 18707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 09:05:14.959130 master-0 kubenswrapper[18707]: E0320 09:05:14.923640 18707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 09:05:14.959130 master-0 kubenswrapper[18707]: E0320 09:05:14.923703 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift podName:4d537130-a515-4e6e-aedb-89a848cd477a nodeName:}" failed. No retries permitted until 2026-03-20 09:05:18.923685421 +0000 UTC m=+1464.079865777 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift") pod "swift-storage-0" (UID: "4d537130-a515-4e6e-aedb-89a848cd477a") : configmap "swift-ring-files" not found
Mar 20 09:05:15.012040 master-0 kubenswrapper[18707]: I0320 09:05:15.011943 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nc4r5"]
Mar 20 09:05:15.031748 master-0 kubenswrapper[18707]: W0320 09:05:15.030997 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37b99a3a_0f6d_4c4d_a71a_940816b034d3.slice/crio-f11c48695b9a017f5a49eabe87f04fb0780d69fdf3ec7bc2e6526bd7d13a09ed WatchSource:0}: Error finding container f11c48695b9a017f5a49eabe87f04fb0780d69fdf3ec7bc2e6526bd7d13a09ed: Status 404 returned error can't find the container with id f11c48695b9a017f5a49eabe87f04fb0780d69fdf3ec7bc2e6526bd7d13a09ed
Mar 20 09:05:15.159906 master-0 kubenswrapper[18707]: I0320 09:05:15.159851 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nc4r5" event={"ID":"37b99a3a-0f6d-4c4d-a71a-940816b034d3","Type":"ContainerStarted","Data":"f11c48695b9a017f5a49eabe87f04fb0780d69fdf3ec7bc2e6526bd7d13a09ed"}
Mar 20 09:05:15.161696 master-0 kubenswrapper[18707]: I0320 09:05:15.161628 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p9cff" event={"ID":"ae0d0efd-e9bb-4546-8606-82af8296bab1","Type":"ContainerStarted","Data":"f62e2073ab6c228a8186e030940fe47e27fe057145f53218711cb98a9a92eea2"}
Mar 20 09:05:15.161696 master-0 kubenswrapper[18707]: I0320 09:05:15.161663 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2zgwx"
Mar 20 09:05:15.216137 master-0 kubenswrapper[18707]: I0320 09:05:15.215865 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-2zgwx"]
Mar 20 09:05:15.225448 master-0 kubenswrapper[18707]: I0320 09:05:15.225045 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-2zgwx"]
Mar 20 09:05:15.587748 master-0 kubenswrapper[18707]: E0320 09:05:15.587019 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37b99a3a_0f6d_4c4d_a71a_940816b034d3.slice/crio-conmon-01276e1df3863b98d6c4a0c4fd58f71f07c78c8d74fb4c327558a43a8ec346e8.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 09:05:15.658890 master-0 kubenswrapper[18707]: I0320 09:05:15.658555 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-l87pv"
Mar 20 09:05:15.749010 master-0 kubenswrapper[18707]: I0320 09:05:15.747630 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx6tx\" (UniqueName: \"kubernetes.io/projected/bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a-kube-api-access-bx6tx\") pod \"bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a\" (UID: \"bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a\") "
Mar 20 09:05:15.749010 master-0 kubenswrapper[18707]: I0320 09:05:15.748124 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a-operator-scripts\") pod \"bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a\" (UID: \"bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a\") "
Mar 20 09:05:15.749283 master-0 kubenswrapper[18707]: I0320 09:05:15.749135 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a" (UID: "bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:05:15.749874 master-0 kubenswrapper[18707]: I0320 09:05:15.749824 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:15.753337 master-0 kubenswrapper[18707]: I0320 09:05:15.752349 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a-kube-api-access-bx6tx" (OuterVolumeSpecName: "kube-api-access-bx6tx") pod "bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a" (UID: "bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a"). InnerVolumeSpecName "kube-api-access-bx6tx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:05:15.791082 master-0 kubenswrapper[18707]: I0320 09:05:15.791025 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fb00-account-create-update-9vcp7" Mar 20 09:05:15.850356 master-0 kubenswrapper[18707]: I0320 09:05:15.850290 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbpbn\" (UniqueName: \"kubernetes.io/projected/1cc50577-daf4-4a55-b1aa-2ae0b52e04d3-kube-api-access-cbpbn\") pod \"1cc50577-daf4-4a55-b1aa-2ae0b52e04d3\" (UID: \"1cc50577-daf4-4a55-b1aa-2ae0b52e04d3\") " Mar 20 09:05:15.850563 master-0 kubenswrapper[18707]: I0320 09:05:15.850476 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc50577-daf4-4a55-b1aa-2ae0b52e04d3-operator-scripts\") pod \"1cc50577-daf4-4a55-b1aa-2ae0b52e04d3\" (UID: \"1cc50577-daf4-4a55-b1aa-2ae0b52e04d3\") " Mar 20 09:05:15.850968 master-0 kubenswrapper[18707]: I0320 09:05:15.850943 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx6tx\" (UniqueName: \"kubernetes.io/projected/bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a-kube-api-access-bx6tx\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:15.851491 master-0 kubenswrapper[18707]: I0320 09:05:15.851460 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cc50577-daf4-4a55-b1aa-2ae0b52e04d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1cc50577-daf4-4a55-b1aa-2ae0b52e04d3" (UID: "1cc50577-daf4-4a55-b1aa-2ae0b52e04d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:15.855254 master-0 kubenswrapper[18707]: I0320 09:05:15.854472 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cc50577-daf4-4a55-b1aa-2ae0b52e04d3-kube-api-access-cbpbn" (OuterVolumeSpecName: "kube-api-access-cbpbn") pod "1cc50577-daf4-4a55-b1aa-2ae0b52e04d3" (UID: "1cc50577-daf4-4a55-b1aa-2ae0b52e04d3"). 
InnerVolumeSpecName "kube-api-access-cbpbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:15.953434 master-0 kubenswrapper[18707]: I0320 09:05:15.953383 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbpbn\" (UniqueName: \"kubernetes.io/projected/1cc50577-daf4-4a55-b1aa-2ae0b52e04d3-kube-api-access-cbpbn\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:15.953699 master-0 kubenswrapper[18707]: I0320 09:05:15.953682 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cc50577-daf4-4a55-b1aa-2ae0b52e04d3-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:16.174497 master-0 kubenswrapper[18707]: I0320 09:05:16.174433 18707 generic.go:334] "Generic (PLEG): container finished" podID="37b99a3a-0f6d-4c4d-a71a-940816b034d3" containerID="01276e1df3863b98d6c4a0c4fd58f71f07c78c8d74fb4c327558a43a8ec346e8" exitCode=0 Mar 20 09:05:16.175158 master-0 kubenswrapper[18707]: I0320 09:05:16.174509 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nc4r5" event={"ID":"37b99a3a-0f6d-4c4d-a71a-940816b034d3","Type":"ContainerDied","Data":"01276e1df3863b98d6c4a0c4fd58f71f07c78c8d74fb4c327558a43a8ec346e8"} Mar 20 09:05:16.176401 master-0 kubenswrapper[18707]: I0320 09:05:16.176346 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fb00-account-create-update-9vcp7" event={"ID":"1cc50577-daf4-4a55-b1aa-2ae0b52e04d3","Type":"ContainerDied","Data":"bf979bead3d3262883b5630a9472bdc00bbede632736a332c1f99fe5dbb583c0"} Mar 20 09:05:16.176461 master-0 kubenswrapper[18707]: I0320 09:05:16.176410 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf979bead3d3262883b5630a9472bdc00bbede632736a332c1f99fe5dbb583c0" Mar 20 09:05:16.176512 master-0 kubenswrapper[18707]: I0320 09:05:16.176501 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fb00-account-create-update-9vcp7" Mar 20 09:05:16.183812 master-0 kubenswrapper[18707]: I0320 09:05:16.183744 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-l87pv" event={"ID":"bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a","Type":"ContainerDied","Data":"268e275433a25cc45125f44e1e14e8d52c0a2509641c20845ea0e4a6c0b136fe"} Mar 20 09:05:16.183812 master-0 kubenswrapper[18707]: I0320 09:05:16.183806 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="268e275433a25cc45125f44e1e14e8d52c0a2509641c20845ea0e4a6c0b136fe" Mar 20 09:05:16.184005 master-0 kubenswrapper[18707]: I0320 09:05:16.183826 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-l87pv" Mar 20 09:05:17.111596 master-0 kubenswrapper[18707]: I0320 09:05:17.111528 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f5c2e9-727f-43c8-b1f9-b54babbb5626" path="/var/lib/kubelet/pods/09f5c2e9-727f-43c8-b1f9-b54babbb5626/volumes" Mar 20 09:05:17.562173 master-0 kubenswrapper[18707]: I0320 09:05:17.562109 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-662t6"] Mar 20 09:05:17.564810 master-0 kubenswrapper[18707]: E0320 09:05:17.562811 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc50577-daf4-4a55-b1aa-2ae0b52e04d3" containerName="mariadb-account-create-update" Mar 20 09:05:17.564810 master-0 kubenswrapper[18707]: I0320 09:05:17.562839 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc50577-daf4-4a55-b1aa-2ae0b52e04d3" containerName="mariadb-account-create-update" Mar 20 09:05:17.564810 master-0 kubenswrapper[18707]: E0320 09:05:17.562872 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a" containerName="mariadb-database-create" Mar 20 09:05:17.564810 master-0 kubenswrapper[18707]: 
I0320 09:05:17.562881 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a" containerName="mariadb-database-create" Mar 20 09:05:17.564810 master-0 kubenswrapper[18707]: I0320 09:05:17.563209 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a" containerName="mariadb-database-create" Mar 20 09:05:17.564810 master-0 kubenswrapper[18707]: I0320 09:05:17.563262 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc50577-daf4-4a55-b1aa-2ae0b52e04d3" containerName="mariadb-account-create-update" Mar 20 09:05:17.564810 master-0 kubenswrapper[18707]: I0320 09:05:17.564429 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-662t6" Mar 20 09:05:17.569233 master-0 kubenswrapper[18707]: I0320 09:05:17.566763 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-27086-config-data" Mar 20 09:05:17.575217 master-0 kubenswrapper[18707]: I0320 09:05:17.573755 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-662t6"] Mar 20 09:05:17.594465 master-0 kubenswrapper[18707]: I0320 09:05:17.591295 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-db-sync-config-data\") pod \"glance-db-sync-662t6\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " pod="openstack/glance-db-sync-662t6" Mar 20 09:05:17.594465 master-0 kubenswrapper[18707]: I0320 09:05:17.591404 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw4wt\" (UniqueName: \"kubernetes.io/projected/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-kube-api-access-jw4wt\") pod \"glance-db-sync-662t6\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " pod="openstack/glance-db-sync-662t6" Mar 20 
09:05:17.594465 master-0 kubenswrapper[18707]: I0320 09:05:17.591435 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-config-data\") pod \"glance-db-sync-662t6\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " pod="openstack/glance-db-sync-662t6" Mar 20 09:05:17.594465 master-0 kubenswrapper[18707]: I0320 09:05:17.591488 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-combined-ca-bundle\") pod \"glance-db-sync-662t6\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " pod="openstack/glance-db-sync-662t6" Mar 20 09:05:17.693634 master-0 kubenswrapper[18707]: I0320 09:05:17.693539 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-combined-ca-bundle\") pod \"glance-db-sync-662t6\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " pod="openstack/glance-db-sync-662t6" Mar 20 09:05:17.694367 master-0 kubenswrapper[18707]: I0320 09:05:17.694296 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-db-sync-config-data\") pod \"glance-db-sync-662t6\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " pod="openstack/glance-db-sync-662t6" Mar 20 09:05:17.694711 master-0 kubenswrapper[18707]: I0320 09:05:17.694669 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw4wt\" (UniqueName: \"kubernetes.io/projected/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-kube-api-access-jw4wt\") pod \"glance-db-sync-662t6\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " pod="openstack/glance-db-sync-662t6" Mar 20 
09:05:17.694779 master-0 kubenswrapper[18707]: I0320 09:05:17.694746 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-config-data\") pod \"glance-db-sync-662t6\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " pod="openstack/glance-db-sync-662t6" Mar 20 09:05:17.708000 master-0 kubenswrapper[18707]: I0320 09:05:17.698885 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-config-data\") pod \"glance-db-sync-662t6\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " pod="openstack/glance-db-sync-662t6" Mar 20 09:05:17.708000 master-0 kubenswrapper[18707]: I0320 09:05:17.698892 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-db-sync-config-data\") pod \"glance-db-sync-662t6\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " pod="openstack/glance-db-sync-662t6" Mar 20 09:05:17.712098 master-0 kubenswrapper[18707]: I0320 09:05:17.712061 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-combined-ca-bundle\") pod \"glance-db-sync-662t6\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " pod="openstack/glance-db-sync-662t6" Mar 20 09:05:17.713100 master-0 kubenswrapper[18707]: I0320 09:05:17.713043 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw4wt\" (UniqueName: \"kubernetes.io/projected/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-kube-api-access-jw4wt\") pod \"glance-db-sync-662t6\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " pod="openstack/glance-db-sync-662t6" Mar 20 09:05:17.899729 master-0 kubenswrapper[18707]: I0320 09:05:17.899591 18707 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-662t6" Mar 20 09:05:18.679556 master-0 kubenswrapper[18707]: I0320 09:05:18.679491 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nc4r5" Mar 20 09:05:18.823747 master-0 kubenswrapper[18707]: I0320 09:05:18.823685 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b99a3a-0f6d-4c4d-a71a-940816b034d3-operator-scripts\") pod \"37b99a3a-0f6d-4c4d-a71a-940816b034d3\" (UID: \"37b99a3a-0f6d-4c4d-a71a-940816b034d3\") " Mar 20 09:05:18.823983 master-0 kubenswrapper[18707]: I0320 09:05:18.823869 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v68nc\" (UniqueName: \"kubernetes.io/projected/37b99a3a-0f6d-4c4d-a71a-940816b034d3-kube-api-access-v68nc\") pod \"37b99a3a-0f6d-4c4d-a71a-940816b034d3\" (UID: \"37b99a3a-0f6d-4c4d-a71a-940816b034d3\") " Mar 20 09:05:18.825975 master-0 kubenswrapper[18707]: I0320 09:05:18.825942 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b99a3a-0f6d-4c4d-a71a-940816b034d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37b99a3a-0f6d-4c4d-a71a-940816b034d3" (UID: "37b99a3a-0f6d-4c4d-a71a-940816b034d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:18.832796 master-0 kubenswrapper[18707]: I0320 09:05:18.832724 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b99a3a-0f6d-4c4d-a71a-940816b034d3-kube-api-access-v68nc" (OuterVolumeSpecName: "kube-api-access-v68nc") pod "37b99a3a-0f6d-4c4d-a71a-940816b034d3" (UID: "37b99a3a-0f6d-4c4d-a71a-940816b034d3"). InnerVolumeSpecName "kube-api-access-v68nc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:18.926996 master-0 kubenswrapper[18707]: I0320 09:05:18.926909 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:18.927203 master-0 kubenswrapper[18707]: I0320 09:05:18.927064 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v68nc\" (UniqueName: \"kubernetes.io/projected/37b99a3a-0f6d-4c4d-a71a-940816b034d3-kube-api-access-v68nc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:18.927203 master-0 kubenswrapper[18707]: I0320 09:05:18.927084 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b99a3a-0f6d-4c4d-a71a-940816b034d3-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:18.927321 master-0 kubenswrapper[18707]: E0320 09:05:18.927213 18707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 09:05:18.927321 master-0 kubenswrapper[18707]: E0320 09:05:18.927228 18707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 09:05:18.927321 master-0 kubenswrapper[18707]: E0320 09:05:18.927301 18707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift podName:4d537130-a515-4e6e-aedb-89a848cd477a nodeName:}" failed. No retries permitted until 2026-03-20 09:05:26.927267017 +0000 UTC m=+1472.083447373 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift") pod "swift-storage-0" (UID: "4d537130-a515-4e6e-aedb-89a848cd477a") : configmap "swift-ring-files" not found Mar 20 09:05:19.141647 master-0 kubenswrapper[18707]: I0320 09:05:19.137821 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-662t6"] Mar 20 09:05:19.231949 master-0 kubenswrapper[18707]: I0320 09:05:19.231900 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nc4r5" event={"ID":"37b99a3a-0f6d-4c4d-a71a-940816b034d3","Type":"ContainerDied","Data":"f11c48695b9a017f5a49eabe87f04fb0780d69fdf3ec7bc2e6526bd7d13a09ed"} Mar 20 09:05:19.231949 master-0 kubenswrapper[18707]: I0320 09:05:19.231946 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f11c48695b9a017f5a49eabe87f04fb0780d69fdf3ec7bc2e6526bd7d13a09ed" Mar 20 09:05:19.232325 master-0 kubenswrapper[18707]: I0320 09:05:19.232305 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-nc4r5" Mar 20 09:05:19.234122 master-0 kubenswrapper[18707]: I0320 09:05:19.234079 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-662t6" event={"ID":"dd5f8067-9874-4ff3-a9fb-a3251cc3622d","Type":"ContainerStarted","Data":"0b4b7ae2200fe76848331e15d12821d473a328643c8836b61a6128fcf0b43bd7"} Mar 20 09:05:19.236360 master-0 kubenswrapper[18707]: I0320 09:05:19.236323 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p9cff" event={"ID":"ae0d0efd-e9bb-4546-8606-82af8296bab1","Type":"ContainerStarted","Data":"46ed3721e06910055a152852f40f3a63311fe7e5ec22fe6e6248404148f81795"} Mar 20 09:05:19.277211 master-0 kubenswrapper[18707]: I0320 09:05:19.271776 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-p9cff" podStartSLOduration=2.265626632 podStartE2EDuration="6.271754112s" podCreationTimestamp="2026-03-20 09:05:13 +0000 UTC" firstStartedPulling="2026-03-20 09:05:14.590916791 +0000 UTC m=+1459.747097147" lastFinishedPulling="2026-03-20 09:05:18.597044271 +0000 UTC m=+1463.753224627" observedRunningTime="2026-03-20 09:05:19.266317346 +0000 UTC m=+1464.422497702" watchObservedRunningTime="2026-03-20 09:05:19.271754112 +0000 UTC m=+1464.427934478" Mar 20 09:05:19.531951 master-0 kubenswrapper[18707]: I0320 09:05:19.531759 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:05:19.657275 master-0 kubenswrapper[18707]: I0320 09:05:19.655453 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-47t6r"] Mar 20 09:05:19.657275 master-0 kubenswrapper[18707]: I0320 09:05:19.655698 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" podUID="01a3ea60-7016-443c-9fb2-aefe2bd1ee89" 
containerName="dnsmasq-dns" containerID="cri-o://6ad23b2a7e70eee9f718200f4669404d729789cbf65e935f164c88a96c881295" gracePeriod=10 Mar 20 09:05:19.912209 master-0 kubenswrapper[18707]: I0320 09:05:19.912125 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-nc4r5"] Mar 20 09:05:19.921339 master-0 kubenswrapper[18707]: I0320 09:05:19.920634 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-nc4r5"] Mar 20 09:05:20.250241 master-0 kubenswrapper[18707]: I0320 09:05:20.250169 18707 generic.go:334] "Generic (PLEG): container finished" podID="01a3ea60-7016-443c-9fb2-aefe2bd1ee89" containerID="6ad23b2a7e70eee9f718200f4669404d729789cbf65e935f164c88a96c881295" exitCode=0 Mar 20 09:05:20.250427 master-0 kubenswrapper[18707]: I0320 09:05:20.250227 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" event={"ID":"01a3ea60-7016-443c-9fb2-aefe2bd1ee89","Type":"ContainerDied","Data":"6ad23b2a7e70eee9f718200f4669404d729789cbf65e935f164c88a96c881295"} Mar 20 09:05:20.250427 master-0 kubenswrapper[18707]: I0320 09:05:20.250289 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" event={"ID":"01a3ea60-7016-443c-9fb2-aefe2bd1ee89","Type":"ContainerDied","Data":"82bee70d4c604fe1302f41dbcc75ae958b2f7d127c50166358639a16e86586d7"} Mar 20 09:05:20.250427 master-0 kubenswrapper[18707]: I0320 09:05:20.250305 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82bee70d4c604fe1302f41dbcc75ae958b2f7d127c50166358639a16e86586d7" Mar 20 09:05:20.320678 master-0 kubenswrapper[18707]: I0320 09:05:20.320609 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:05:20.405133 master-0 kubenswrapper[18707]: I0320 09:05:20.405009 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nvfv\" (UniqueName: \"kubernetes.io/projected/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-kube-api-access-4nvfv\") pod \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\" (UID: \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\") " Mar 20 09:05:20.405133 master-0 kubenswrapper[18707]: I0320 09:05:20.405060 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-dns-svc\") pod \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\" (UID: \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\") " Mar 20 09:05:20.405420 master-0 kubenswrapper[18707]: I0320 09:05:20.405226 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-config\") pod \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\" (UID: \"01a3ea60-7016-443c-9fb2-aefe2bd1ee89\") " Mar 20 09:05:20.409125 master-0 kubenswrapper[18707]: I0320 09:05:20.409071 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-kube-api-access-4nvfv" (OuterVolumeSpecName: "kube-api-access-4nvfv") pod "01a3ea60-7016-443c-9fb2-aefe2bd1ee89" (UID: "01a3ea60-7016-443c-9fb2-aefe2bd1ee89"). InnerVolumeSpecName "kube-api-access-4nvfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:20.459199 master-0 kubenswrapper[18707]: I0320 09:05:20.459109 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-config" (OuterVolumeSpecName: "config") pod "01a3ea60-7016-443c-9fb2-aefe2bd1ee89" (UID: "01a3ea60-7016-443c-9fb2-aefe2bd1ee89"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:20.468773 master-0 kubenswrapper[18707]: I0320 09:05:20.468672 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01a3ea60-7016-443c-9fb2-aefe2bd1ee89" (UID: "01a3ea60-7016-443c-9fb2-aefe2bd1ee89"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:20.508171 master-0 kubenswrapper[18707]: I0320 09:05:20.508103 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nvfv\" (UniqueName: \"kubernetes.io/projected/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-kube-api-access-4nvfv\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:20.508171 master-0 kubenswrapper[18707]: I0320 09:05:20.508160 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:20.508171 master-0 kubenswrapper[18707]: I0320 09:05:20.508175 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01a3ea60-7016-443c-9fb2-aefe2bd1ee89-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:21.113211 master-0 kubenswrapper[18707]: I0320 09:05:21.113145 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b99a3a-0f6d-4c4d-a71a-940816b034d3" path="/var/lib/kubelet/pods/37b99a3a-0f6d-4c4d-a71a-940816b034d3/volumes" Mar 20 09:05:21.261509 master-0 kubenswrapper[18707]: I0320 09:05:21.261452 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-47t6r" Mar 20 09:05:21.303474 master-0 kubenswrapper[18707]: I0320 09:05:21.303421 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-47t6r"] Mar 20 09:05:21.319333 master-0 kubenswrapper[18707]: I0320 09:05:21.319273 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-47t6r"] Mar 20 09:05:23.017514 master-0 kubenswrapper[18707]: I0320 09:05:23.017430 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 09:05:23.114677 master-0 kubenswrapper[18707]: I0320 09:05:23.114547 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01a3ea60-7016-443c-9fb2-aefe2bd1ee89" path="/var/lib/kubelet/pods/01a3ea60-7016-443c-9fb2-aefe2bd1ee89/volumes" Mar 20 09:05:24.149215 master-0 kubenswrapper[18707]: I0320 09:05:24.140956 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-54pxn"] Mar 20 09:05:24.149215 master-0 kubenswrapper[18707]: E0320 09:05:24.142862 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b99a3a-0f6d-4c4d-a71a-940816b034d3" containerName="mariadb-account-create-update" Mar 20 09:05:24.149215 master-0 kubenswrapper[18707]: I0320 09:05:24.142880 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b99a3a-0f6d-4c4d-a71a-940816b034d3" containerName="mariadb-account-create-update" Mar 20 09:05:24.149215 master-0 kubenswrapper[18707]: E0320 09:05:24.142899 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a3ea60-7016-443c-9fb2-aefe2bd1ee89" containerName="init" Mar 20 09:05:24.149215 master-0 kubenswrapper[18707]: I0320 09:05:24.142905 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a3ea60-7016-443c-9fb2-aefe2bd1ee89" containerName="init" Mar 20 09:05:24.149215 master-0 kubenswrapper[18707]: E0320 09:05:24.142929 18707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a3ea60-7016-443c-9fb2-aefe2bd1ee89" containerName="dnsmasq-dns" Mar 20 09:05:24.149215 master-0 kubenswrapper[18707]: I0320 09:05:24.142935 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a3ea60-7016-443c-9fb2-aefe2bd1ee89" containerName="dnsmasq-dns" Mar 20 09:05:24.149215 master-0 kubenswrapper[18707]: I0320 09:05:24.143179 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a3ea60-7016-443c-9fb2-aefe2bd1ee89" containerName="dnsmasq-dns" Mar 20 09:05:24.149215 master-0 kubenswrapper[18707]: I0320 09:05:24.143213 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b99a3a-0f6d-4c4d-a71a-940816b034d3" containerName="mariadb-account-create-update" Mar 20 09:05:24.149215 master-0 kubenswrapper[18707]: I0320 09:05:24.144531 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-54pxn"] Mar 20 09:05:24.149215 master-0 kubenswrapper[18707]: I0320 09:05:24.144934 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-54pxn" Mar 20 09:05:24.149215 master-0 kubenswrapper[18707]: I0320 09:05:24.148698 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 09:05:24.207622 master-0 kubenswrapper[18707]: I0320 09:05:24.207415 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqkj2\" (UniqueName: \"kubernetes.io/projected/a3520eae-5608-4e22-9d61-f7c0354029f9-kube-api-access-xqkj2\") pod \"root-account-create-update-54pxn\" (UID: \"a3520eae-5608-4e22-9d61-f7c0354029f9\") " pod="openstack/root-account-create-update-54pxn" Mar 20 09:05:24.207923 master-0 kubenswrapper[18707]: I0320 09:05:24.207739 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3520eae-5608-4e22-9d61-f7c0354029f9-operator-scripts\") pod \"root-account-create-update-54pxn\" (UID: \"a3520eae-5608-4e22-9d61-f7c0354029f9\") " pod="openstack/root-account-create-update-54pxn" Mar 20 09:05:24.314331 master-0 kubenswrapper[18707]: I0320 09:05:24.309971 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqkj2\" (UniqueName: \"kubernetes.io/projected/a3520eae-5608-4e22-9d61-f7c0354029f9-kube-api-access-xqkj2\") pod \"root-account-create-update-54pxn\" (UID: \"a3520eae-5608-4e22-9d61-f7c0354029f9\") " pod="openstack/root-account-create-update-54pxn" Mar 20 09:05:24.314331 master-0 kubenswrapper[18707]: I0320 09:05:24.310033 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3520eae-5608-4e22-9d61-f7c0354029f9-operator-scripts\") pod \"root-account-create-update-54pxn\" (UID: \"a3520eae-5608-4e22-9d61-f7c0354029f9\") " pod="openstack/root-account-create-update-54pxn" Mar 20 
09:05:24.314331 master-0 kubenswrapper[18707]: I0320 09:05:24.310853 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3520eae-5608-4e22-9d61-f7c0354029f9-operator-scripts\") pod \"root-account-create-update-54pxn\" (UID: \"a3520eae-5608-4e22-9d61-f7c0354029f9\") " pod="openstack/root-account-create-update-54pxn" Mar 20 09:05:24.333284 master-0 kubenswrapper[18707]: I0320 09:05:24.329857 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqkj2\" (UniqueName: \"kubernetes.io/projected/a3520eae-5608-4e22-9d61-f7c0354029f9-kube-api-access-xqkj2\") pod \"root-account-create-update-54pxn\" (UID: \"a3520eae-5608-4e22-9d61-f7c0354029f9\") " pod="openstack/root-account-create-update-54pxn" Mar 20 09:05:24.495006 master-0 kubenswrapper[18707]: I0320 09:05:24.494868 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-54pxn" Mar 20 09:05:25.051205 master-0 kubenswrapper[18707]: I0320 09:05:25.050048 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-54pxn"] Mar 20 09:05:25.312012 master-0 kubenswrapper[18707]: I0320 09:05:25.311941 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-54pxn" event={"ID":"a3520eae-5608-4e22-9d61-f7c0354029f9","Type":"ContainerStarted","Data":"2683bb71581f4bac53a3e5cfa71a8a6fc713dacb8ebfb502a6d2471a9b2714f1"} Mar 20 09:05:25.312012 master-0 kubenswrapper[18707]: I0320 09:05:25.312007 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-54pxn" event={"ID":"a3520eae-5608-4e22-9d61-f7c0354029f9","Type":"ContainerStarted","Data":"756dfcdacbe80c9b4b64a43792bc76ae7644cb3a3a8710e78fdc1decc8d384b3"} Mar 20 09:05:25.342549 master-0 kubenswrapper[18707]: I0320 09:05:25.342447 18707 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/root-account-create-update-54pxn" podStartSLOduration=1.342424697 podStartE2EDuration="1.342424697s" podCreationTimestamp="2026-03-20 09:05:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:25.330223358 +0000 UTC m=+1470.486403724" watchObservedRunningTime="2026-03-20 09:05:25.342424697 +0000 UTC m=+1470.498605053" Mar 20 09:05:25.890968 master-0 kubenswrapper[18707]: E0320 09:05:25.890917 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae0d0efd_e9bb_4546_8606_82af8296bab1.slice/crio-46ed3721e06910055a152852f40f3a63311fe7e5ec22fe6e6248404148f81795.scope\": RecentStats: unable to find data in memory cache]" Mar 20 09:05:26.323382 master-0 kubenswrapper[18707]: I0320 09:05:26.323327 18707 generic.go:334] "Generic (PLEG): container finished" podID="a3520eae-5608-4e22-9d61-f7c0354029f9" containerID="2683bb71581f4bac53a3e5cfa71a8a6fc713dacb8ebfb502a6d2471a9b2714f1" exitCode=0 Mar 20 09:05:26.323382 master-0 kubenswrapper[18707]: I0320 09:05:26.323380 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-54pxn" event={"ID":"a3520eae-5608-4e22-9d61-f7c0354029f9","Type":"ContainerDied","Data":"2683bb71581f4bac53a3e5cfa71a8a6fc713dacb8ebfb502a6d2471a9b2714f1"} Mar 20 09:05:26.977521 master-0 kubenswrapper[18707]: I0320 09:05:26.977416 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:26.983906 master-0 kubenswrapper[18707]: I0320 09:05:26.983855 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d537130-a515-4e6e-aedb-89a848cd477a-etc-swift\") pod \"swift-storage-0\" (UID: \"4d537130-a515-4e6e-aedb-89a848cd477a\") " pod="openstack/swift-storage-0" Mar 20 09:05:27.065851 master-0 kubenswrapper[18707]: I0320 09:05:27.065792 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-q5t48" podUID="b75e9b7b-6504-4904-96af-66385e6649e4" containerName="ovn-controller" probeResult="failure" output=< Mar 20 09:05:27.065851 master-0 kubenswrapper[18707]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 09:05:27.065851 master-0 kubenswrapper[18707]: > Mar 20 09:05:27.106932 master-0 kubenswrapper[18707]: I0320 09:05:27.106374 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 09:05:27.180279 master-0 kubenswrapper[18707]: I0320 09:05:27.180153 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:05:27.215603 master-0 kubenswrapper[18707]: I0320 09:05:27.215545 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hz4j4" Mar 20 09:05:29.059780 master-0 kubenswrapper[18707]: I0320 09:05:29.059710 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-q5t48-config-bfd5h"] Mar 20 09:05:29.061717 master-0 kubenswrapper[18707]: I0320 09:05:29.061687 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.065334 master-0 kubenswrapper[18707]: I0320 09:05:29.064779 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 09:05:29.072842 master-0 kubenswrapper[18707]: I0320 09:05:29.072778 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q5t48-config-bfd5h"] Mar 20 09:05:29.229119 master-0 kubenswrapper[18707]: I0320 09:05:29.227844 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-run\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.229119 master-0 kubenswrapper[18707]: I0320 09:05:29.228001 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-log-ovn\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.229119 master-0 kubenswrapper[18707]: I0320 09:05:29.228043 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/259b1015-26d1-4fe3-abbd-b32997097476-additional-scripts\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.229119 master-0 kubenswrapper[18707]: I0320 09:05:29.228170 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjm4l\" (UniqueName: 
\"kubernetes.io/projected/259b1015-26d1-4fe3-abbd-b32997097476-kube-api-access-gjm4l\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.229119 master-0 kubenswrapper[18707]: I0320 09:05:29.228253 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/259b1015-26d1-4fe3-abbd-b32997097476-scripts\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.229119 master-0 kubenswrapper[18707]: I0320 09:05:29.228380 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-run-ovn\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.331229 master-0 kubenswrapper[18707]: I0320 09:05:29.331066 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-run\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.331229 master-0 kubenswrapper[18707]: I0320 09:05:29.331147 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-run\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.331491 master-0 kubenswrapper[18707]: I0320 09:05:29.331282 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-log-ovn\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.331491 master-0 kubenswrapper[18707]: I0320 09:05:29.331372 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-log-ovn\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.331491 master-0 kubenswrapper[18707]: I0320 09:05:29.331436 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/259b1015-26d1-4fe3-abbd-b32997097476-additional-scripts\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.338003 master-0 kubenswrapper[18707]: I0320 09:05:29.337645 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjm4l\" (UniqueName: \"kubernetes.io/projected/259b1015-26d1-4fe3-abbd-b32997097476-kube-api-access-gjm4l\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.338003 master-0 kubenswrapper[18707]: I0320 09:05:29.337734 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/259b1015-26d1-4fe3-abbd-b32997097476-scripts\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.338003 master-0 kubenswrapper[18707]: 
I0320 09:05:29.337766 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/259b1015-26d1-4fe3-abbd-b32997097476-additional-scripts\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.338003 master-0 kubenswrapper[18707]: I0320 09:05:29.337830 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-run-ovn\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.338003 master-0 kubenswrapper[18707]: I0320 09:05:29.337921 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-run-ovn\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.339745 master-0 kubenswrapper[18707]: I0320 09:05:29.339713 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/259b1015-26d1-4fe3-abbd-b32997097476-scripts\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.368103 master-0 kubenswrapper[18707]: I0320 09:05:29.359771 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjm4l\" (UniqueName: \"kubernetes.io/projected/259b1015-26d1-4fe3-abbd-b32997097476-kube-api-access-gjm4l\") pod \"ovn-controller-q5t48-config-bfd5h\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:29.408262 
master-0 kubenswrapper[18707]: I0320 09:05:29.407305 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:32.067968 master-0 kubenswrapper[18707]: I0320 09:05:32.067908 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-q5t48" podUID="b75e9b7b-6504-4904-96af-66385e6649e4" containerName="ovn-controller" probeResult="failure" output=< Mar 20 09:05:32.067968 master-0 kubenswrapper[18707]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 09:05:32.067968 master-0 kubenswrapper[18707]: > Mar 20 09:05:32.425566 master-0 kubenswrapper[18707]: I0320 09:05:32.417394 18707 generic.go:334] "Generic (PLEG): container finished" podID="ce336090-2814-4550-a4c2-dd726e9b6ad2" containerID="99b1b565bb48f20f483733cfb3b051761fc87701e5eeb684b66b85bb100e5cbd" exitCode=0 Mar 20 09:05:32.425566 master-0 kubenswrapper[18707]: I0320 09:05:32.417507 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ce336090-2814-4550-a4c2-dd726e9b6ad2","Type":"ContainerDied","Data":"99b1b565bb48f20f483733cfb3b051761fc87701e5eeb684b66b85bb100e5cbd"} Mar 20 09:05:32.430550 master-0 kubenswrapper[18707]: I0320 09:05:32.429924 18707 generic.go:334] "Generic (PLEG): container finished" podID="211c6c3f-43f8-4ae9-86a1-ca7d393db4e7" containerID="07dca153b771d2fd4ce2f7a87ea4777a2092378eaa23ae39894a407bc0ca0115" exitCode=0 Mar 20 09:05:32.430550 master-0 kubenswrapper[18707]: I0320 09:05:32.429994 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7","Type":"ContainerDied","Data":"07dca153b771d2fd4ce2f7a87ea4777a2092378eaa23ae39894a407bc0ca0115"} Mar 20 09:05:32.442066 master-0 kubenswrapper[18707]: I0320 09:05:32.442021 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-54pxn" event={"ID":"a3520eae-5608-4e22-9d61-f7c0354029f9","Type":"ContainerDied","Data":"756dfcdacbe80c9b4b64a43792bc76ae7644cb3a3a8710e78fdc1decc8d384b3"} Mar 20 09:05:32.442066 master-0 kubenswrapper[18707]: I0320 09:05:32.442066 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="756dfcdacbe80c9b4b64a43792bc76ae7644cb3a3a8710e78fdc1decc8d384b3" Mar 20 09:05:32.462977 master-0 kubenswrapper[18707]: I0320 09:05:32.460831 18707 generic.go:334] "Generic (PLEG): container finished" podID="ae0d0efd-e9bb-4546-8606-82af8296bab1" containerID="46ed3721e06910055a152852f40f3a63311fe7e5ec22fe6e6248404148f81795" exitCode=0 Mar 20 09:05:32.462977 master-0 kubenswrapper[18707]: I0320 09:05:32.460880 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p9cff" event={"ID":"ae0d0efd-e9bb-4546-8606-82af8296bab1","Type":"ContainerDied","Data":"46ed3721e06910055a152852f40f3a63311fe7e5ec22fe6e6248404148f81795"} Mar 20 09:05:32.482560 master-0 kubenswrapper[18707]: I0320 09:05:32.482502 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-54pxn" Mar 20 09:05:32.627372 master-0 kubenswrapper[18707]: I0320 09:05:32.627305 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqkj2\" (UniqueName: \"kubernetes.io/projected/a3520eae-5608-4e22-9d61-f7c0354029f9-kube-api-access-xqkj2\") pod \"a3520eae-5608-4e22-9d61-f7c0354029f9\" (UID: \"a3520eae-5608-4e22-9d61-f7c0354029f9\") " Mar 20 09:05:32.627480 master-0 kubenswrapper[18707]: I0320 09:05:32.627415 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3520eae-5608-4e22-9d61-f7c0354029f9-operator-scripts\") pod \"a3520eae-5608-4e22-9d61-f7c0354029f9\" (UID: \"a3520eae-5608-4e22-9d61-f7c0354029f9\") " Mar 20 09:05:32.628052 master-0 kubenswrapper[18707]: I0320 09:05:32.628000 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3520eae-5608-4e22-9d61-f7c0354029f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3520eae-5608-4e22-9d61-f7c0354029f9" (UID: "a3520eae-5608-4e22-9d61-f7c0354029f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:32.634065 master-0 kubenswrapper[18707]: I0320 09:05:32.633133 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3520eae-5608-4e22-9d61-f7c0354029f9-kube-api-access-xqkj2" (OuterVolumeSpecName: "kube-api-access-xqkj2") pod "a3520eae-5608-4e22-9d61-f7c0354029f9" (UID: "a3520eae-5608-4e22-9d61-f7c0354029f9"). InnerVolumeSpecName "kube-api-access-xqkj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:32.731931 master-0 kubenswrapper[18707]: I0320 09:05:32.730215 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3520eae-5608-4e22-9d61-f7c0354029f9-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:32.731931 master-0 kubenswrapper[18707]: I0320 09:05:32.730261 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqkj2\" (UniqueName: \"kubernetes.io/projected/a3520eae-5608-4e22-9d61-f7c0354029f9-kube-api-access-xqkj2\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:32.802073 master-0 kubenswrapper[18707]: I0320 09:05:32.801995 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 09:05:32.901569 master-0 kubenswrapper[18707]: I0320 09:05:32.901519 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q5t48-config-bfd5h"] Mar 20 09:05:33.476580 master-0 kubenswrapper[18707]: I0320 09:05:33.476497 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-662t6" event={"ID":"dd5f8067-9874-4ff3-a9fb-a3251cc3622d","Type":"ContainerStarted","Data":"4d88c8c581a9c364031cb4d4336c34f4cc762f582c79aac4aa41cc2b4dca5dee"} Mar 20 09:05:33.479150 master-0 kubenswrapper[18707]: I0320 09:05:33.479094 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ce336090-2814-4550-a4c2-dd726e9b6ad2","Type":"ContainerStarted","Data":"3606574f5bab5aa250f786ecb791f7e6676496e8a06b839914a78f1e843667fc"} Mar 20 09:05:33.479632 master-0 kubenswrapper[18707]: I0320 09:05:33.479575 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 09:05:33.487537 master-0 kubenswrapper[18707]: I0320 09:05:33.487450 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"211c6c3f-43f8-4ae9-86a1-ca7d393db4e7","Type":"ContainerStarted","Data":"2f3e7ebaeaaca12f530da63b05f0e3adda34a72153221d01104e55115e8c9f0a"} Mar 20 09:05:33.489247 master-0 kubenswrapper[18707]: I0320 09:05:33.489204 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"a9d255fc544817a499d6e5283972321cb701e198a53fcd0d14fff414030f01bc"} Mar 20 09:05:33.496210 master-0 kubenswrapper[18707]: I0320 09:05:33.495230 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5t48-config-bfd5h" event={"ID":"259b1015-26d1-4fe3-abbd-b32997097476","Type":"ContainerStarted","Data":"9db4bb7612f42841bc0ef5f709b1e5c9ca169b392a89638c8e9ae654d3b7eeb3"} Mar 20 09:05:33.496210 master-0 kubenswrapper[18707]: I0320 09:05:33.495304 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5t48-config-bfd5h" event={"ID":"259b1015-26d1-4fe3-abbd-b32997097476","Type":"ContainerStarted","Data":"cc2da6b2580afe1ab2c0cc7de67797bf968f6927e5de55a724db7e753d7850ab"} Mar 20 09:05:33.496210 master-0 kubenswrapper[18707]: I0320 09:05:33.495392 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-54pxn" Mar 20 09:05:33.521451 master-0 kubenswrapper[18707]: I0320 09:05:33.517793 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-662t6" podStartSLOduration=3.317006591 podStartE2EDuration="16.517768686s" podCreationTimestamp="2026-03-20 09:05:17 +0000 UTC" firstStartedPulling="2026-03-20 09:05:19.144961038 +0000 UTC m=+1464.301141394" lastFinishedPulling="2026-03-20 09:05:32.345723123 +0000 UTC m=+1477.501903489" observedRunningTime="2026-03-20 09:05:33.496775916 +0000 UTC m=+1478.652956282" watchObservedRunningTime="2026-03-20 09:05:33.517768686 +0000 UTC m=+1478.673949042" Mar 20 09:05:33.546020 master-0 kubenswrapper[18707]: I0320 09:05:33.544960 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=59.000768219 podStartE2EDuration="1m11.544938042s" podCreationTimestamp="2026-03-20 09:04:22 +0000 UTC" firstStartedPulling="2026-03-20 09:04:40.794374621 +0000 UTC m=+1425.950554977" lastFinishedPulling="2026-03-20 09:04:53.338544434 +0000 UTC m=+1438.494724800" observedRunningTime="2026-03-20 09:05:33.541157204 +0000 UTC m=+1478.697337590" watchObservedRunningTime="2026-03-20 09:05:33.544938042 +0000 UTC m=+1478.701118418" Mar 20 09:05:33.602212 master-0 kubenswrapper[18707]: I0320 09:05:33.594951 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-q5t48-config-bfd5h" podStartSLOduration=4.594930671 podStartE2EDuration="4.594930671s" podCreationTimestamp="2026-03-20 09:05:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:33.565366426 +0000 UTC m=+1478.721546782" watchObservedRunningTime="2026-03-20 09:05:33.594930671 +0000 UTC m=+1478.751111027" Mar 20 09:05:33.631893 master-0 kubenswrapper[18707]: I0320 
09:05:33.631793 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=59.862315543 podStartE2EDuration="1m12.631773803s" podCreationTimestamp="2026-03-20 09:04:21 +0000 UTC" firstStartedPulling="2026-03-20 09:04:40.488429889 +0000 UTC m=+1425.644610255" lastFinishedPulling="2026-03-20 09:04:53.257888159 +0000 UTC m=+1438.414068515" observedRunningTime="2026-03-20 09:05:33.60227793 +0000 UTC m=+1478.758458286" watchObservedRunningTime="2026-03-20 09:05:33.631773803 +0000 UTC m=+1478.787954159" Mar 20 09:05:34.458539 master-0 kubenswrapper[18707]: I0320 09:05:34.458494 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p9cff" Mar 20 09:05:34.505928 master-0 kubenswrapper[18707]: I0320 09:05:34.505801 18707 generic.go:334] "Generic (PLEG): container finished" podID="259b1015-26d1-4fe3-abbd-b32997097476" containerID="9db4bb7612f42841bc0ef5f709b1e5c9ca169b392a89638c8e9ae654d3b7eeb3" exitCode=0 Mar 20 09:05:34.505928 master-0 kubenswrapper[18707]: I0320 09:05:34.505898 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5t48-config-bfd5h" event={"ID":"259b1015-26d1-4fe3-abbd-b32997097476","Type":"ContainerDied","Data":"9db4bb7612f42841bc0ef5f709b1e5c9ca169b392a89638c8e9ae654d3b7eeb3"} Mar 20 09:05:34.508200 master-0 kubenswrapper[18707]: I0320 09:05:34.508136 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p9cff" event={"ID":"ae0d0efd-e9bb-4546-8606-82af8296bab1","Type":"ContainerDied","Data":"f62e2073ab6c228a8186e030940fe47e27fe057145f53218711cb98a9a92eea2"} Mar 20 09:05:34.508200 master-0 kubenswrapper[18707]: I0320 09:05:34.508202 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f62e2073ab6c228a8186e030940fe47e27fe057145f53218711cb98a9a92eea2" Mar 20 09:05:34.508359 master-0 kubenswrapper[18707]: I0320 
09:05:34.508252 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p9cff" Mar 20 09:05:34.536255 master-0 kubenswrapper[18707]: I0320 09:05:34.534732 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cgrc\" (UniqueName: \"kubernetes.io/projected/ae0d0efd-e9bb-4546-8606-82af8296bab1-kube-api-access-7cgrc\") pod \"ae0d0efd-e9bb-4546-8606-82af8296bab1\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " Mar 20 09:05:34.536255 master-0 kubenswrapper[18707]: I0320 09:05:34.534862 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-swiftconf\") pod \"ae0d0efd-e9bb-4546-8606-82af8296bab1\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " Mar 20 09:05:34.536255 master-0 kubenswrapper[18707]: I0320 09:05:34.534974 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae0d0efd-e9bb-4546-8606-82af8296bab1-etc-swift\") pod \"ae0d0efd-e9bb-4546-8606-82af8296bab1\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " Mar 20 09:05:34.536255 master-0 kubenswrapper[18707]: I0320 09:05:34.535026 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae0d0efd-e9bb-4546-8606-82af8296bab1-ring-data-devices\") pod \"ae0d0efd-e9bb-4546-8606-82af8296bab1\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " Mar 20 09:05:34.536255 master-0 kubenswrapper[18707]: I0320 09:05:34.535152 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-combined-ca-bundle\") pod \"ae0d0efd-e9bb-4546-8606-82af8296bab1\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " Mar 20 
09:05:34.536255 master-0 kubenswrapper[18707]: I0320 09:05:34.535217 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-dispersionconf\") pod \"ae0d0efd-e9bb-4546-8606-82af8296bab1\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " Mar 20 09:05:34.536255 master-0 kubenswrapper[18707]: I0320 09:05:34.535367 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae0d0efd-e9bb-4546-8606-82af8296bab1-scripts\") pod \"ae0d0efd-e9bb-4546-8606-82af8296bab1\" (UID: \"ae0d0efd-e9bb-4546-8606-82af8296bab1\") " Mar 20 09:05:34.536719 master-0 kubenswrapper[18707]: I0320 09:05:34.536325 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae0d0efd-e9bb-4546-8606-82af8296bab1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ae0d0efd-e9bb-4546-8606-82af8296bab1" (UID: "ae0d0efd-e9bb-4546-8606-82af8296bab1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:05:34.542213 master-0 kubenswrapper[18707]: I0320 09:05:34.539907 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae0d0efd-e9bb-4546-8606-82af8296bab1-kube-api-access-7cgrc" (OuterVolumeSpecName: "kube-api-access-7cgrc") pod "ae0d0efd-e9bb-4546-8606-82af8296bab1" (UID: "ae0d0efd-e9bb-4546-8606-82af8296bab1"). InnerVolumeSpecName "kube-api-access-7cgrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:34.542213 master-0 kubenswrapper[18707]: I0320 09:05:34.540289 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae0d0efd-e9bb-4546-8606-82af8296bab1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ae0d0efd-e9bb-4546-8606-82af8296bab1" (UID: "ae0d0efd-e9bb-4546-8606-82af8296bab1"). 
InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:34.543265 master-0 kubenswrapper[18707]: I0320 09:05:34.542885 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ae0d0efd-e9bb-4546-8606-82af8296bab1" (UID: "ae0d0efd-e9bb-4546-8606-82af8296bab1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:05:34.565935 master-0 kubenswrapper[18707]: I0320 09:05:34.565875 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae0d0efd-e9bb-4546-8606-82af8296bab1-scripts" (OuterVolumeSpecName: "scripts") pod "ae0d0efd-e9bb-4546-8606-82af8296bab1" (UID: "ae0d0efd-e9bb-4546-8606-82af8296bab1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:34.567032 master-0 kubenswrapper[18707]: I0320 09:05:34.566995 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ae0d0efd-e9bb-4546-8606-82af8296bab1" (UID: "ae0d0efd-e9bb-4546-8606-82af8296bab1"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:05:34.599762 master-0 kubenswrapper[18707]: I0320 09:05:34.599421 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae0d0efd-e9bb-4546-8606-82af8296bab1" (UID: "ae0d0efd-e9bb-4546-8606-82af8296bab1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:05:34.638581 master-0 kubenswrapper[18707]: I0320 09:05:34.637965 18707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-dispersionconf\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:34.638581 master-0 kubenswrapper[18707]: I0320 09:05:34.638020 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae0d0efd-e9bb-4546-8606-82af8296bab1-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:34.638581 master-0 kubenswrapper[18707]: I0320 09:05:34.638037 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cgrc\" (UniqueName: \"kubernetes.io/projected/ae0d0efd-e9bb-4546-8606-82af8296bab1-kube-api-access-7cgrc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:34.638581 master-0 kubenswrapper[18707]: I0320 09:05:34.638049 18707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-swiftconf\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:34.638581 master-0 kubenswrapper[18707]: I0320 09:05:34.638064 18707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ae0d0efd-e9bb-4546-8606-82af8296bab1-etc-swift\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:34.638581 master-0 kubenswrapper[18707]: I0320 09:05:34.638077 18707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ae0d0efd-e9bb-4546-8606-82af8296bab1-ring-data-devices\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:34.638581 master-0 kubenswrapper[18707]: I0320 09:05:34.638088 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ae0d0efd-e9bb-4546-8606-82af8296bab1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:34.906440 master-0 kubenswrapper[18707]: I0320 09:05:34.906380 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-54pxn"] Mar 20 09:05:34.923605 master-0 kubenswrapper[18707]: I0320 09:05:34.922969 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-54pxn"] Mar 20 09:05:35.117940 master-0 kubenswrapper[18707]: I0320 09:05:35.117887 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3520eae-5608-4e22-9d61-f7c0354029f9" path="/var/lib/kubelet/pods/a3520eae-5608-4e22-9d61-f7c0354029f9/volumes" Mar 20 09:05:35.523019 master-0 kubenswrapper[18707]: I0320 09:05:35.522882 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"00c5fc0d3bf2a84c9a9ca6d1061e28a505d07058938cae398e50b23501ee072b"} Mar 20 09:05:35.523019 master-0 kubenswrapper[18707]: I0320 09:05:35.522948 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"88c96d05037791a6d1a4739a311d39df597c681d40116454b09650c1649541a7"} Mar 20 09:05:35.523019 master-0 kubenswrapper[18707]: I0320 09:05:35.522960 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"6221cc29a56bd731438d249fec4296c62809d4ccbed4a59f4b29f11d365f139d"} Mar 20 09:05:35.523019 master-0 kubenswrapper[18707]: I0320 09:05:35.522969 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"46e86f8a56dbf793092176fcc7d2b07554eca3aea03d2893022e5425031349d0"} Mar 20 09:05:35.942803 master-0 kubenswrapper[18707]: I0320 09:05:35.942712 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:36.077253 master-0 kubenswrapper[18707]: I0320 09:05:36.075496 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/259b1015-26d1-4fe3-abbd-b32997097476-additional-scripts\") pod \"259b1015-26d1-4fe3-abbd-b32997097476\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " Mar 20 09:05:36.077253 master-0 kubenswrapper[18707]: I0320 09:05:36.075542 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/259b1015-26d1-4fe3-abbd-b32997097476-scripts\") pod \"259b1015-26d1-4fe3-abbd-b32997097476\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " Mar 20 09:05:36.077253 master-0 kubenswrapper[18707]: I0320 09:05:36.075573 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjm4l\" (UniqueName: \"kubernetes.io/projected/259b1015-26d1-4fe3-abbd-b32997097476-kube-api-access-gjm4l\") pod \"259b1015-26d1-4fe3-abbd-b32997097476\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " Mar 20 09:05:36.077253 master-0 kubenswrapper[18707]: I0320 09:05:36.075623 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-log-ovn\") pod \"259b1015-26d1-4fe3-abbd-b32997097476\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " Mar 20 09:05:36.077253 master-0 kubenswrapper[18707]: I0320 09:05:36.075685 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-run\") pod \"259b1015-26d1-4fe3-abbd-b32997097476\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " Mar 20 09:05:36.077253 master-0 kubenswrapper[18707]: I0320 09:05:36.075866 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-run-ovn\") pod \"259b1015-26d1-4fe3-abbd-b32997097476\" (UID: \"259b1015-26d1-4fe3-abbd-b32997097476\") " Mar 20 09:05:36.077253 master-0 kubenswrapper[18707]: I0320 09:05:36.076209 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/259b1015-26d1-4fe3-abbd-b32997097476-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "259b1015-26d1-4fe3-abbd-b32997097476" (UID: "259b1015-26d1-4fe3-abbd-b32997097476"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:36.077253 master-0 kubenswrapper[18707]: I0320 09:05:36.076267 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "259b1015-26d1-4fe3-abbd-b32997097476" (UID: "259b1015-26d1-4fe3-abbd-b32997097476"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:05:36.077253 master-0 kubenswrapper[18707]: I0320 09:05:36.076901 18707 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:36.077253 master-0 kubenswrapper[18707]: I0320 09:05:36.076910 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/259b1015-26d1-4fe3-abbd-b32997097476-scripts" (OuterVolumeSpecName: "scripts") pod "259b1015-26d1-4fe3-abbd-b32997097476" (UID: "259b1015-26d1-4fe3-abbd-b32997097476"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:36.077253 master-0 kubenswrapper[18707]: I0320 09:05:36.076926 18707 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/259b1015-26d1-4fe3-abbd-b32997097476-additional-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:36.077253 master-0 kubenswrapper[18707]: I0320 09:05:36.076962 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-run" (OuterVolumeSpecName: "var-run") pod "259b1015-26d1-4fe3-abbd-b32997097476" (UID: "259b1015-26d1-4fe3-abbd-b32997097476"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:05:36.077253 master-0 kubenswrapper[18707]: I0320 09:05:36.076997 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "259b1015-26d1-4fe3-abbd-b32997097476" (UID: "259b1015-26d1-4fe3-abbd-b32997097476"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:05:36.085603 master-0 kubenswrapper[18707]: I0320 09:05:36.085505 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/259b1015-26d1-4fe3-abbd-b32997097476-kube-api-access-gjm4l" (OuterVolumeSpecName: "kube-api-access-gjm4l") pod "259b1015-26d1-4fe3-abbd-b32997097476" (UID: "259b1015-26d1-4fe3-abbd-b32997097476"). InnerVolumeSpecName "kube-api-access-gjm4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:36.179395 master-0 kubenswrapper[18707]: I0320 09:05:36.179333 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/259b1015-26d1-4fe3-abbd-b32997097476-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:36.179622 master-0 kubenswrapper[18707]: I0320 09:05:36.179411 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjm4l\" (UniqueName: \"kubernetes.io/projected/259b1015-26d1-4fe3-abbd-b32997097476-kube-api-access-gjm4l\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:36.179622 master-0 kubenswrapper[18707]: I0320 09:05:36.179427 18707 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:36.179622 master-0 kubenswrapper[18707]: I0320 09:05:36.179437 18707 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/259b1015-26d1-4fe3-abbd-b32997097476-var-run\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:36.542128 master-0 kubenswrapper[18707]: I0320 09:05:36.542067 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5t48-config-bfd5h" event={"ID":"259b1015-26d1-4fe3-abbd-b32997097476","Type":"ContainerDied","Data":"cc2da6b2580afe1ab2c0cc7de67797bf968f6927e5de55a724db7e753d7850ab"} Mar 20 
09:05:36.542128 master-0 kubenswrapper[18707]: I0320 09:05:36.542122 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc2da6b2580afe1ab2c0cc7de67797bf968f6927e5de55a724db7e753d7850ab" Mar 20 09:05:36.542619 master-0 kubenswrapper[18707]: I0320 09:05:36.542217 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5t48-config-bfd5h" Mar 20 09:05:38.260815 master-0 kubenswrapper[18707]: I0320 09:05:38.260762 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-q5t48" Mar 20 09:05:38.325962 master-0 kubenswrapper[18707]: I0320 09:05:38.324495 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"b646a0f484431730cdb0e79b30f4877b3e5c619012fd6f64a4e8504e50658313"} Mar 20 09:05:38.480573 master-0 kubenswrapper[18707]: I0320 09:05:38.480498 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-q5t48-config-bfd5h"] Mar 20 09:05:38.514335 master-0 kubenswrapper[18707]: I0320 09:05:38.512722 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-q5t48-config-bfd5h"] Mar 20 09:05:38.676013 master-0 kubenswrapper[18707]: I0320 09:05:38.675942 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-q5t48-config-944s7"] Mar 20 09:05:38.676670 master-0 kubenswrapper[18707]: E0320 09:05:38.676637 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259b1015-26d1-4fe3-abbd-b32997097476" containerName="ovn-config" Mar 20 09:05:38.676670 master-0 kubenswrapper[18707]: I0320 09:05:38.676665 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="259b1015-26d1-4fe3-abbd-b32997097476" containerName="ovn-config" Mar 20 09:05:38.676795 master-0 kubenswrapper[18707]: E0320 09:05:38.676709 18707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3520eae-5608-4e22-9d61-f7c0354029f9" containerName="mariadb-account-create-update" Mar 20 09:05:38.676795 master-0 kubenswrapper[18707]: I0320 09:05:38.676720 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3520eae-5608-4e22-9d61-f7c0354029f9" containerName="mariadb-account-create-update" Mar 20 09:05:38.676795 master-0 kubenswrapper[18707]: E0320 09:05:38.676766 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae0d0efd-e9bb-4546-8606-82af8296bab1" containerName="swift-ring-rebalance" Mar 20 09:05:38.676795 master-0 kubenswrapper[18707]: I0320 09:05:38.676777 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae0d0efd-e9bb-4546-8606-82af8296bab1" containerName="swift-ring-rebalance" Mar 20 09:05:38.677137 master-0 kubenswrapper[18707]: I0320 09:05:38.677109 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="259b1015-26d1-4fe3-abbd-b32997097476" containerName="ovn-config" Mar 20 09:05:38.677236 master-0 kubenswrapper[18707]: I0320 09:05:38.677211 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae0d0efd-e9bb-4546-8606-82af8296bab1" containerName="swift-ring-rebalance" Mar 20 09:05:38.677236 master-0 kubenswrapper[18707]: I0320 09:05:38.677233 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3520eae-5608-4e22-9d61-f7c0354029f9" containerName="mariadb-account-create-update" Mar 20 09:05:38.678217 master-0 kubenswrapper[18707]: I0320 09:05:38.678151 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.684926 master-0 kubenswrapper[18707]: I0320 09:05:38.684805 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 09:05:38.713247 master-0 kubenswrapper[18707]: I0320 09:05:38.712812 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q5t48-config-944s7"] Mar 20 09:05:38.775287 master-0 kubenswrapper[18707]: I0320 09:05:38.774432 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-log-ovn\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.775287 master-0 kubenswrapper[18707]: I0320 09:05:38.774490 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-scripts\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.775287 master-0 kubenswrapper[18707]: I0320 09:05:38.774609 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24pkc\" (UniqueName: \"kubernetes.io/projected/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-kube-api-access-24pkc\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.775287 master-0 kubenswrapper[18707]: I0320 09:05:38.774679 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-run-ovn\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.775287 master-0 kubenswrapper[18707]: I0320 09:05:38.774782 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-run\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.775287 master-0 kubenswrapper[18707]: I0320 09:05:38.774820 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-additional-scripts\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.877836 master-0 kubenswrapper[18707]: I0320 09:05:38.877666 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-run\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.878086 master-0 kubenswrapper[18707]: I0320 09:05:38.877945 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-additional-scripts\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.878086 master-0 kubenswrapper[18707]: I0320 09:05:38.877983 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-run\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.878086 master-0 kubenswrapper[18707]: I0320 09:05:38.878038 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-scripts\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.878086 master-0 kubenswrapper[18707]: I0320 09:05:38.878062 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-log-ovn\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.878288 master-0 kubenswrapper[18707]: I0320 09:05:38.878153 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24pkc\" (UniqueName: \"kubernetes.io/projected/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-kube-api-access-24pkc\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.878288 master-0 kubenswrapper[18707]: I0320 09:05:38.878220 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-run-ovn\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.878288 master-0 
kubenswrapper[18707]: I0320 09:05:38.878224 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-log-ovn\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.878419 master-0 kubenswrapper[18707]: I0320 09:05:38.878313 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-run-ovn\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.879112 master-0 kubenswrapper[18707]: I0320 09:05:38.878687 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-additional-scripts\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.880971 master-0 kubenswrapper[18707]: I0320 09:05:38.880924 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-scripts\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:38.900250 master-0 kubenswrapper[18707]: I0320 09:05:38.900113 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24pkc\" (UniqueName: \"kubernetes.io/projected/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-kube-api-access-24pkc\") pod \"ovn-controller-q5t48-config-944s7\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") " pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 
09:05:39.104951 master-0 kubenswrapper[18707]: I0320 09:05:39.104894 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="259b1015-26d1-4fe3-abbd-b32997097476" path="/var/lib/kubelet/pods/259b1015-26d1-4fe3-abbd-b32997097476/volumes" Mar 20 09:05:39.146861 master-0 kubenswrapper[18707]: I0320 09:05:39.146741 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5t48-config-944s7" Mar 20 09:05:39.371864 master-0 kubenswrapper[18707]: I0320 09:05:39.371801 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"e4758736449d9d54a89b765ee13dec32339e1cb31baf701cd5211feb30e141cf"} Mar 20 09:05:39.371864 master-0 kubenswrapper[18707]: I0320 09:05:39.371866 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"cc400ce7f8672381f42d0f58b7f45b7070b3fac0d110f8c44a483bd7d51de064"} Mar 20 09:05:39.372375 master-0 kubenswrapper[18707]: I0320 09:05:39.371880 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"6a4b0fc7e1cb6ba5396bcc25b603c990b1d17e2b997280ef75d5c824785c12be"} Mar 20 09:05:39.619154 master-0 kubenswrapper[18707]: I0320 09:05:39.613881 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c8447fc9-wqvmw"] Mar 20 09:05:39.619154 master-0 kubenswrapper[18707]: I0320 09:05:39.618769 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.629080 master-0 kubenswrapper[18707]: I0320 09:05:39.628997 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"edpm-a" Mar 20 09:05:39.631805 master-0 kubenswrapper[18707]: I0320 09:05:39.631663 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8447fc9-wqvmw"] Mar 20 09:05:39.664467 master-0 kubenswrapper[18707]: I0320 09:05:39.660010 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:05:39.805964 master-0 kubenswrapper[18707]: W0320 09:05:39.805893 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9ab5c43_515e_4d85_95df_ad62b13ab3a3.slice/crio-0b056a38fb46457b260d609e75e9222dbd98079cebb62b8303b6df19517a4ce4 WatchSource:0}: Error finding container 0b056a38fb46457b260d609e75e9222dbd98079cebb62b8303b6df19517a4ce4: Status 404 returned error can't find the container with id 0b056a38fb46457b260d609e75e9222dbd98079cebb62b8303b6df19517a4ce4 Mar 20 09:05:39.813216 master-0 kubenswrapper[18707]: I0320 09:05:39.813060 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q5t48-config-944s7"] Mar 20 09:05:39.824157 master-0 kubenswrapper[18707]: I0320 09:05:39.824009 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx9lc\" (UniqueName: \"kubernetes.io/projected/430cd83b-83e7-42ff-a0d0-c36c85ac0473-kube-api-access-bx9lc\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.824594 master-0 kubenswrapper[18707]: I0320 09:05:39.824564 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.825017 master-0 kubenswrapper[18707]: I0320 09:05:39.824984 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-dns-svc\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.825070 master-0 kubenswrapper[18707]: I0320 09:05:39.825054 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-edpm-a\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.825210 master-0 kubenswrapper[18707]: I0320 09:05:39.825174 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-config\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.825323 master-0 kubenswrapper[18707]: I0320 09:05:39.825306 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.930679 master-0 kubenswrapper[18707]: I0320 09:05:39.930606 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.930872 master-0 kubenswrapper[18707]: I0320 09:05:39.930780 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-dns-svc\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.930872 master-0 kubenswrapper[18707]: I0320 09:05:39.930819 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-edpm-a\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.930939 master-0 kubenswrapper[18707]: I0320 09:05:39.930891 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-config\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.931016 master-0 kubenswrapper[18707]: I0320 09:05:39.930983 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.931172 master-0 kubenswrapper[18707]: I0320 09:05:39.931143 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx9lc\" 
(UniqueName: \"kubernetes.io/projected/430cd83b-83e7-42ff-a0d0-c36c85ac0473-kube-api-access-bx9lc\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.933370 master-0 kubenswrapper[18707]: I0320 09:05:39.933330 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-edpm-a\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.933490 master-0 kubenswrapper[18707]: I0320 09:05:39.933458 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.934079 master-0 kubenswrapper[18707]: I0320 09:05:39.934046 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-dns-svc\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.935020 master-0 kubenswrapper[18707]: I0320 09:05:39.934991 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-config\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:39.935401 master-0 kubenswrapper[18707]: I0320 09:05:39.935366 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw"
Mar 20 09:05:39.968009 master-0 kubenswrapper[18707]: I0320 09:05:39.965941 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ndh4m"]
Mar 20 09:05:39.968009 master-0 kubenswrapper[18707]: I0320 09:05:39.967452 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ndh4m"
Mar 20 09:05:39.971582 master-0 kubenswrapper[18707]: I0320 09:05:39.970533 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 20 09:05:39.997222 master-0 kubenswrapper[18707]: I0320 09:05:39.993671 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx9lc\" (UniqueName: \"kubernetes.io/projected/430cd83b-83e7-42ff-a0d0-c36c85ac0473-kube-api-access-bx9lc\") pod \"dnsmasq-dns-6c8447fc9-wqvmw\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw"
Mar 20 09:05:40.006211 master-0 kubenswrapper[18707]: I0320 09:05:40.004628 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ndh4m"]
Mar 20 09:05:40.136981 master-0 kubenswrapper[18707]: I0320 09:05:40.135322 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fqw8\" (UniqueName: \"kubernetes.io/projected/c8751adb-f918-4ff0-b087-5ced3219f41a-kube-api-access-5fqw8\") pod \"root-account-create-update-ndh4m\" (UID: \"c8751adb-f918-4ff0-b087-5ced3219f41a\") " pod="openstack/root-account-create-update-ndh4m"
Mar 20 09:05:40.136981 master-0 kubenswrapper[18707]: I0320 09:05:40.135563 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8751adb-f918-4ff0-b087-5ced3219f41a-operator-scripts\") pod \"root-account-create-update-ndh4m\" (UID: \"c8751adb-f918-4ff0-b087-5ced3219f41a\") " pod="openstack/root-account-create-update-ndh4m"
Mar 20 09:05:40.241427 master-0 kubenswrapper[18707]: I0320 09:05:40.237800 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fqw8\" (UniqueName: \"kubernetes.io/projected/c8751adb-f918-4ff0-b087-5ced3219f41a-kube-api-access-5fqw8\") pod \"root-account-create-update-ndh4m\" (UID: \"c8751adb-f918-4ff0-b087-5ced3219f41a\") " pod="openstack/root-account-create-update-ndh4m"
Mar 20 09:05:40.241427 master-0 kubenswrapper[18707]: I0320 09:05:40.237873 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8751adb-f918-4ff0-b087-5ced3219f41a-operator-scripts\") pod \"root-account-create-update-ndh4m\" (UID: \"c8751adb-f918-4ff0-b087-5ced3219f41a\") " pod="openstack/root-account-create-update-ndh4m"
Mar 20 09:05:40.241427 master-0 kubenswrapper[18707]: I0320 09:05:40.238856 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8751adb-f918-4ff0-b087-5ced3219f41a-operator-scripts\") pod \"root-account-create-update-ndh4m\" (UID: \"c8751adb-f918-4ff0-b087-5ced3219f41a\") " pod="openstack/root-account-create-update-ndh4m"
Mar 20 09:05:40.254492 master-0 kubenswrapper[18707]: I0320 09:05:40.254428 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fqw8\" (UniqueName: \"kubernetes.io/projected/c8751adb-f918-4ff0-b087-5ced3219f41a-kube-api-access-5fqw8\") pod \"root-account-create-update-ndh4m\" (UID: \"c8751adb-f918-4ff0-b087-5ced3219f41a\") " pod="openstack/root-account-create-update-ndh4m"
Mar 20 09:05:40.295207 master-0 kubenswrapper[18707]: I0320 09:05:40.294360 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw"
Mar 20 09:05:40.318274 master-0 kubenswrapper[18707]: I0320 09:05:40.318207 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ndh4m"
Mar 20 09:05:40.383846 master-0 kubenswrapper[18707]: I0320 09:05:40.383780 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5t48-config-944s7" event={"ID":"e9ab5c43-515e-4d85-95df-ad62b13ab3a3","Type":"ContainerStarted","Data":"75255fec69697873217f7a9f3186e9a333342e288a617b7f33cf0db5508b6b0a"}
Mar 20 09:05:40.383846 master-0 kubenswrapper[18707]: I0320 09:05:40.383842 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5t48-config-944s7" event={"ID":"e9ab5c43-515e-4d85-95df-ad62b13ab3a3","Type":"ContainerStarted","Data":"0b056a38fb46457b260d609e75e9222dbd98079cebb62b8303b6df19517a4ce4"}
Mar 20 09:05:40.406074 master-0 kubenswrapper[18707]: I0320 09:05:40.405923 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-q5t48-config-944s7" podStartSLOduration=2.405903131 podStartE2EDuration="2.405903131s" podCreationTimestamp="2026-03-20 09:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:40.404500741 +0000 UTC m=+1485.560681107" watchObservedRunningTime="2026-03-20 09:05:40.405903131 +0000 UTC m=+1485.562083497"
Mar 20 09:05:40.967321 master-0 kubenswrapper[18707]: I0320 09:05:40.963338 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ndh4m"]
Mar 20 09:05:41.066228 master-0 kubenswrapper[18707]: I0320 09:05:41.063155 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8447fc9-wqvmw"]
Mar 20 09:05:41.404323 master-0 kubenswrapper[18707]: I0320 09:05:41.404271 18707 generic.go:334] "Generic (PLEG): container finished" podID="430cd83b-83e7-42ff-a0d0-c36c85ac0473" containerID="904edf5efa123f5dc30a0a1b778dce0d4e4c2dc4ee6ec76dfbe044a6fdd8c2b3" exitCode=0
Mar 20 09:05:41.405508 master-0 kubenswrapper[18707]: I0320 09:05:41.404354 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" event={"ID":"430cd83b-83e7-42ff-a0d0-c36c85ac0473","Type":"ContainerDied","Data":"904edf5efa123f5dc30a0a1b778dce0d4e4c2dc4ee6ec76dfbe044a6fdd8c2b3"}
Mar 20 09:05:41.405508 master-0 kubenswrapper[18707]: I0320 09:05:41.404457 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" event={"ID":"430cd83b-83e7-42ff-a0d0-c36c85ac0473","Type":"ContainerStarted","Data":"a1af2d81c922098267baf75ad45a317de156197f7657625ff96ec2c0734ba0ad"}
Mar 20 09:05:41.411767 master-0 kubenswrapper[18707]: I0320 09:05:41.411730 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ndh4m" event={"ID":"c8751adb-f918-4ff0-b087-5ced3219f41a","Type":"ContainerStarted","Data":"bd79e7edae84856717132d6459c141670b40f5a54976050bc3ff49b70a15251c"}
Mar 20 09:05:41.411897 master-0 kubenswrapper[18707]: I0320 09:05:41.411878 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ndh4m" event={"ID":"c8751adb-f918-4ff0-b087-5ced3219f41a","Type":"ContainerStarted","Data":"7735aaa69572240972c8026aa9b668d44438260cac590edbc031038c7c38231a"}
Mar 20 09:05:41.454863 master-0 kubenswrapper[18707]: I0320 09:05:41.454706 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"2427e8b76ad1eac11c0bced135893647a70de3186e8259a1d1e26c8eb5be3a44"}
Mar 20 09:05:41.454863 master-0 kubenswrapper[18707]: I0320 09:05:41.454799 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"3066496b52cc19d8cfc0a24384e61dc0c8c504e6030ee2186cc839dd33f2c547"}
Mar 20 09:05:41.454863 master-0 kubenswrapper[18707]: I0320 09:05:41.454815 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"d1dcdefb1f69e66d8f4138050735e3c331e80e7fa20ec079e30c59b6e020d650"}
Mar 20 09:05:41.454863 master-0 kubenswrapper[18707]: I0320 09:05:41.454852 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"d95573e01303a0c746b61d69d13d70734c2be84fa0414f3cbfeaebe8ceabbdae"}
Mar 20 09:05:41.458793 master-0 kubenswrapper[18707]: I0320 09:05:41.457453 18707 generic.go:334] "Generic (PLEG): container finished" podID="e9ab5c43-515e-4d85-95df-ad62b13ab3a3" containerID="75255fec69697873217f7a9f3186e9a333342e288a617b7f33cf0db5508b6b0a" exitCode=0
Mar 20 09:05:41.458793 master-0 kubenswrapper[18707]: I0320 09:05:41.457554 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5t48-config-944s7" event={"ID":"e9ab5c43-515e-4d85-95df-ad62b13ab3a3","Type":"ContainerDied","Data":"75255fec69697873217f7a9f3186e9a333342e288a617b7f33cf0db5508b6b0a"}
Mar 20 09:05:41.480535 master-0 kubenswrapper[18707]: I0320 09:05:41.480427 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-ndh4m" podStartSLOduration=2.480407905 podStartE2EDuration="2.480407905s" podCreationTimestamp="2026-03-20 09:05:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:41.467599709 +0000 UTC m=+1486.623780065" watchObservedRunningTime="2026-03-20 09:05:41.480407905 +0000 UTC m=+1486.636588261"
Mar 20 09:05:41.639778 master-0 kubenswrapper[18707]: E0320 09:05:41.638041 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8751adb_f918_4ff0_b087_5ced3219f41a.slice/crio-bd79e7edae84856717132d6459c141670b40f5a54976050bc3ff49b70a15251c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8751adb_f918_4ff0_b087_5ced3219f41a.slice/crio-conmon-bd79e7edae84856717132d6459c141670b40f5a54976050bc3ff49b70a15251c.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 09:05:42.475727 master-0 kubenswrapper[18707]: I0320 09:05:42.475657 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" event={"ID":"430cd83b-83e7-42ff-a0d0-c36c85ac0473","Type":"ContainerStarted","Data":"19fae9d1bbd3ff36f4ea01d43ba7125422581ff0625381ea966a222ef3cabf6a"}
Mar 20 09:05:42.476271 master-0 kubenswrapper[18707]: I0320 09:05:42.475760 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw"
Mar 20 09:05:42.479906 master-0 kubenswrapper[18707]: I0320 09:05:42.478469 18707 generic.go:334] "Generic (PLEG): container finished" podID="c8751adb-f918-4ff0-b087-5ced3219f41a" containerID="bd79e7edae84856717132d6459c141670b40f5a54976050bc3ff49b70a15251c" exitCode=0
Mar 20 09:05:42.479906 master-0 kubenswrapper[18707]: I0320 09:05:42.478554 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ndh4m" event={"ID":"c8751adb-f918-4ff0-b087-5ced3219f41a","Type":"ContainerDied","Data":"bd79e7edae84856717132d6459c141670b40f5a54976050bc3ff49b70a15251c"}
Mar 20 09:05:42.486961 master-0 kubenswrapper[18707]: I0320 09:05:42.485677 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"761c3ad42322cbb41ec5ff444563851630215a9e12dfbe53e3f6fba30ac0cc8d"}
Mar 20 09:05:42.486961 master-0 kubenswrapper[18707]: I0320 09:05:42.485724 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"f9a59ccfb7d6d43e8ba19d807983faf7d1150e0f821b30a554ac0104d6e21bc0"}
Mar 20 09:05:42.486961 master-0 kubenswrapper[18707]: I0320 09:05:42.485735 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d537130-a515-4e6e-aedb-89a848cd477a","Type":"ContainerStarted","Data":"dffac5c55b384e345b5a2fe30b4f57d97a91da347965e7326128df2a2b414cbd"}
Mar 20 09:05:42.506346 master-0 kubenswrapper[18707]: I0320 09:05:42.502958 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" podStartSLOduration=3.502941146 podStartE2EDuration="3.502941146s" podCreationTimestamp="2026-03-20 09:05:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:42.50029542 +0000 UTC m=+1487.656475776" watchObservedRunningTime="2026-03-20 09:05:42.502941146 +0000 UTC m=+1487.659121492"
Mar 20 09:05:42.679351 master-0 kubenswrapper[18707]: I0320 09:05:42.673126 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=27.042853655 podStartE2EDuration="34.673104388s" podCreationTimestamp="2026-03-20 09:05:08 +0000 UTC" firstStartedPulling="2026-03-20 09:05:32.804780471 +0000 UTC m=+1477.960960827" lastFinishedPulling="2026-03-20 09:05:40.435031204 +0000 UTC m=+1485.591211560" observedRunningTime="2026-03-20 09:05:42.574442909 +0000 UTC m=+1487.730623275" watchObservedRunningTime="2026-03-20 09:05:42.673104388 +0000 UTC m=+1487.829284734"
Mar 20 09:05:42.922163 master-0 kubenswrapper[18707]: I0320 09:05:42.922098 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5t48-config-944s7"
Mar 20 09:05:43.024687 master-0 kubenswrapper[18707]: I0320 09:05:43.024626 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-log-ovn\") pod \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") "
Mar 20 09:05:43.024987 master-0 kubenswrapper[18707]: I0320 09:05:43.024756 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-run\") pod \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") "
Mar 20 09:05:43.024987 master-0 kubenswrapper[18707]: I0320 09:05:43.024743 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e9ab5c43-515e-4d85-95df-ad62b13ab3a3" (UID: "e9ab5c43-515e-4d85-95df-ad62b13ab3a3"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 09:05:43.024987 master-0 kubenswrapper[18707]: I0320 09:05:43.024808 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-run-ovn\") pod \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") "
Mar 20 09:05:43.024987 master-0 kubenswrapper[18707]: I0320 09:05:43.024857 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-additional-scripts\") pod \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") "
Mar 20 09:05:43.024987 master-0 kubenswrapper[18707]: I0320 09:05:43.024894 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e9ab5c43-515e-4d85-95df-ad62b13ab3a3" (UID: "e9ab5c43-515e-4d85-95df-ad62b13ab3a3"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 09:05:43.025246 master-0 kubenswrapper[18707]: I0320 09:05:43.025036 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-scripts\") pod \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") "
Mar 20 09:05:43.025246 master-0 kubenswrapper[18707]: I0320 09:05:43.025142 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24pkc\" (UniqueName: \"kubernetes.io/projected/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-kube-api-access-24pkc\") pod \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\" (UID: \"e9ab5c43-515e-4d85-95df-ad62b13ab3a3\") "
Mar 20 09:05:43.025463 master-0 kubenswrapper[18707]: I0320 09:05:43.025381 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e9ab5c43-515e-4d85-95df-ad62b13ab3a3" (UID: "e9ab5c43-515e-4d85-95df-ad62b13ab3a3"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:05:43.025941 master-0 kubenswrapper[18707]: I0320 09:05:43.025914 18707 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-log-ovn\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:43.026011 master-0 kubenswrapper[18707]: I0320 09:05:43.025941 18707 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-run-ovn\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:43.026011 master-0 kubenswrapper[18707]: I0320 09:05:43.025958 18707 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-additional-scripts\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:43.026219 master-0 kubenswrapper[18707]: I0320 09:05:43.026164 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-scripts" (OuterVolumeSpecName: "scripts") pod "e9ab5c43-515e-4d85-95df-ad62b13ab3a3" (UID: "e9ab5c43-515e-4d85-95df-ad62b13ab3a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:05:43.026296 master-0 kubenswrapper[18707]: I0320 09:05:43.024859 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-run" (OuterVolumeSpecName: "var-run") pod "e9ab5c43-515e-4d85-95df-ad62b13ab3a3" (UID: "e9ab5c43-515e-4d85-95df-ad62b13ab3a3"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 09:05:43.029480 master-0 kubenswrapper[18707]: I0320 09:05:43.029359 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-kube-api-access-24pkc" (OuterVolumeSpecName: "kube-api-access-24pkc") pod "e9ab5c43-515e-4d85-95df-ad62b13ab3a3" (UID: "e9ab5c43-515e-4d85-95df-ad62b13ab3a3"). InnerVolumeSpecName "kube-api-access-24pkc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:05:43.129071 master-0 kubenswrapper[18707]: I0320 09:05:43.127863 18707 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-var-run\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:43.129071 master-0 kubenswrapper[18707]: I0320 09:05:43.127906 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-scripts\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:43.129071 master-0 kubenswrapper[18707]: I0320 09:05:43.127918 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24pkc\" (UniqueName: \"kubernetes.io/projected/e9ab5c43-515e-4d85-95df-ad62b13ab3a3-kube-api-access-24pkc\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:43.213205 master-0 kubenswrapper[18707]: I0320 09:05:43.211729 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8447fc9-wqvmw"]
Mar 20 09:05:43.248118 master-0 kubenswrapper[18707]: I0320 09:05:43.247211 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6974f6bb85-z8znc"]
Mar 20 09:05:43.248118 master-0 kubenswrapper[18707]: E0320 09:05:43.247832 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ab5c43-515e-4d85-95df-ad62b13ab3a3" containerName="ovn-config"
Mar 20 09:05:43.248118 master-0 kubenswrapper[18707]: I0320 09:05:43.247849 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ab5c43-515e-4d85-95df-ad62b13ab3a3" containerName="ovn-config"
Mar 20 09:05:43.248403 master-0 kubenswrapper[18707]: I0320 09:05:43.248245 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ab5c43-515e-4d85-95df-ad62b13ab3a3" containerName="ovn-config"
Mar 20 09:05:43.250150 master-0 kubenswrapper[18707]: I0320 09:05:43.249797 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.253795 master-0 kubenswrapper[18707]: I0320 09:05:43.252786 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 20 09:05:43.258350 master-0 kubenswrapper[18707]: I0320 09:05:43.256694 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6974f6bb85-z8znc"]
Mar 20 09:05:43.335695 master-0 kubenswrapper[18707]: I0320 09:05:43.335643 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-dns-svc\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.335695 master-0 kubenswrapper[18707]: I0320 09:05:43.335704 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-edpm-a\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.335998 master-0 kubenswrapper[18707]: I0320 09:05:43.335772 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcp6d\" (UniqueName: \"kubernetes.io/projected/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-kube-api-access-dcp6d\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.335998 master-0 kubenswrapper[18707]: I0320 09:05:43.335794 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-config\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.335998 master-0 kubenswrapper[18707]: I0320 09:05:43.335822 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-ovsdbserver-sb\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.335998 master-0 kubenswrapper[18707]: I0320 09:05:43.335856 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-dns-swift-storage-0\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.335998 master-0 kubenswrapper[18707]: I0320 09:05:43.335931 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-ovsdbserver-nb\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.438646 master-0 kubenswrapper[18707]: I0320 09:05:43.438536 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-dns-swift-storage-0\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.438862 master-0 kubenswrapper[18707]: I0320 09:05:43.438848 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-ovsdbserver-nb\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.438965 master-0 kubenswrapper[18707]: I0320 09:05:43.438947 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-dns-svc\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.439011 master-0 kubenswrapper[18707]: I0320 09:05:43.438986 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-edpm-a\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.439073 master-0 kubenswrapper[18707]: I0320 09:05:43.439057 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-config\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.439113 master-0 kubenswrapper[18707]: I0320 09:05:43.439074 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcp6d\" (UniqueName: \"kubernetes.io/projected/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-kube-api-access-dcp6d\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.439334 master-0 kubenswrapper[18707]: I0320 09:05:43.439274 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-ovsdbserver-sb\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.439756 master-0 kubenswrapper[18707]: I0320 09:05:43.439727 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-config\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.440171 master-0 kubenswrapper[18707]: I0320 09:05:43.440149 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-ovsdbserver-nb\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.440319 master-0 kubenswrapper[18707]: I0320 09:05:43.440276 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-dns-svc\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.440386 master-0 kubenswrapper[18707]: I0320 09:05:43.440226 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-ovsdbserver-sb\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.440498 master-0 kubenswrapper[18707]: I0320 09:05:43.440477 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-edpm-a\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.440993 master-0 kubenswrapper[18707]: I0320 09:05:43.440977 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-dns-swift-storage-0\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.460270 master-0 kubenswrapper[18707]: I0320 09:05:43.456685 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcp6d\" (UniqueName: \"kubernetes.io/projected/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-kube-api-access-dcp6d\") pod \"dnsmasq-dns-6974f6bb85-z8znc\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:43.504865 master-0 kubenswrapper[18707]: I0320 09:05:43.504361 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q5t48-config-944s7"
Mar 20 09:05:43.504865 master-0 kubenswrapper[18707]: I0320 09:05:43.504465 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q5t48-config-944s7" event={"ID":"e9ab5c43-515e-4d85-95df-ad62b13ab3a3","Type":"ContainerDied","Data":"0b056a38fb46457b260d609e75e9222dbd98079cebb62b8303b6df19517a4ce4"}
Mar 20 09:05:43.504865 master-0 kubenswrapper[18707]: I0320 09:05:43.504495 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b056a38fb46457b260d609e75e9222dbd98079cebb62b8303b6df19517a4ce4"
Mar 20 09:05:43.590636 master-0 kubenswrapper[18707]: I0320 09:05:43.590586 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6974f6bb85-z8znc"
Mar 20 09:05:44.047345 master-0 kubenswrapper[18707]: I0320 09:05:44.040333 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ndh4m"
Mar 20 09:05:44.154084 master-0 kubenswrapper[18707]: I0320 09:05:44.154008 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6974f6bb85-z8znc"]
Mar 20 09:05:44.181223 master-0 kubenswrapper[18707]: I0320 09:05:44.177696 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fqw8\" (UniqueName: \"kubernetes.io/projected/c8751adb-f918-4ff0-b087-5ced3219f41a-kube-api-access-5fqw8\") pod \"c8751adb-f918-4ff0-b087-5ced3219f41a\" (UID: \"c8751adb-f918-4ff0-b087-5ced3219f41a\") "
Mar 20 09:05:44.181223 master-0 kubenswrapper[18707]: I0320 09:05:44.177979 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8751adb-f918-4ff0-b087-5ced3219f41a-operator-scripts\") pod \"c8751adb-f918-4ff0-b087-5ced3219f41a\" (UID: \"c8751adb-f918-4ff0-b087-5ced3219f41a\") "
Mar 20 09:05:44.181223 master-0 kubenswrapper[18707]: I0320 09:05:44.180669 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8751adb-f918-4ff0-b087-5ced3219f41a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8751adb-f918-4ff0-b087-5ced3219f41a" (UID: "c8751adb-f918-4ff0-b087-5ced3219f41a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:05:44.185409 master-0 kubenswrapper[18707]: I0320 09:05:44.185332 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8751adb-f918-4ff0-b087-5ced3219f41a-kube-api-access-5fqw8" (OuterVolumeSpecName: "kube-api-access-5fqw8") pod "c8751adb-f918-4ff0-b087-5ced3219f41a" (UID: "c8751adb-f918-4ff0-b087-5ced3219f41a"). InnerVolumeSpecName "kube-api-access-5fqw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:05:44.225288 master-0 kubenswrapper[18707]: I0320 09:05:44.223327 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-q5t48-config-944s7"]
Mar 20 09:05:44.249328 master-0 kubenswrapper[18707]: I0320 09:05:44.249264 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-q5t48-config-944s7"]
Mar 20 09:05:44.282521 master-0 kubenswrapper[18707]: I0320 09:05:44.282334 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8751adb-f918-4ff0-b087-5ced3219f41a-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:44.282521 master-0 kubenswrapper[18707]: I0320 09:05:44.282379 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fqw8\" (UniqueName: \"kubernetes.io/projected/c8751adb-f918-4ff0-b087-5ced3219f41a-kube-api-access-5fqw8\") on node \"master-0\" DevicePath \"\""
Mar 20 09:05:44.515001 master-0 kubenswrapper[18707]: I0320 09:05:44.514941 18707 generic.go:334] "Generic (PLEG): container finished" podID="0912bb76-0a2b-4ecc-92ff-6e13b6271c69" containerID="0101d045bfaf4f976767909dd68297d00498f9c1470f0da6d3064f48c3ba1d12" exitCode=0
Mar 20 09:05:44.515796 master-0 kubenswrapper[18707]: I0320 09:05:44.515030 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974f6bb85-z8znc" event={"ID":"0912bb76-0a2b-4ecc-92ff-6e13b6271c69","Type":"ContainerDied","Data":"0101d045bfaf4f976767909dd68297d00498f9c1470f0da6d3064f48c3ba1d12"}
Mar 20 09:05:44.515796 master-0 kubenswrapper[18707]: I0320 09:05:44.515769 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974f6bb85-z8znc" event={"ID":"0912bb76-0a2b-4ecc-92ff-6e13b6271c69","Type":"ContainerStarted","Data":"8b632b3ba9edb8e2987dd26877ed052a03bb3a95ef1593b54e837c61d9d9d184"}
Mar 20 09:05:44.518486 master-0 kubenswrapper[18707]: I0320 09:05:44.518431 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ndh4m" event={"ID":"c8751adb-f918-4ff0-b087-5ced3219f41a","Type":"ContainerDied","Data":"7735aaa69572240972c8026aa9b668d44438260cac590edbc031038c7c38231a"}
Mar 20 09:05:44.518486 master-0 kubenswrapper[18707]: I0320 09:05:44.518483 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7735aaa69572240972c8026aa9b668d44438260cac590edbc031038c7c38231a"
Mar 20 09:05:44.518660 master-0 kubenswrapper[18707]: I0320 09:05:44.518504 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" podUID="430cd83b-83e7-42ff-a0d0-c36c85ac0473" containerName="dnsmasq-dns" containerID="cri-o://19fae9d1bbd3ff36f4ea01d43ba7125422581ff0625381ea966a222ef3cabf6a" gracePeriod=10
Mar 20 09:05:44.518660 master-0 kubenswrapper[18707]: I0320 09:05:44.518543 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ndh4m"
Mar 20 09:05:45.062149 master-0 kubenswrapper[18707]: I0320 09:05:45.062087 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw"
Mar 20 09:05:45.124378 master-0 kubenswrapper[18707]: I0320 09:05:45.121371 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ab5c43-515e-4d85-95df-ad62b13ab3a3" path="/var/lib/kubelet/pods/e9ab5c43-515e-4d85-95df-ad62b13ab3a3/volumes"
Mar 20 09:05:45.214221 master-0 kubenswrapper[18707]: I0320 09:05:45.213155 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-dns-svc\") pod \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") "
Mar 20 09:05:45.214221 master-0 kubenswrapper[18707]: I0320 09:05:45.213319 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx9lc\" (UniqueName: \"kubernetes.io/projected/430cd83b-83e7-42ff-a0d0-c36c85ac0473-kube-api-access-bx9lc\") pod \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") "
Mar 20 09:05:45.214221 master-0 kubenswrapper[18707]: I0320 09:05:45.213403 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-config\") pod \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") "
Mar 20 09:05:45.214221 master-0 kubenswrapper[18707]: I0320 09:05:45.213491 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-edpm-a\") pod \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") "
Mar 20
09:05:45.214221 master-0 kubenswrapper[18707]: I0320 09:05:45.213569 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-ovsdbserver-sb\") pod \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " Mar 20 09:05:45.214221 master-0 kubenswrapper[18707]: I0320 09:05:45.213618 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-ovsdbserver-nb\") pod \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\" (UID: \"430cd83b-83e7-42ff-a0d0-c36c85ac0473\") " Mar 20 09:05:45.229218 master-0 kubenswrapper[18707]: I0320 09:05:45.221718 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/430cd83b-83e7-42ff-a0d0-c36c85ac0473-kube-api-access-bx9lc" (OuterVolumeSpecName: "kube-api-access-bx9lc") pod "430cd83b-83e7-42ff-a0d0-c36c85ac0473" (UID: "430cd83b-83e7-42ff-a0d0-c36c85ac0473"). InnerVolumeSpecName "kube-api-access-bx9lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:45.269680 master-0 kubenswrapper[18707]: I0320 09:05:45.269579 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "430cd83b-83e7-42ff-a0d0-c36c85ac0473" (UID: "430cd83b-83e7-42ff-a0d0-c36c85ac0473"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:45.270772 master-0 kubenswrapper[18707]: I0320 09:05:45.270732 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-config" (OuterVolumeSpecName: "config") pod "430cd83b-83e7-42ff-a0d0-c36c85ac0473" (UID: "430cd83b-83e7-42ff-a0d0-c36c85ac0473"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:45.273709 master-0 kubenswrapper[18707]: I0320 09:05:45.273568 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "430cd83b-83e7-42ff-a0d0-c36c85ac0473" (UID: "430cd83b-83e7-42ff-a0d0-c36c85ac0473"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:45.285864 master-0 kubenswrapper[18707]: I0320 09:05:45.285785 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "430cd83b-83e7-42ff-a0d0-c36c85ac0473" (UID: "430cd83b-83e7-42ff-a0d0-c36c85ac0473"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:45.286279 master-0 kubenswrapper[18707]: I0320 09:05:45.286223 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "430cd83b-83e7-42ff-a0d0-c36c85ac0473" (UID: "430cd83b-83e7-42ff-a0d0-c36c85ac0473"). InnerVolumeSpecName "edpm-a". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:45.316487 master-0 kubenswrapper[18707]: I0320 09:05:45.316411 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx9lc\" (UniqueName: \"kubernetes.io/projected/430cd83b-83e7-42ff-a0d0-c36c85ac0473-kube-api-access-bx9lc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:45.316487 master-0 kubenswrapper[18707]: I0320 09:05:45.316481 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:45.316487 master-0 kubenswrapper[18707]: I0320 09:05:45.316497 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:45.316487 master-0 kubenswrapper[18707]: I0320 09:05:45.316506 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:45.316487 master-0 kubenswrapper[18707]: I0320 09:05:45.316515 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:45.316918 master-0 kubenswrapper[18707]: I0320 09:05:45.316526 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/430cd83b-83e7-42ff-a0d0-c36c85ac0473-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:45.538089 master-0 kubenswrapper[18707]: I0320 09:05:45.537924 18707 generic.go:334] "Generic (PLEG): container finished" podID="430cd83b-83e7-42ff-a0d0-c36c85ac0473" 
containerID="19fae9d1bbd3ff36f4ea01d43ba7125422581ff0625381ea966a222ef3cabf6a" exitCode=0 Mar 20 09:05:45.538089 master-0 kubenswrapper[18707]: I0320 09:05:45.537978 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" Mar 20 09:05:45.538089 master-0 kubenswrapper[18707]: I0320 09:05:45.538049 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" event={"ID":"430cd83b-83e7-42ff-a0d0-c36c85ac0473","Type":"ContainerDied","Data":"19fae9d1bbd3ff36f4ea01d43ba7125422581ff0625381ea966a222ef3cabf6a"} Mar 20 09:05:45.538089 master-0 kubenswrapper[18707]: I0320 09:05:45.538087 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8447fc9-wqvmw" event={"ID":"430cd83b-83e7-42ff-a0d0-c36c85ac0473","Type":"ContainerDied","Data":"a1af2d81c922098267baf75ad45a317de156197f7657625ff96ec2c0734ba0ad"} Mar 20 09:05:45.539127 master-0 kubenswrapper[18707]: I0320 09:05:45.538109 18707 scope.go:117] "RemoveContainer" containerID="19fae9d1bbd3ff36f4ea01d43ba7125422581ff0625381ea966a222ef3cabf6a" Mar 20 09:05:45.542038 master-0 kubenswrapper[18707]: I0320 09:05:45.541994 18707 generic.go:334] "Generic (PLEG): container finished" podID="dd5f8067-9874-4ff3-a9fb-a3251cc3622d" containerID="4d88c8c581a9c364031cb4d4336c34f4cc762f582c79aac4aa41cc2b4dca5dee" exitCode=0 Mar 20 09:05:45.542239 master-0 kubenswrapper[18707]: I0320 09:05:45.542064 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-662t6" event={"ID":"dd5f8067-9874-4ff3-a9fb-a3251cc3622d","Type":"ContainerDied","Data":"4d88c8c581a9c364031cb4d4336c34f4cc762f582c79aac4aa41cc2b4dca5dee"} Mar 20 09:05:45.545591 master-0 kubenswrapper[18707]: I0320 09:05:45.545499 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974f6bb85-z8znc" 
event={"ID":"0912bb76-0a2b-4ecc-92ff-6e13b6271c69","Type":"ContainerStarted","Data":"2d321d2e4b95ca33632026f854c34906a05166ea003356db1afedac316a6cdd3"} Mar 20 09:05:45.545873 master-0 kubenswrapper[18707]: I0320 09:05:45.545823 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6974f6bb85-z8znc" Mar 20 09:05:45.584619 master-0 kubenswrapper[18707]: I0320 09:05:45.584560 18707 scope.go:117] "RemoveContainer" containerID="904edf5efa123f5dc30a0a1b778dce0d4e4c2dc4ee6ec76dfbe044a6fdd8c2b3" Mar 20 09:05:45.612311 master-0 kubenswrapper[18707]: I0320 09:05:45.612159 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8447fc9-wqvmw"] Mar 20 09:05:45.626061 master-0 kubenswrapper[18707]: I0320 09:05:45.626000 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c8447fc9-wqvmw"] Mar 20 09:05:45.645448 master-0 kubenswrapper[18707]: I0320 09:05:45.645367 18707 scope.go:117] "RemoveContainer" containerID="19fae9d1bbd3ff36f4ea01d43ba7125422581ff0625381ea966a222ef3cabf6a" Mar 20 09:05:45.646035 master-0 kubenswrapper[18707]: E0320 09:05:45.645987 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19fae9d1bbd3ff36f4ea01d43ba7125422581ff0625381ea966a222ef3cabf6a\": container with ID starting with 19fae9d1bbd3ff36f4ea01d43ba7125422581ff0625381ea966a222ef3cabf6a not found: ID does not exist" containerID="19fae9d1bbd3ff36f4ea01d43ba7125422581ff0625381ea966a222ef3cabf6a" Mar 20 09:05:45.646147 master-0 kubenswrapper[18707]: I0320 09:05:45.646032 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19fae9d1bbd3ff36f4ea01d43ba7125422581ff0625381ea966a222ef3cabf6a"} err="failed to get container status \"19fae9d1bbd3ff36f4ea01d43ba7125422581ff0625381ea966a222ef3cabf6a\": rpc error: code = NotFound desc = could not find container 
\"19fae9d1bbd3ff36f4ea01d43ba7125422581ff0625381ea966a222ef3cabf6a\": container with ID starting with 19fae9d1bbd3ff36f4ea01d43ba7125422581ff0625381ea966a222ef3cabf6a not found: ID does not exist" Mar 20 09:05:45.646147 master-0 kubenswrapper[18707]: I0320 09:05:45.646060 18707 scope.go:117] "RemoveContainer" containerID="904edf5efa123f5dc30a0a1b778dce0d4e4c2dc4ee6ec76dfbe044a6fdd8c2b3" Mar 20 09:05:45.646544 master-0 kubenswrapper[18707]: E0320 09:05:45.646481 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"904edf5efa123f5dc30a0a1b778dce0d4e4c2dc4ee6ec76dfbe044a6fdd8c2b3\": container with ID starting with 904edf5efa123f5dc30a0a1b778dce0d4e4c2dc4ee6ec76dfbe044a6fdd8c2b3 not found: ID does not exist" containerID="904edf5efa123f5dc30a0a1b778dce0d4e4c2dc4ee6ec76dfbe044a6fdd8c2b3" Mar 20 09:05:45.646629 master-0 kubenswrapper[18707]: I0320 09:05:45.646536 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"904edf5efa123f5dc30a0a1b778dce0d4e4c2dc4ee6ec76dfbe044a6fdd8c2b3"} err="failed to get container status \"904edf5efa123f5dc30a0a1b778dce0d4e4c2dc4ee6ec76dfbe044a6fdd8c2b3\": rpc error: code = NotFound desc = could not find container \"904edf5efa123f5dc30a0a1b778dce0d4e4c2dc4ee6ec76dfbe044a6fdd8c2b3\": container with ID starting with 904edf5efa123f5dc30a0a1b778dce0d4e4c2dc4ee6ec76dfbe044a6fdd8c2b3 not found: ID does not exist" Mar 20 09:05:46.357169 master-0 kubenswrapper[18707]: I0320 09:05:46.357060 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6974f6bb85-z8znc" podStartSLOduration=3.35703074 podStartE2EDuration="3.35703074s" podCreationTimestamp="2026-03-20 09:05:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:46.343120842 +0000 UTC m=+1491.499301198" 
watchObservedRunningTime="2026-03-20 09:05:46.35703074 +0000 UTC m=+1491.513211106" Mar 20 09:05:47.074062 master-0 kubenswrapper[18707]: I0320 09:05:47.073774 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-662t6" Mar 20 09:05:47.112856 master-0 kubenswrapper[18707]: I0320 09:05:47.112775 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="430cd83b-83e7-42ff-a0d0-c36c85ac0473" path="/var/lib/kubelet/pods/430cd83b-83e7-42ff-a0d0-c36c85ac0473/volumes" Mar 20 09:05:47.164937 master-0 kubenswrapper[18707]: I0320 09:05:47.164878 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw4wt\" (UniqueName: \"kubernetes.io/projected/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-kube-api-access-jw4wt\") pod \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " Mar 20 09:05:47.165274 master-0 kubenswrapper[18707]: I0320 09:05:47.165240 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-db-sync-config-data\") pod \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " Mar 20 09:05:47.165340 master-0 kubenswrapper[18707]: I0320 09:05:47.165279 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-combined-ca-bundle\") pod \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\" (UID: \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " Mar 20 09:05:47.165340 master-0 kubenswrapper[18707]: I0320 09:05:47.165319 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-config-data\") pod \"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\" (UID: 
\"dd5f8067-9874-4ff3-a9fb-a3251cc3622d\") " Mar 20 09:05:47.169856 master-0 kubenswrapper[18707]: I0320 09:05:47.169656 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dd5f8067-9874-4ff3-a9fb-a3251cc3622d" (UID: "dd5f8067-9874-4ff3-a9fb-a3251cc3622d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:05:47.173273 master-0 kubenswrapper[18707]: I0320 09:05:47.172451 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-kube-api-access-jw4wt" (OuterVolumeSpecName: "kube-api-access-jw4wt") pod "dd5f8067-9874-4ff3-a9fb-a3251cc3622d" (UID: "dd5f8067-9874-4ff3-a9fb-a3251cc3622d"). InnerVolumeSpecName "kube-api-access-jw4wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:47.190521 master-0 kubenswrapper[18707]: I0320 09:05:47.190458 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd5f8067-9874-4ff3-a9fb-a3251cc3622d" (UID: "dd5f8067-9874-4ff3-a9fb-a3251cc3622d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:05:47.217470 master-0 kubenswrapper[18707]: I0320 09:05:47.217366 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-config-data" (OuterVolumeSpecName: "config-data") pod "dd5f8067-9874-4ff3-a9fb-a3251cc3622d" (UID: "dd5f8067-9874-4ff3-a9fb-a3251cc3622d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:05:47.268380 master-0 kubenswrapper[18707]: I0320 09:05:47.268258 18707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:47.268380 master-0 kubenswrapper[18707]: I0320 09:05:47.268337 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:47.268380 master-0 kubenswrapper[18707]: I0320 09:05:47.268358 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:47.268380 master-0 kubenswrapper[18707]: I0320 09:05:47.268376 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw4wt\" (UniqueName: \"kubernetes.io/projected/dd5f8067-9874-4ff3-a9fb-a3251cc3622d-kube-api-access-jw4wt\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:47.585262 master-0 kubenswrapper[18707]: I0320 09:05:47.585175 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-662t6" event={"ID":"dd5f8067-9874-4ff3-a9fb-a3251cc3622d","Type":"ContainerDied","Data":"0b4b7ae2200fe76848331e15d12821d473a328643c8836b61a6128fcf0b43bd7"} Mar 20 09:05:47.585262 master-0 kubenswrapper[18707]: I0320 09:05:47.585253 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b4b7ae2200fe76848331e15d12821d473a328643c8836b61a6128fcf0b43bd7" Mar 20 09:05:47.585570 master-0 kubenswrapper[18707]: I0320 09:05:47.585336 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-662t6" Mar 20 09:05:48.060824 master-0 kubenswrapper[18707]: I0320 09:05:48.060755 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6974f6bb85-z8znc"] Mar 20 09:05:48.061112 master-0 kubenswrapper[18707]: I0320 09:05:48.061065 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6974f6bb85-z8znc" podUID="0912bb76-0a2b-4ecc-92ff-6e13b6271c69" containerName="dnsmasq-dns" containerID="cri-o://2d321d2e4b95ca33632026f854c34906a05166ea003356db1afedac316a6cdd3" gracePeriod=10 Mar 20 09:05:48.102345 master-0 kubenswrapper[18707]: I0320 09:05:48.102286 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d96bd4f7c-7qqmd"] Mar 20 09:05:48.104613 master-0 kubenswrapper[18707]: E0320 09:05:48.102764 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8751adb-f918-4ff0-b087-5ced3219f41a" containerName="mariadb-account-create-update" Mar 20 09:05:48.104613 master-0 kubenswrapper[18707]: I0320 09:05:48.102779 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8751adb-f918-4ff0-b087-5ced3219f41a" containerName="mariadb-account-create-update" Mar 20 09:05:48.104613 master-0 kubenswrapper[18707]: E0320 09:05:48.102812 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="430cd83b-83e7-42ff-a0d0-c36c85ac0473" containerName="init" Mar 20 09:05:48.104613 master-0 kubenswrapper[18707]: I0320 09:05:48.102821 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="430cd83b-83e7-42ff-a0d0-c36c85ac0473" containerName="init" Mar 20 09:05:48.104613 master-0 kubenswrapper[18707]: E0320 09:05:48.102842 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="430cd83b-83e7-42ff-a0d0-c36c85ac0473" containerName="dnsmasq-dns" Mar 20 09:05:48.104613 master-0 kubenswrapper[18707]: I0320 09:05:48.102850 18707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="430cd83b-83e7-42ff-a0d0-c36c85ac0473" containerName="dnsmasq-dns" Mar 20 09:05:48.104613 master-0 kubenswrapper[18707]: E0320 09:05:48.102864 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd5f8067-9874-4ff3-a9fb-a3251cc3622d" containerName="glance-db-sync" Mar 20 09:05:48.104613 master-0 kubenswrapper[18707]: I0320 09:05:48.102871 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5f8067-9874-4ff3-a9fb-a3251cc3622d" containerName="glance-db-sync" Mar 20 09:05:48.104613 master-0 kubenswrapper[18707]: I0320 09:05:48.103083 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8751adb-f918-4ff0-b087-5ced3219f41a" containerName="mariadb-account-create-update" Mar 20 09:05:48.104613 master-0 kubenswrapper[18707]: I0320 09:05:48.103119 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="430cd83b-83e7-42ff-a0d0-c36c85ac0473" containerName="dnsmasq-dns" Mar 20 09:05:48.104613 master-0 kubenswrapper[18707]: I0320 09:05:48.103141 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd5f8067-9874-4ff3-a9fb-a3251cc3622d" containerName="glance-db-sync" Mar 20 09:05:48.104613 master-0 kubenswrapper[18707]: I0320 09:05:48.104349 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.123255 master-0 kubenswrapper[18707]: I0320 09:05:48.123033 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d96bd4f7c-7qqmd"] Mar 20 09:05:48.201691 master-0 kubenswrapper[18707]: I0320 09:05:48.201637 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-ovsdbserver-sb\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.202293 master-0 kubenswrapper[18707]: I0320 09:05:48.202219 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-ovsdbserver-nb\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.202368 master-0 kubenswrapper[18707]: I0320 09:05:48.202294 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzbpq\" (UniqueName: \"kubernetes.io/projected/138bebee-e403-4b17-9619-c78025b3dd4c-kube-api-access-mzbpq\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.202571 master-0 kubenswrapper[18707]: I0320 09:05:48.202500 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-dns-svc\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.202571 master-0 kubenswrapper[18707]: I0320 
09:05:48.202552 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-config\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.204591 master-0 kubenswrapper[18707]: I0320 09:05:48.203697 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-dns-swift-storage-0\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.204591 master-0 kubenswrapper[18707]: I0320 09:05:48.203973 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-edpm-a\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.204897 master-0 kubenswrapper[18707]: I0320 09:05:48.204877 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 09:05:48.312600 master-0 kubenswrapper[18707]: I0320 09:05:48.307066 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-ovsdbserver-sb\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.312600 master-0 kubenswrapper[18707]: I0320 09:05:48.307882 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-ovsdbserver-nb\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.312600 master-0 kubenswrapper[18707]: I0320 09:05:48.307914 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzbpq\" (UniqueName: \"kubernetes.io/projected/138bebee-e403-4b17-9619-c78025b3dd4c-kube-api-access-mzbpq\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.312600 master-0 kubenswrapper[18707]: I0320 09:05:48.307995 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-dns-svc\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.312600 master-0 kubenswrapper[18707]: I0320 09:05:48.308027 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-config\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.312600 master-0 kubenswrapper[18707]: I0320 09:05:48.308226 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-dns-swift-storage-0\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.312600 master-0 kubenswrapper[18707]: I0320 09:05:48.308296 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" 
(UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-edpm-a\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.312600 master-0 kubenswrapper[18707]: I0320 09:05:48.310510 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-edpm-a\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.312600 master-0 kubenswrapper[18707]: I0320 09:05:48.311402 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-config\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.312600 master-0 kubenswrapper[18707]: I0320 09:05:48.312459 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-ovsdbserver-nb\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.313289 master-0 kubenswrapper[18707]: I0320 09:05:48.312710 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-dns-swift-storage-0\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.318124 master-0 kubenswrapper[18707]: I0320 09:05:48.314024 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-dns-svc\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.318124 master-0 kubenswrapper[18707]: I0320 09:05:48.315520 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-ovsdbserver-sb\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.356355 master-0 kubenswrapper[18707]: I0320 09:05:48.350035 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d96bd4f7c-7qqmd"] Mar 20 09:05:48.356355 master-0 kubenswrapper[18707]: E0320 09:05:48.352676 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-mzbpq], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" podUID="138bebee-e403-4b17-9619-c78025b3dd4c" Mar 20 09:05:48.377666 master-0 kubenswrapper[18707]: I0320 09:05:48.377539 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzbpq\" (UniqueName: \"kubernetes.io/projected/138bebee-e403-4b17-9619-c78025b3dd4c-kube-api-access-mzbpq\") pod \"dnsmasq-dns-5d96bd4f7c-7qqmd\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.429406 master-0 kubenswrapper[18707]: I0320 09:05:48.429192 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d58447bc7-xn5wl"] Mar 20 09:05:48.432824 master-0 kubenswrapper[18707]: I0320 09:05:48.432633 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.436727 master-0 kubenswrapper[18707]: I0320 09:05:48.436678 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"edpm-b" Mar 20 09:05:48.473323 master-0 kubenswrapper[18707]: I0320 09:05:48.470775 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d58447bc7-xn5wl"] Mar 20 09:05:48.526317 master-0 kubenswrapper[18707]: I0320 09:05:48.522996 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-edpm-b\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.526317 master-0 kubenswrapper[18707]: I0320 09:05:48.523055 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-edpm-a\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.526317 master-0 kubenswrapper[18707]: I0320 09:05:48.523079 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-config\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.526317 master-0 kubenswrapper[18707]: I0320 09:05:48.523104 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-ovsdbserver-nb\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: 
\"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.526317 master-0 kubenswrapper[18707]: I0320 09:05:48.523140 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-dns-svc\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.526317 master-0 kubenswrapper[18707]: I0320 09:05:48.523172 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztzln\" (UniqueName: \"kubernetes.io/projected/d17ed229-469c-4f0e-8def-a7322e154c7e-kube-api-access-ztzln\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.526317 master-0 kubenswrapper[18707]: I0320 09:05:48.523233 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-ovsdbserver-sb\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.526317 master-0 kubenswrapper[18707]: I0320 09:05:48.523297 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-dns-swift-storage-0\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.614053 master-0 kubenswrapper[18707]: I0320 09:05:48.610840 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d58447bc7-xn5wl"] Mar 20 09:05:48.614053 master-0 
kubenswrapper[18707]: E0320 09:05:48.611749 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 edpm-a edpm-b kube-api-access-ztzln ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" podUID="d17ed229-469c-4f0e-8def-a7322e154c7e" Mar 20 09:05:48.628179 master-0 kubenswrapper[18707]: I0320 09:05:48.627984 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-edpm-b\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.628179 master-0 kubenswrapper[18707]: I0320 09:05:48.628054 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-edpm-a\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.628179 master-0 kubenswrapper[18707]: I0320 09:05:48.628085 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-config\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.628534 master-0 kubenswrapper[18707]: I0320 09:05:48.628223 18707 generic.go:334] "Generic (PLEG): container finished" podID="0912bb76-0a2b-4ecc-92ff-6e13b6271c69" containerID="2d321d2e4b95ca33632026f854c34906a05166ea003356db1afedac316a6cdd3" exitCode=0 Mar 20 09:05:48.628534 master-0 kubenswrapper[18707]: I0320 09:05:48.628299 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.630309 master-0 kubenswrapper[18707]: I0320 09:05:48.629115 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974f6bb85-z8znc" event={"ID":"0912bb76-0a2b-4ecc-92ff-6e13b6271c69","Type":"ContainerDied","Data":"2d321d2e4b95ca33632026f854c34906a05166ea003356db1afedac316a6cdd3"} Mar 20 09:05:48.630309 master-0 kubenswrapper[18707]: I0320 09:05:48.629508 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-ovsdbserver-nb\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.630309 master-0 kubenswrapper[18707]: I0320 09:05:48.630062 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-config\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.630574 master-0 kubenswrapper[18707]: I0320 09:05:48.630488 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-edpm-a\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.631111 master-0 kubenswrapper[18707]: I0320 09:05:48.631065 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-ovsdbserver-nb\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.631176 master-0 
kubenswrapper[18707]: I0320 09:05:48.631079 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-edpm-b\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.631176 master-0 kubenswrapper[18707]: I0320 09:05:48.631102 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-dns-svc\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.631328 master-0 kubenswrapper[18707]: I0320 09:05:48.631301 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztzln\" (UniqueName: \"kubernetes.io/projected/d17ed229-469c-4f0e-8def-a7322e154c7e-kube-api-access-ztzln\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.631511 master-0 kubenswrapper[18707]: I0320 09:05:48.631487 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-ovsdbserver-sb\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.631764 master-0 kubenswrapper[18707]: I0320 09:05:48.631743 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-dns-svc\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.632069 master-0 kubenswrapper[18707]: I0320 
09:05:48.631849 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-dns-swift-storage-0\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.636239 master-0 kubenswrapper[18707]: I0320 09:05:48.632610 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-ovsdbserver-sb\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.636239 master-0 kubenswrapper[18707]: I0320 09:05:48.632728 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-dns-swift-storage-0\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.643772 master-0 kubenswrapper[18707]: I0320 09:05:48.643007 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b5b845b79-xn624"] Mar 20 09:05:48.645581 master-0 kubenswrapper[18707]: I0320 09:05:48.645159 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.659459 master-0 kubenswrapper[18707]: I0320 09:05:48.654504 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:48.664222 master-0 kubenswrapper[18707]: I0320 09:05:48.663172 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztzln\" (UniqueName: \"kubernetes.io/projected/d17ed229-469c-4f0e-8def-a7322e154c7e-kube-api-access-ztzln\") pod \"dnsmasq-dns-7d58447bc7-xn5wl\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:48.704623 master-0 kubenswrapper[18707]: I0320 09:05:48.704576 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b5b845b79-xn624"] Mar 20 09:05:48.733408 master-0 kubenswrapper[18707]: I0320 09:05:48.733235 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-dns-swift-storage-0\") pod \"138bebee-e403-4b17-9619-c78025b3dd4c\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " Mar 20 09:05:48.733622 master-0 kubenswrapper[18707]: I0320 09:05:48.733455 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-ovsdbserver-nb\") pod \"138bebee-e403-4b17-9619-c78025b3dd4c\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " Mar 20 09:05:48.733622 master-0 kubenswrapper[18707]: I0320 09:05:48.733573 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-dns-svc\") pod \"138bebee-e403-4b17-9619-c78025b3dd4c\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " Mar 20 09:05:48.733692 master-0 kubenswrapper[18707]: I0320 09:05:48.733635 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-config\") pod \"138bebee-e403-4b17-9619-c78025b3dd4c\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " Mar 20 09:05:48.733692 master-0 kubenswrapper[18707]: I0320 09:05:48.733665 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzbpq\" (UniqueName: \"kubernetes.io/projected/138bebee-e403-4b17-9619-c78025b3dd4c-kube-api-access-mzbpq\") pod \"138bebee-e403-4b17-9619-c78025b3dd4c\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " Mar 20 09:05:48.733754 master-0 kubenswrapper[18707]: I0320 09:05:48.733728 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-ovsdbserver-sb\") pod \"138bebee-e403-4b17-9619-c78025b3dd4c\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " Mar 20 09:05:48.733754 master-0 kubenswrapper[18707]: I0320 09:05:48.733749 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-edpm-a\") pod \"138bebee-e403-4b17-9619-c78025b3dd4c\" (UID: \"138bebee-e403-4b17-9619-c78025b3dd4c\") " Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.734050 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-dns-swift-storage-0\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.734062 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod 
"138bebee-e403-4b17-9619-c78025b3dd4c" (UID: "138bebee-e403-4b17-9619-c78025b3dd4c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.734140 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlcd6\" (UniqueName: \"kubernetes.io/projected/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-kube-api-access-hlcd6\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.734425 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-config" (OuterVolumeSpecName: "config") pod "138bebee-e403-4b17-9619-c78025b3dd4c" (UID: "138bebee-e403-4b17-9619-c78025b3dd4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.734805 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "138bebee-e403-4b17-9619-c78025b3dd4c" (UID: "138bebee-e403-4b17-9619-c78025b3dd4c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.735283 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "138bebee-e403-4b17-9619-c78025b3dd4c" (UID: "138bebee-e403-4b17-9619-c78025b3dd4c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.734865 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-edpm-a\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.735579 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-edpm-b\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.735709 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-ovsdbserver-sb\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.735844 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-ovsdbserver-nb\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.735911 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-config\") pod 
\"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.735998 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-dns-svc\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.736094 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.736112 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.736123 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:48.737668 master-0 kubenswrapper[18707]: I0320 09:05:48.736134 18707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:48.741572 master-0 kubenswrapper[18707]: I0320 09:05:48.740171 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "138bebee-e403-4b17-9619-c78025b3dd4c" (UID: 
"138bebee-e403-4b17-9619-c78025b3dd4c"). InnerVolumeSpecName "edpm-a". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:48.741572 master-0 kubenswrapper[18707]: I0320 09:05:48.740356 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "138bebee-e403-4b17-9619-c78025b3dd4c" (UID: "138bebee-e403-4b17-9619-c78025b3dd4c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:48.749376 master-0 kubenswrapper[18707]: I0320 09:05:48.743503 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/138bebee-e403-4b17-9619-c78025b3dd4c-kube-api-access-mzbpq" (OuterVolumeSpecName: "kube-api-access-mzbpq") pod "138bebee-e403-4b17-9619-c78025b3dd4c" (UID: "138bebee-e403-4b17-9619-c78025b3dd4c"). InnerVolumeSpecName "kube-api-access-mzbpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:48.758202 master-0 kubenswrapper[18707]: I0320 09:05:48.750988 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-vtvmv"] Mar 20 09:05:48.758202 master-0 kubenswrapper[18707]: I0320 09:05:48.752430 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vtvmv" Mar 20 09:05:48.819299 master-0 kubenswrapper[18707]: I0320 09:05:48.809655 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vtvmv"] Mar 20 09:05:48.844366 master-0 kubenswrapper[18707]: I0320 09:05:48.843429 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd5b27d-6c79-43fb-aa80-d104a32c95c1-operator-scripts\") pod \"cinder-db-create-vtvmv\" (UID: \"3fd5b27d-6c79-43fb-aa80-d104a32c95c1\") " pod="openstack/cinder-db-create-vtvmv" Mar 20 09:05:48.844366 master-0 kubenswrapper[18707]: I0320 09:05:48.843681 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlcd6\" (UniqueName: \"kubernetes.io/projected/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-kube-api-access-hlcd6\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.844366 master-0 kubenswrapper[18707]: I0320 09:05:48.843728 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-edpm-a\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.844366 master-0 kubenswrapper[18707]: I0320 09:05:48.843859 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-edpm-b\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.852308 master-0 kubenswrapper[18707]: I0320 09:05:48.845600 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" 
(UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-edpm-a\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.852308 master-0 kubenswrapper[18707]: I0320 09:05:48.850928 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-edpm-b\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.852308 master-0 kubenswrapper[18707]: I0320 09:05:48.851139 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-ovsdbserver-sb\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.852308 master-0 kubenswrapper[18707]: I0320 09:05:48.851746 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-ovsdbserver-sb\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.863879 master-0 kubenswrapper[18707]: I0320 09:05:48.861174 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6974f6bb85-z8znc" Mar 20 09:05:48.875943 master-0 kubenswrapper[18707]: I0320 09:05:48.875827 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlcd6\" (UniqueName: \"kubernetes.io/projected/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-kube-api-access-hlcd6\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.879387 master-0 kubenswrapper[18707]: I0320 09:05:48.879303 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3574-account-create-update-mw5s6"] Mar 20 09:05:48.879869 master-0 kubenswrapper[18707]: E0320 09:05:48.879849 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0912bb76-0a2b-4ecc-92ff-6e13b6271c69" containerName="init" Mar 20 09:05:48.879869 master-0 kubenswrapper[18707]: I0320 09:05:48.879867 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0912bb76-0a2b-4ecc-92ff-6e13b6271c69" containerName="init" Mar 20 09:05:48.879960 master-0 kubenswrapper[18707]: E0320 09:05:48.879900 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0912bb76-0a2b-4ecc-92ff-6e13b6271c69" containerName="dnsmasq-dns" Mar 20 09:05:48.879960 master-0 kubenswrapper[18707]: I0320 09:05:48.879907 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0912bb76-0a2b-4ecc-92ff-6e13b6271c69" containerName="dnsmasq-dns" Mar 20 09:05:48.880678 master-0 kubenswrapper[18707]: I0320 09:05:48.880123 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0912bb76-0a2b-4ecc-92ff-6e13b6271c69" containerName="dnsmasq-dns" Mar 20 09:05:48.881792 master-0 kubenswrapper[18707]: I0320 09:05:48.880869 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3574-account-create-update-mw5s6" Mar 20 09:05:48.882675 master-0 kubenswrapper[18707]: I0320 09:05:48.882655 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 09:05:48.889566 master-0 kubenswrapper[18707]: I0320 09:05:48.889524 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-ovsdbserver-nb\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.889736 master-0 kubenswrapper[18707]: I0320 09:05:48.889589 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-config\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.890161 master-0 kubenswrapper[18707]: I0320 09:05:48.890143 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rbg6\" (UniqueName: \"kubernetes.io/projected/3fd5b27d-6c79-43fb-aa80-d104a32c95c1-kube-api-access-2rbg6\") pod \"cinder-db-create-vtvmv\" (UID: \"3fd5b27d-6c79-43fb-aa80-d104a32c95c1\") " pod="openstack/cinder-db-create-vtvmv" Mar 20 09:05:48.890340 master-0 kubenswrapper[18707]: I0320 09:05:48.890173 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-dns-svc\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.890340 master-0 kubenswrapper[18707]: I0320 09:05:48.890246 18707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-dns-swift-storage-0\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.890340 master-0 kubenswrapper[18707]: I0320 09:05:48.890300 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzbpq\" (UniqueName: \"kubernetes.io/projected/138bebee-e403-4b17-9619-c78025b3dd4c-kube-api-access-mzbpq\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:48.890340 master-0 kubenswrapper[18707]: I0320 09:05:48.890314 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:48.890340 master-0 kubenswrapper[18707]: I0320 09:05:48.890324 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/138bebee-e403-4b17-9619-c78025b3dd4c-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:48.894127 master-0 kubenswrapper[18707]: I0320 09:05:48.893245 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-dns-svc\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.894127 master-0 kubenswrapper[18707]: I0320 09:05:48.893465 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-config\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.894127 master-0 kubenswrapper[18707]: I0320 09:05:48.893526 18707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-ovsdbserver-nb\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.894568 master-0 kubenswrapper[18707]: I0320 09:05:48.894538 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-dns-swift-storage-0\") pod \"dnsmasq-dns-b5b845b79-xn624\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:48.905215 master-0 kubenswrapper[18707]: I0320 09:05:48.902419 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3574-account-create-update-mw5s6"] Mar 20 09:05:48.994852 master-0 kubenswrapper[18707]: I0320 09:05:48.992043 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-ovsdbserver-sb\") pod \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " Mar 20 09:05:48.994852 master-0 kubenswrapper[18707]: I0320 09:05:48.992208 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcp6d\" (UniqueName: \"kubernetes.io/projected/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-kube-api-access-dcp6d\") pod \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " Mar 20 09:05:48.994852 master-0 kubenswrapper[18707]: I0320 09:05:48.992244 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-dns-swift-storage-0\") pod \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\" (UID: 
\"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " Mar 20 09:05:48.994852 master-0 kubenswrapper[18707]: I0320 09:05:48.992336 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-dns-svc\") pod \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " Mar 20 09:05:48.994852 master-0 kubenswrapper[18707]: I0320 09:05:48.992382 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-config\") pod \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " Mar 20 09:05:48.994852 master-0 kubenswrapper[18707]: I0320 09:05:48.992585 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-ovsdbserver-nb\") pod \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " Mar 20 09:05:48.994852 master-0 kubenswrapper[18707]: I0320 09:05:48.992628 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-edpm-a\") pod \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\" (UID: \"0912bb76-0a2b-4ecc-92ff-6e13b6271c69\") " Mar 20 09:05:48.997056 master-0 kubenswrapper[18707]: I0320 09:05:48.997012 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rbg6\" (UniqueName: \"kubernetes.io/projected/3fd5b27d-6c79-43fb-aa80-d104a32c95c1-kube-api-access-2rbg6\") pod \"cinder-db-create-vtvmv\" (UID: \"3fd5b27d-6c79-43fb-aa80-d104a32c95c1\") " pod="openstack/cinder-db-create-vtvmv" Mar 20 09:05:48.997207 master-0 kubenswrapper[18707]: I0320 09:05:48.997172 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd5b27d-6c79-43fb-aa80-d104a32c95c1-operator-scripts\") pod \"cinder-db-create-vtvmv\" (UID: \"3fd5b27d-6c79-43fb-aa80-d104a32c95c1\") " pod="openstack/cinder-db-create-vtvmv" Mar 20 09:05:48.997425 master-0 kubenswrapper[18707]: I0320 09:05:48.997393 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c87d5ca7-5cc3-4027-8725-0c42afdef9e9-operator-scripts\") pod \"cinder-3574-account-create-update-mw5s6\" (UID: \"c87d5ca7-5cc3-4027-8725-0c42afdef9e9\") " pod="openstack/cinder-3574-account-create-update-mw5s6" Mar 20 09:05:48.997589 master-0 kubenswrapper[18707]: I0320 09:05:48.997567 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tc8n\" (UniqueName: \"kubernetes.io/projected/c87d5ca7-5cc3-4027-8725-0c42afdef9e9-kube-api-access-4tc8n\") pod \"cinder-3574-account-create-update-mw5s6\" (UID: \"c87d5ca7-5cc3-4027-8725-0c42afdef9e9\") " pod="openstack/cinder-3574-account-create-update-mw5s6" Mar 20 09:05:49.008344 master-0 kubenswrapper[18707]: I0320 09:05:49.008287 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd5b27d-6c79-43fb-aa80-d104a32c95c1-operator-scripts\") pod \"cinder-db-create-vtvmv\" (UID: \"3fd5b27d-6c79-43fb-aa80-d104a32c95c1\") " pod="openstack/cinder-db-create-vtvmv" Mar 20 09:05:49.035659 master-0 kubenswrapper[18707]: I0320 09:05:49.035588 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-kube-api-access-dcp6d" (OuterVolumeSpecName: "kube-api-access-dcp6d") pod "0912bb76-0a2b-4ecc-92ff-6e13b6271c69" (UID: "0912bb76-0a2b-4ecc-92ff-6e13b6271c69"). 
InnerVolumeSpecName "kube-api-access-dcp6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:49.051841 master-0 kubenswrapper[18707]: I0320 09:05:49.047357 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rbg6\" (UniqueName: \"kubernetes.io/projected/3fd5b27d-6c79-43fb-aa80-d104a32c95c1-kube-api-access-2rbg6\") pod \"cinder-db-create-vtvmv\" (UID: \"3fd5b27d-6c79-43fb-aa80-d104a32c95c1\") " pod="openstack/cinder-db-create-vtvmv" Mar 20 09:05:49.065436 master-0 kubenswrapper[18707]: I0320 09:05:49.061573 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j"] Mar 20 09:05:49.080291 master-0 kubenswrapper[18707]: I0320 09:05:49.067791 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" Mar 20 09:05:49.080291 master-0 kubenswrapper[18707]: I0320 09:05:49.070671 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"edpm-a-provisionserver-httpd-config" Mar 20 09:05:49.088893 master-0 kubenswrapper[18707]: I0320 09:05:49.082362 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0912bb76-0a2b-4ecc-92ff-6e13b6271c69" (UID: "0912bb76-0a2b-4ecc-92ff-6e13b6271c69"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:49.088893 master-0 kubenswrapper[18707]: I0320 09:05:49.088786 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-config" (OuterVolumeSpecName: "config") pod "0912bb76-0a2b-4ecc-92ff-6e13b6271c69" (UID: "0912bb76-0a2b-4ecc-92ff-6e13b6271c69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:49.089320 master-0 kubenswrapper[18707]: I0320 09:05:49.089247 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0912bb76-0a2b-4ecc-92ff-6e13b6271c69" (UID: "0912bb76-0a2b-4ecc-92ff-6e13b6271c69"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:49.101614 master-0 kubenswrapper[18707]: I0320 09:05:49.100984 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c87d5ca7-5cc3-4027-8725-0c42afdef9e9-operator-scripts\") pod \"cinder-3574-account-create-update-mw5s6\" (UID: \"c87d5ca7-5cc3-4027-8725-0c42afdef9e9\") " pod="openstack/cinder-3574-account-create-update-mw5s6" Mar 20 09:05:49.101614 master-0 kubenswrapper[18707]: I0320 09:05:49.101111 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tc8n\" (UniqueName: \"kubernetes.io/projected/c87d5ca7-5cc3-4027-8725-0c42afdef9e9-kube-api-access-4tc8n\") pod \"cinder-3574-account-create-update-mw5s6\" (UID: \"c87d5ca7-5cc3-4027-8725-0c42afdef9e9\") " pod="openstack/cinder-3574-account-create-update-mw5s6" Mar 20 09:05:49.103823 master-0 kubenswrapper[18707]: I0320 09:05:49.102848 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcp6d\" (UniqueName: \"kubernetes.io/projected/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-kube-api-access-dcp6d\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:49.104177 master-0 kubenswrapper[18707]: I0320 09:05:49.103859 18707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:49.104177 master-0 
kubenswrapper[18707]: I0320 09:05:49.103876 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:49.104659 master-0 kubenswrapper[18707]: I0320 09:05:49.104614 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c87d5ca7-5cc3-4027-8725-0c42afdef9e9-operator-scripts\") pod \"cinder-3574-account-create-update-mw5s6\" (UID: \"c87d5ca7-5cc3-4027-8725-0c42afdef9e9\") " pod="openstack/cinder-3574-account-create-update-mw5s6" Mar 20 09:05:49.104709 master-0 kubenswrapper[18707]: I0320 09:05:49.104699 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:49.107379 master-0 kubenswrapper[18707]: I0320 09:05:49.107086 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0912bb76-0a2b-4ecc-92ff-6e13b6271c69" (UID: "0912bb76-0a2b-4ecc-92ff-6e13b6271c69"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:49.122611 master-0 kubenswrapper[18707]: I0320 09:05:49.122249 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "0912bb76-0a2b-4ecc-92ff-6e13b6271c69" (UID: "0912bb76-0a2b-4ecc-92ff-6e13b6271c69"). InnerVolumeSpecName "edpm-a". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:49.149961 master-0 kubenswrapper[18707]: I0320 09:05:49.147260 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0912bb76-0a2b-4ecc-92ff-6e13b6271c69" (UID: "0912bb76-0a2b-4ecc-92ff-6e13b6271c69"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:49.160143 master-0 kubenswrapper[18707]: I0320 09:05:49.157262 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:49.160143 master-0 kubenswrapper[18707]: I0320 09:05:49.157384 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tc8n\" (UniqueName: \"kubernetes.io/projected/c87d5ca7-5cc3-4027-8725-0c42afdef9e9-kube-api-access-4tc8n\") pod \"cinder-3574-account-create-update-mw5s6\" (UID: \"c87d5ca7-5cc3-4027-8725-0c42afdef9e9\") " pod="openstack/cinder-3574-account-create-update-mw5s6" Mar 20 09:05:49.220531 master-0 kubenswrapper[18707]: I0320 09:05:49.208803 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vtvmv" Mar 20 09:05:49.220531 master-0 kubenswrapper[18707]: I0320 09:05:49.210889 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/90e3df0d-6e71-463d-9816-15b12a376333-image-data\") pod \"edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j\" (UID: \"90e3df0d-6e71-463d-9816-15b12a376333\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" Mar 20 09:05:49.220531 master-0 kubenswrapper[18707]: I0320 09:05:49.211506 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9tpq\" (UniqueName: \"kubernetes.io/projected/90e3df0d-6e71-463d-9816-15b12a376333-kube-api-access-n9tpq\") pod \"edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j\" (UID: \"90e3df0d-6e71-463d-9816-15b12a376333\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" Mar 20 09:05:49.220531 master-0 kubenswrapper[18707]: I0320 09:05:49.212109 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/configmap/90e3df0d-6e71-463d-9816-15b12a376333-httpd-config\") pod \"edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j\" (UID: \"90e3df0d-6e71-463d-9816-15b12a376333\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" Mar 20 09:05:49.220531 master-0 kubenswrapper[18707]: I0320 09:05:49.219900 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:49.220531 master-0 kubenswrapper[18707]: I0320 09:05:49.219933 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: 
\"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:49.220531 master-0 kubenswrapper[18707]: I0320 09:05:49.219945 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0912bb76-0a2b-4ecc-92ff-6e13b6271c69-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:49.230334 master-0 kubenswrapper[18707]: I0320 09:05:49.226671 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3574-account-create-update-mw5s6" Mar 20 09:05:49.306280 master-0 kubenswrapper[18707]: I0320 09:05:49.306150 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jg5hc"] Mar 20 09:05:49.308493 master-0 kubenswrapper[18707]: I0320 09:05:49.308444 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jg5hc"] Mar 20 09:05:49.308493 master-0 kubenswrapper[18707]: I0320 09:05:49.308480 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-bbvbz"] Mar 20 09:05:49.310107 master-0 kubenswrapper[18707]: I0320 09:05:49.310071 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-bbvbz"] Mar 20 09:05:49.310339 master-0 kubenswrapper[18707]: I0320 09:05:49.310311 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-bbvbz" Mar 20 09:05:49.311076 master-0 kubenswrapper[18707]: I0320 09:05:49.310999 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jg5hc" Mar 20 09:05:49.316442 master-0 kubenswrapper[18707]: I0320 09:05:49.314992 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 09:05:49.316442 master-0 kubenswrapper[18707]: I0320 09:05:49.315296 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 09:05:49.316925 master-0 kubenswrapper[18707]: I0320 09:05:49.316548 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 09:05:49.333760 master-0 kubenswrapper[18707]: I0320 09:05:49.330837 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/configmap/90e3df0d-6e71-463d-9816-15b12a376333-httpd-config\") pod \"edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j\" (UID: \"90e3df0d-6e71-463d-9816-15b12a376333\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" Mar 20 09:05:49.333760 master-0 kubenswrapper[18707]: I0320 09:05:49.331196 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23011402-5090-4dcc-a36e-f9dcb3db8946-combined-ca-bundle\") pod \"keystone-db-sync-jg5hc\" (UID: \"23011402-5090-4dcc-a36e-f9dcb3db8946\") " pod="openstack/keystone-db-sync-jg5hc" Mar 20 09:05:49.333760 master-0 kubenswrapper[18707]: I0320 09:05:49.331235 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52457bb1-e081-4589-bbe2-3aaadfb92b31-operator-scripts\") pod \"neutron-db-create-bbvbz\" (UID: \"52457bb1-e081-4589-bbe2-3aaadfb92b31\") " pod="openstack/neutron-db-create-bbvbz" Mar 20 09:05:49.333760 master-0 kubenswrapper[18707]: I0320 09:05:49.331281 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/90e3df0d-6e71-463d-9816-15b12a376333-image-data\") pod \"edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j\" (UID: \"90e3df0d-6e71-463d-9816-15b12a376333\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" Mar 20 09:05:49.333760 master-0 kubenswrapper[18707]: I0320 09:05:49.331416 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9tpq\" (UniqueName: \"kubernetes.io/projected/90e3df0d-6e71-463d-9816-15b12a376333-kube-api-access-n9tpq\") pod \"edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j\" (UID: \"90e3df0d-6e71-463d-9816-15b12a376333\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" Mar 20 09:05:49.333760 master-0 kubenswrapper[18707]: I0320 09:05:49.331440 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zfjg\" (UniqueName: \"kubernetes.io/projected/52457bb1-e081-4589-bbe2-3aaadfb92b31-kube-api-access-2zfjg\") pod \"neutron-db-create-bbvbz\" (UID: \"52457bb1-e081-4589-bbe2-3aaadfb92b31\") " pod="openstack/neutron-db-create-bbvbz" Mar 20 09:05:49.333760 master-0 kubenswrapper[18707]: I0320 09:05:49.331465 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkgfc\" (UniqueName: \"kubernetes.io/projected/23011402-5090-4dcc-a36e-f9dcb3db8946-kube-api-access-tkgfc\") pod \"keystone-db-sync-jg5hc\" (UID: \"23011402-5090-4dcc-a36e-f9dcb3db8946\") " pod="openstack/keystone-db-sync-jg5hc" Mar 20 09:05:49.333760 master-0 kubenswrapper[18707]: I0320 09:05:49.331661 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23011402-5090-4dcc-a36e-f9dcb3db8946-config-data\") pod 
\"keystone-db-sync-jg5hc\" (UID: \"23011402-5090-4dcc-a36e-f9dcb3db8946\") " pod="openstack/keystone-db-sync-jg5hc" Mar 20 09:05:49.333760 master-0 kubenswrapper[18707]: I0320 09:05:49.332093 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/configmap/90e3df0d-6e71-463d-9816-15b12a376333-httpd-config\") pod \"edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j\" (UID: \"90e3df0d-6e71-463d-9816-15b12a376333\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" Mar 20 09:05:49.333760 master-0 kubenswrapper[18707]: I0320 09:05:49.332316 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/90e3df0d-6e71-463d-9816-15b12a376333-image-data\") pod \"edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j\" (UID: \"90e3df0d-6e71-463d-9816-15b12a376333\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" Mar 20 09:05:49.436195 master-0 kubenswrapper[18707]: I0320 09:05:49.435995 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23011402-5090-4dcc-a36e-f9dcb3db8946-combined-ca-bundle\") pod \"keystone-db-sync-jg5hc\" (UID: \"23011402-5090-4dcc-a36e-f9dcb3db8946\") " pod="openstack/keystone-db-sync-jg5hc" Mar 20 09:05:49.436537 master-0 kubenswrapper[18707]: I0320 09:05:49.436510 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52457bb1-e081-4589-bbe2-3aaadfb92b31-operator-scripts\") pod \"neutron-db-create-bbvbz\" (UID: \"52457bb1-e081-4589-bbe2-3aaadfb92b31\") " pod="openstack/neutron-db-create-bbvbz" Mar 20 09:05:49.436668 master-0 kubenswrapper[18707]: I0320 09:05:49.436647 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2zfjg\" (UniqueName: \"kubernetes.io/projected/52457bb1-e081-4589-bbe2-3aaadfb92b31-kube-api-access-2zfjg\") pod \"neutron-db-create-bbvbz\" (UID: \"52457bb1-e081-4589-bbe2-3aaadfb92b31\") " pod="openstack/neutron-db-create-bbvbz" Mar 20 09:05:49.436747 master-0 kubenswrapper[18707]: I0320 09:05:49.436678 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkgfc\" (UniqueName: \"kubernetes.io/projected/23011402-5090-4dcc-a36e-f9dcb3db8946-kube-api-access-tkgfc\") pod \"keystone-db-sync-jg5hc\" (UID: \"23011402-5090-4dcc-a36e-f9dcb3db8946\") " pod="openstack/keystone-db-sync-jg5hc" Mar 20 09:05:49.436871 master-0 kubenswrapper[18707]: I0320 09:05:49.436846 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23011402-5090-4dcc-a36e-f9dcb3db8946-config-data\") pod \"keystone-db-sync-jg5hc\" (UID: \"23011402-5090-4dcc-a36e-f9dcb3db8946\") " pod="openstack/keystone-db-sync-jg5hc" Mar 20 09:05:49.437923 master-0 kubenswrapper[18707]: I0320 09:05:49.437885 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52457bb1-e081-4589-bbe2-3aaadfb92b31-operator-scripts\") pod \"neutron-db-create-bbvbz\" (UID: \"52457bb1-e081-4589-bbe2-3aaadfb92b31\") " pod="openstack/neutron-db-create-bbvbz" Mar 20 09:05:49.441883 master-0 kubenswrapper[18707]: I0320 09:05:49.439856 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23011402-5090-4dcc-a36e-f9dcb3db8946-combined-ca-bundle\") pod \"keystone-db-sync-jg5hc\" (UID: \"23011402-5090-4dcc-a36e-f9dcb3db8946\") " pod="openstack/keystone-db-sync-jg5hc" Mar 20 09:05:49.450390 master-0 kubenswrapper[18707]: I0320 09:05:49.445060 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/23011402-5090-4dcc-a36e-f9dcb3db8946-config-data\") pod \"keystone-db-sync-jg5hc\" (UID: \"23011402-5090-4dcc-a36e-f9dcb3db8946\") " pod="openstack/keystone-db-sync-jg5hc" Mar 20 09:05:49.647370 master-0 kubenswrapper[18707]: I0320 09:05:49.646866 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:49.647688 master-0 kubenswrapper[18707]: I0320 09:05:49.647650 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974f6bb85-z8znc" event={"ID":"0912bb76-0a2b-4ecc-92ff-6e13b6271c69","Type":"ContainerDied","Data":"8b632b3ba9edb8e2987dd26877ed052a03bb3a95ef1593b54e837c61d9d9d184"} Mar 20 09:05:49.647778 master-0 kubenswrapper[18707]: I0320 09:05:49.647756 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6974f6bb85-z8znc" Mar 20 09:05:49.647865 master-0 kubenswrapper[18707]: I0320 09:05:49.647838 18707 scope.go:117] "RemoveContainer" containerID="2d321d2e4b95ca33632026f854c34906a05166ea003356db1afedac316a6cdd3" Mar 20 09:05:49.649785 master-0 kubenswrapper[18707]: I0320 09:05:49.647704 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d96bd4f7c-7qqmd" Mar 20 09:05:49.660035 master-0 kubenswrapper[18707]: I0320 09:05:49.659840 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:49.663536 master-0 kubenswrapper[18707]: I0320 09:05:49.663503 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 09:05:49.677180 master-0 kubenswrapper[18707]: I0320 09:05:49.677084 18707 scope.go:117] "RemoveContainer" containerID="0101d045bfaf4f976767909dd68297d00498f9c1470f0da6d3064f48c3ba1d12" Mar 20 09:05:49.724219 master-0 kubenswrapper[18707]: I0320 09:05:49.723869 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkgfc\" (UniqueName: \"kubernetes.io/projected/23011402-5090-4dcc-a36e-f9dcb3db8946-kube-api-access-tkgfc\") pod \"keystone-db-sync-jg5hc\" (UID: \"23011402-5090-4dcc-a36e-f9dcb3db8946\") " pod="openstack/keystone-db-sync-jg5hc" Mar 20 09:05:49.739820 master-0 kubenswrapper[18707]: I0320 09:05:49.739750 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zfjg\" (UniqueName: \"kubernetes.io/projected/52457bb1-e081-4589-bbe2-3aaadfb92b31-kube-api-access-2zfjg\") pod \"neutron-db-create-bbvbz\" (UID: \"52457bb1-e081-4589-bbe2-3aaadfb92b31\") " pod="openstack/neutron-db-create-bbvbz" Mar 20 09:05:49.752939 master-0 kubenswrapper[18707]: I0320 09:05:49.752861 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9tpq\" (UniqueName: \"kubernetes.io/projected/90e3df0d-6e71-463d-9816-15b12a376333-kube-api-access-n9tpq\") pod \"edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j\" (UID: \"90e3df0d-6e71-463d-9816-15b12a376333\") " pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" Mar 20 09:05:49.963219 master-0 kubenswrapper[18707]: I0320 09:05:49.940940 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" Mar 20 09:05:49.963219 master-0 kubenswrapper[18707]: I0320 09:05:49.955425 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-bbvbz" Mar 20 09:05:49.963219 master-0 kubenswrapper[18707]: I0320 09:05:49.959288 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b5b845b79-xn624"] Mar 20 09:05:49.983209 master-0 kubenswrapper[18707]: I0320 09:05:49.969371 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3e52-account-create-update-qlzvf"] Mar 20 09:05:49.983209 master-0 kubenswrapper[18707]: I0320 09:05:49.970971 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3e52-account-create-update-qlzvf" Mar 20 09:05:49.983209 master-0 kubenswrapper[18707]: I0320 09:05:49.973668 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 09:05:49.983209 master-0 kubenswrapper[18707]: I0320 09:05:49.974368 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jg5hc" Mar 20 09:05:50.072220 master-0 kubenswrapper[18707]: I0320 09:05:50.068403 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-ovsdbserver-nb\") pod \"d17ed229-469c-4f0e-8def-a7322e154c7e\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " Mar 20 09:05:50.072220 master-0 kubenswrapper[18707]: I0320 09:05:50.068467 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-dns-svc\") pod \"d17ed229-469c-4f0e-8def-a7322e154c7e\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " Mar 20 09:05:50.072220 master-0 kubenswrapper[18707]: I0320 09:05:50.068510 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-edpm-b\") pod \"d17ed229-469c-4f0e-8def-a7322e154c7e\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " Mar 20 09:05:50.072220 master-0 kubenswrapper[18707]: I0320 09:05:50.068537 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-config\") pod \"d17ed229-469c-4f0e-8def-a7322e154c7e\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " Mar 20 09:05:50.072220 master-0 kubenswrapper[18707]: I0320 09:05:50.068597 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztzln\" (UniqueName: \"kubernetes.io/projected/d17ed229-469c-4f0e-8def-a7322e154c7e-kube-api-access-ztzln\") pod \"d17ed229-469c-4f0e-8def-a7322e154c7e\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " Mar 20 09:05:50.072220 master-0 kubenswrapper[18707]: I0320 09:05:50.068661 18707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-edpm-a\") pod \"d17ed229-469c-4f0e-8def-a7322e154c7e\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " Mar 20 09:05:50.072220 master-0 kubenswrapper[18707]: I0320 09:05:50.068700 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-dns-swift-storage-0\") pod \"d17ed229-469c-4f0e-8def-a7322e154c7e\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " Mar 20 09:05:50.072220 master-0 kubenswrapper[18707]: I0320 09:05:50.068801 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-ovsdbserver-sb\") pod \"d17ed229-469c-4f0e-8def-a7322e154c7e\" (UID: \"d17ed229-469c-4f0e-8def-a7322e154c7e\") " Mar 20 09:05:50.095989 master-0 kubenswrapper[18707]: I0320 09:05:50.095920 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d17ed229-469c-4f0e-8def-a7322e154c7e" (UID: "d17ed229-469c-4f0e-8def-a7322e154c7e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:50.100205 master-0 kubenswrapper[18707]: I0320 09:05:50.096545 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d17ed229-469c-4f0e-8def-a7322e154c7e" (UID: "d17ed229-469c-4f0e-8def-a7322e154c7e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:50.100205 master-0 kubenswrapper[18707]: I0320 09:05:50.097037 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-edpm-b" (OuterVolumeSpecName: "edpm-b") pod "d17ed229-469c-4f0e-8def-a7322e154c7e" (UID: "d17ed229-469c-4f0e-8def-a7322e154c7e"). InnerVolumeSpecName "edpm-b". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:50.100205 master-0 kubenswrapper[18707]: I0320 09:05:50.097529 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-config" (OuterVolumeSpecName: "config") pod "d17ed229-469c-4f0e-8def-a7322e154c7e" (UID: "d17ed229-469c-4f0e-8def-a7322e154c7e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:50.100205 master-0 kubenswrapper[18707]: I0320 09:05:50.098213 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d17ed229-469c-4f0e-8def-a7322e154c7e" (UID: "d17ed229-469c-4f0e-8def-a7322e154c7e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:50.118254 master-0 kubenswrapper[18707]: I0320 09:05:50.103860 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "d17ed229-469c-4f0e-8def-a7322e154c7e" (UID: "d17ed229-469c-4f0e-8def-a7322e154c7e"). InnerVolumeSpecName "edpm-a". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:50.118254 master-0 kubenswrapper[18707]: I0320 09:05:50.104351 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d17ed229-469c-4f0e-8def-a7322e154c7e" (UID: "d17ed229-469c-4f0e-8def-a7322e154c7e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:50.197204 master-0 kubenswrapper[18707]: I0320 09:05:50.121703 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17ed229-469c-4f0e-8def-a7322e154c7e-kube-api-access-ztzln" (OuterVolumeSpecName: "kube-api-access-ztzln") pod "d17ed229-469c-4f0e-8def-a7322e154c7e" (UID: "d17ed229-469c-4f0e-8def-a7322e154c7e"). InnerVolumeSpecName "kube-api-access-ztzln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:50.197204 master-0 kubenswrapper[18707]: I0320 09:05:50.145466 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-vtvmv"] Mar 20 09:05:50.197204 master-0 kubenswrapper[18707]: I0320 09:05:50.172146 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:50.197204 master-0 kubenswrapper[18707]: I0320 09:05:50.172299 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:50.197204 master-0 kubenswrapper[18707]: I0320 09:05:50.172357 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-edpm-b\") on node \"master-0\" 
DevicePath \"\"" Mar 20 09:05:50.197204 master-0 kubenswrapper[18707]: I0320 09:05:50.172382 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:50.197204 master-0 kubenswrapper[18707]: I0320 09:05:50.172406 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztzln\" (UniqueName: \"kubernetes.io/projected/d17ed229-469c-4f0e-8def-a7322e154c7e-kube-api-access-ztzln\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:50.197204 master-0 kubenswrapper[18707]: I0320 09:05:50.172430 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:50.197204 master-0 kubenswrapper[18707]: I0320 09:05:50.172456 18707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:50.197204 master-0 kubenswrapper[18707]: I0320 09:05:50.172547 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d17ed229-469c-4f0e-8def-a7322e154c7e-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:50.197204 master-0 kubenswrapper[18707]: I0320 09:05:50.177888 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3574-account-create-update-mw5s6"] Mar 20 09:05:50.217427 master-0 kubenswrapper[18707]: W0320 09:05:50.204295 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90e3df0d_6e71_463d_9816_15b12a376333.slice/crio-0681d9538d7029b95d3ea9cecca3dcbf06b08bd8ad96e2ee31891b203cdab112 WatchSource:0}: Error finding container 
0681d9538d7029b95d3ea9cecca3dcbf06b08bd8ad96e2ee31891b203cdab112: Status 404 returned error can't find the container with id 0681d9538d7029b95d3ea9cecca3dcbf06b08bd8ad96e2ee31891b203cdab112 Mar 20 09:05:50.217427 master-0 kubenswrapper[18707]: I0320 09:05:50.214265 18707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:05:50.217427 master-0 kubenswrapper[18707]: I0320 09:05:50.215625 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3e52-account-create-update-qlzvf"] Mar 20 09:05:50.278027 master-0 kubenswrapper[18707]: I0320 09:05:50.277961 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cnd7\" (UniqueName: \"kubernetes.io/projected/a238c22f-d559-4c69-aeb0-c16f67befc9e-kube-api-access-8cnd7\") pod \"neutron-3e52-account-create-update-qlzvf\" (UID: \"a238c22f-d559-4c69-aeb0-c16f67befc9e\") " pod="openstack/neutron-3e52-account-create-update-qlzvf" Mar 20 09:05:50.280441 master-0 kubenswrapper[18707]: I0320 09:05:50.278657 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a238c22f-d559-4c69-aeb0-c16f67befc9e-operator-scripts\") pod \"neutron-3e52-account-create-update-qlzvf\" (UID: \"a238c22f-d559-4c69-aeb0-c16f67befc9e\") " pod="openstack/neutron-3e52-account-create-update-qlzvf" Mar 20 09:05:50.387160 master-0 kubenswrapper[18707]: I0320 09:05:50.387102 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cnd7\" (UniqueName: \"kubernetes.io/projected/a238c22f-d559-4c69-aeb0-c16f67befc9e-kube-api-access-8cnd7\") pod \"neutron-3e52-account-create-update-qlzvf\" (UID: \"a238c22f-d559-4c69-aeb0-c16f67befc9e\") " pod="openstack/neutron-3e52-account-create-update-qlzvf" Mar 20 09:05:50.387415 master-0 kubenswrapper[18707]: I0320 09:05:50.387326 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a238c22f-d559-4c69-aeb0-c16f67befc9e-operator-scripts\") pod \"neutron-3e52-account-create-update-qlzvf\" (UID: \"a238c22f-d559-4c69-aeb0-c16f67befc9e\") " pod="openstack/neutron-3e52-account-create-update-qlzvf" Mar 20 09:05:50.389462 master-0 kubenswrapper[18707]: I0320 09:05:50.389434 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a238c22f-d559-4c69-aeb0-c16f67befc9e-operator-scripts\") pod \"neutron-3e52-account-create-update-qlzvf\" (UID: \"a238c22f-d559-4c69-aeb0-c16f67befc9e\") " pod="openstack/neutron-3e52-account-create-update-qlzvf" Mar 20 09:05:50.395606 master-0 kubenswrapper[18707]: I0320 09:05:50.395541 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6974f6bb85-z8znc"] Mar 20 09:05:50.727346 master-0 kubenswrapper[18707]: I0320 09:05:50.710231 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cnd7\" (UniqueName: \"kubernetes.io/projected/a238c22f-d559-4c69-aeb0-c16f67befc9e-kube-api-access-8cnd7\") pod \"neutron-3e52-account-create-update-qlzvf\" (UID: \"a238c22f-d559-4c69-aeb0-c16f67befc9e\") " pod="openstack/neutron-3e52-account-create-update-qlzvf" Mar 20 09:05:50.736358 master-0 kubenswrapper[18707]: I0320 09:05:50.735724 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3574-account-create-update-mw5s6" event={"ID":"c87d5ca7-5cc3-4027-8725-0c42afdef9e9","Type":"ContainerStarted","Data":"dd4bf7ffba95c34ea4c55cce49e083869243ef172e3feb457415d87ff262f530"} Mar 20 09:05:50.740299 master-0 kubenswrapper[18707]: I0320 09:05:50.737700 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vtvmv" 
event={"ID":"3fd5b27d-6c79-43fb-aa80-d104a32c95c1","Type":"ContainerStarted","Data":"43eee35e759a3c13cec9ebfdf2378e0939f277f66cac06ecb8cc9f317e767305"} Mar 20 09:05:50.740299 master-0 kubenswrapper[18707]: I0320 09:05:50.737751 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vtvmv" event={"ID":"3fd5b27d-6c79-43fb-aa80-d104a32c95c1","Type":"ContainerStarted","Data":"b2833f0e3ef26c7eadf6146626e2d5e925d17c4edeea3c06cb43eee13b6aac76"} Mar 20 09:05:50.750393 master-0 kubenswrapper[18707]: W0320 09:05:50.750342 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23011402_5090_4dcc_a36e_f9dcb3db8946.slice/crio-4d7907588de1e2a76a98a7608fc999222f8b019500b781314ed5124cbc89ab94 WatchSource:0}: Error finding container 4d7907588de1e2a76a98a7608fc999222f8b019500b781314ed5124cbc89ab94: Status 404 returned error can't find the container with id 4d7907588de1e2a76a98a7608fc999222f8b019500b781314ed5124cbc89ab94 Mar 20 09:05:50.754692 master-0 kubenswrapper[18707]: I0320 09:05:50.754645 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6974f6bb85-z8znc"] Mar 20 09:05:50.769282 master-0 kubenswrapper[18707]: I0320 09:05:50.768710 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jg5hc"] Mar 20 09:05:50.769282 master-0 kubenswrapper[18707]: I0320 09:05:50.769107 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b5b845b79-xn624" event={"ID":"28a3f9e1-5276-45c1-b6f3-94d2f09a223e","Type":"ContainerStarted","Data":"1e0784ea8f392a1d8a0a9db26e7fdf9ef30dc3a25a506f0f23d2aa1de0dfb6ec"} Mar 20 09:05:50.790246 master-0 kubenswrapper[18707]: I0320 09:05:50.790210 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d58447bc7-xn5wl" Mar 20 09:05:50.790443 master-0 kubenswrapper[18707]: I0320 09:05:50.790264 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" event={"ID":"90e3df0d-6e71-463d-9816-15b12a376333","Type":"ContainerStarted","Data":"0681d9538d7029b95d3ea9cecca3dcbf06b08bd8ad96e2ee31891b203cdab112"} Mar 20 09:05:50.898818 master-0 kubenswrapper[18707]: I0320 09:05:50.886702 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-bbvbz"] Mar 20 09:05:50.900481 master-0 kubenswrapper[18707]: I0320 09:05:50.898954 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3e52-account-create-update-qlzvf" Mar 20 09:05:50.927380 master-0 kubenswrapper[18707]: I0320 09:05:50.926410 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr"] Mar 20 09:05:50.993076 master-0 kubenswrapper[18707]: I0320 09:05:50.988716 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-vtvmv" podStartSLOduration=2.988698034 podStartE2EDuration="2.988698034s" podCreationTimestamp="2026-03-20 09:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:50.931947212 +0000 UTC m=+1496.088127578" watchObservedRunningTime="2026-03-20 09:05:50.988698034 +0000 UTC m=+1496.144878390" Mar 20 09:05:51.031030 master-0 kubenswrapper[18707]: I0320 09:05:51.030970 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d96bd4f7c-7qqmd"] Mar 20 09:05:51.031030 master-0 kubenswrapper[18707]: I0320 09:05:51.031022 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d96bd4f7c-7qqmd"] Mar 20 09:05:51.031925 master-0 
kubenswrapper[18707]: I0320 09:05:51.031217 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 09:05:51.038822 master-0 kubenswrapper[18707]: I0320 09:05:51.038334 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"edpm-b-provisionserver-httpd-config" Mar 20 09:05:51.052278 master-0 kubenswrapper[18707]: I0320 09:05:51.052087 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d58447bc7-xn5wl"] Mar 20 09:05:51.072805 master-0 kubenswrapper[18707]: I0320 09:05:51.065646 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d58447bc7-xn5wl"] Mar 20 09:05:51.118410 master-0 kubenswrapper[18707]: I0320 09:05:51.114019 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0912bb76-0a2b-4ecc-92ff-6e13b6271c69" path="/var/lib/kubelet/pods/0912bb76-0a2b-4ecc-92ff-6e13b6271c69/volumes" Mar 20 09:05:51.118410 master-0 kubenswrapper[18707]: I0320 09:05:51.114781 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="138bebee-e403-4b17-9619-c78025b3dd4c" path="/var/lib/kubelet/pods/138bebee-e403-4b17-9619-c78025b3dd4c/volumes" Mar 20 09:05:51.118410 master-0 kubenswrapper[18707]: I0320 09:05:51.115174 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d17ed229-469c-4f0e-8def-a7322e154c7e" path="/var/lib/kubelet/pods/d17ed229-469c-4f0e-8def-a7322e154c7e/volumes" Mar 20 09:05:51.127208 master-0 kubenswrapper[18707]: I0320 09:05:51.121495 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc4g8\" (UniqueName: \"kubernetes.io/projected/c177a9fe-76d7-4325-8968-c7178ad8c75a-kube-api-access-tc4g8\") pod \"edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr\" (UID: \"c177a9fe-76d7-4325-8968-c7178ad8c75a\") " 
pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 09:05:51.127208 master-0 kubenswrapper[18707]: I0320 09:05:51.121678 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/c177a9fe-76d7-4325-8968-c7178ad8c75a-image-data\") pod \"edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr\" (UID: \"c177a9fe-76d7-4325-8968-c7178ad8c75a\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 09:05:51.127208 master-0 kubenswrapper[18707]: I0320 09:05:51.121734 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/configmap/c177a9fe-76d7-4325-8968-c7178ad8c75a-httpd-config\") pod \"edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr\" (UID: \"c177a9fe-76d7-4325-8968-c7178ad8c75a\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 09:05:51.225143 master-0 kubenswrapper[18707]: I0320 09:05:51.225004 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/c177a9fe-76d7-4325-8968-c7178ad8c75a-image-data\") pod \"edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr\" (UID: \"c177a9fe-76d7-4325-8968-c7178ad8c75a\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 09:05:51.225859 master-0 kubenswrapper[18707]: I0320 09:05:51.225814 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/configmap/c177a9fe-76d7-4325-8968-c7178ad8c75a-httpd-config\") pod \"edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr\" (UID: \"c177a9fe-76d7-4325-8968-c7178ad8c75a\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 
09:05:51.226178 master-0 kubenswrapper[18707]: I0320 09:05:51.226134 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/c177a9fe-76d7-4325-8968-c7178ad8c75a-image-data\") pod \"edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr\" (UID: \"c177a9fe-76d7-4325-8968-c7178ad8c75a\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 09:05:51.226487 master-0 kubenswrapper[18707]: I0320 09:05:51.226379 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc4g8\" (UniqueName: \"kubernetes.io/projected/c177a9fe-76d7-4325-8968-c7178ad8c75a-kube-api-access-tc4g8\") pod \"edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr\" (UID: \"c177a9fe-76d7-4325-8968-c7178ad8c75a\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 09:05:51.227757 master-0 kubenswrapper[18707]: I0320 09:05:51.227198 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/configmap/c177a9fe-76d7-4325-8968-c7178ad8c75a-httpd-config\") pod \"edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr\" (UID: \"c177a9fe-76d7-4325-8968-c7178ad8c75a\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 09:05:51.264542 master-0 kubenswrapper[18707]: I0320 09:05:51.264496 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc4g8\" (UniqueName: \"kubernetes.io/projected/c177a9fe-76d7-4325-8968-c7178ad8c75a-kube-api-access-tc4g8\") pod \"edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr\" (UID: \"c177a9fe-76d7-4325-8968-c7178ad8c75a\") " pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 09:05:51.379854 master-0 kubenswrapper[18707]: I0320 09:05:51.379814 18707 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 09:05:51.424460 master-0 kubenswrapper[18707]: W0320 09:05:51.424402 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc177a9fe_76d7_4325_8968_c7178ad8c75a.slice/crio-58729020b218323b9010a0c79cd84437ab8a9efcc9c5beed481665de539b2b99 WatchSource:0}: Error finding container 58729020b218323b9010a0c79cd84437ab8a9efcc9c5beed481665de539b2b99: Status 404 returned error can't find the container with id 58729020b218323b9010a0c79cd84437ab8a9efcc9c5beed481665de539b2b99 Mar 20 09:05:51.537447 master-0 kubenswrapper[18707]: I0320 09:05:51.537378 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3e52-account-create-update-qlzvf"] Mar 20 09:05:51.827213 master-0 kubenswrapper[18707]: I0320 09:05:51.825876 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3e52-account-create-update-qlzvf" event={"ID":"a238c22f-d559-4c69-aeb0-c16f67befc9e","Type":"ContainerStarted","Data":"d99e9e87acc39b2aae4283d979f6e27f6734207a08061f99f0210ba24143eb5f"} Mar 20 09:05:51.827213 master-0 kubenswrapper[18707]: I0320 09:05:51.825931 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3e52-account-create-update-qlzvf" event={"ID":"a238c22f-d559-4c69-aeb0-c16f67befc9e","Type":"ContainerStarted","Data":"448399627f0b3bf4c7f1d94431243c55c10df6e29d7ac4c5c2f9f7df50b293a0"} Mar 20 09:05:51.837574 master-0 kubenswrapper[18707]: I0320 09:05:51.837523 18707 generic.go:334] "Generic (PLEG): container finished" podID="28a3f9e1-5276-45c1-b6f3-94d2f09a223e" containerID="b4c86c561b572525f81462d1f7a1709baa7ff16c4915810b9218e95aaa207747" exitCode=0 Mar 20 09:05:51.837791 master-0 kubenswrapper[18707]: I0320 09:05:51.837620 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b5b845b79-xn624" 
event={"ID":"28a3f9e1-5276-45c1-b6f3-94d2f09a223e","Type":"ContainerDied","Data":"b4c86c561b572525f81462d1f7a1709baa7ff16c4915810b9218e95aaa207747"} Mar 20 09:05:51.846658 master-0 kubenswrapper[18707]: I0320 09:05:51.840501 18707 generic.go:334] "Generic (PLEG): container finished" podID="52457bb1-e081-4589-bbe2-3aaadfb92b31" containerID="7f665346c34b1ed859296c42869b694dd76f834a391e6df93c790ecda567f987" exitCode=0 Mar 20 09:05:51.846658 master-0 kubenswrapper[18707]: I0320 09:05:51.840548 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bbvbz" event={"ID":"52457bb1-e081-4589-bbe2-3aaadfb92b31","Type":"ContainerDied","Data":"7f665346c34b1ed859296c42869b694dd76f834a391e6df93c790ecda567f987"} Mar 20 09:05:51.846658 master-0 kubenswrapper[18707]: I0320 09:05:51.840569 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bbvbz" event={"ID":"52457bb1-e081-4589-bbe2-3aaadfb92b31","Type":"ContainerStarted","Data":"ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e"} Mar 20 09:05:51.849148 master-0 kubenswrapper[18707]: I0320 09:05:51.849093 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jg5hc" event={"ID":"23011402-5090-4dcc-a36e-f9dcb3db8946","Type":"ContainerStarted","Data":"4d7907588de1e2a76a98a7608fc999222f8b019500b781314ed5124cbc89ab94"} Mar 20 09:05:51.865212 master-0 kubenswrapper[18707]: I0320 09:05:51.852927 18707 generic.go:334] "Generic (PLEG): container finished" podID="c87d5ca7-5cc3-4027-8725-0c42afdef9e9" containerID="ab43a5bdeb2dcfd47f72404e9a05ff319ff68d1b2595cae8c0932abb9637f51c" exitCode=0 Mar 20 09:05:51.865212 master-0 kubenswrapper[18707]: I0320 09:05:51.853022 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3574-account-create-update-mw5s6" event={"ID":"c87d5ca7-5cc3-4027-8725-0c42afdef9e9","Type":"ContainerDied","Data":"ab43a5bdeb2dcfd47f72404e9a05ff319ff68d1b2595cae8c0932abb9637f51c"} Mar 
20 09:05:51.865212 master-0 kubenswrapper[18707]: I0320 09:05:51.863325 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-3e52-account-create-update-qlzvf" podStartSLOduration=2.863298306 podStartE2EDuration="2.863298306s" podCreationTimestamp="2026-03-20 09:05:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:51.859057685 +0000 UTC m=+1497.015238041" watchObservedRunningTime="2026-03-20 09:05:51.863298306 +0000 UTC m=+1497.019478662" Mar 20 09:05:51.874560 master-0 kubenswrapper[18707]: I0320 09:05:51.874510 18707 generic.go:334] "Generic (PLEG): container finished" podID="3fd5b27d-6c79-43fb-aa80-d104a32c95c1" containerID="43eee35e759a3c13cec9ebfdf2378e0939f277f66cac06ecb8cc9f317e767305" exitCode=0 Mar 20 09:05:51.874767 master-0 kubenswrapper[18707]: I0320 09:05:51.874578 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vtvmv" event={"ID":"3fd5b27d-6c79-43fb-aa80-d104a32c95c1","Type":"ContainerDied","Data":"43eee35e759a3c13cec9ebfdf2378e0939f277f66cac06ecb8cc9f317e767305"} Mar 20 09:05:51.880255 master-0 kubenswrapper[18707]: I0320 09:05:51.876290 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" event={"ID":"c177a9fe-76d7-4325-8968-c7178ad8c75a","Type":"ContainerStarted","Data":"58729020b218323b9010a0c79cd84437ab8a9efcc9c5beed481665de539b2b99"} Mar 20 09:05:52.900156 master-0 kubenswrapper[18707]: I0320 09:05:52.899713 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b5b845b79-xn624" event={"ID":"28a3f9e1-5276-45c1-b6f3-94d2f09a223e","Type":"ContainerStarted","Data":"e6b01f4b941175aade692b3f3edd18d0394a369525031c12bfced0cdaa0250f0"} Mar 20 09:05:52.901750 master-0 kubenswrapper[18707]: I0320 09:05:52.901644 18707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:52.904620 master-0 kubenswrapper[18707]: I0320 09:05:52.904593 18707 generic.go:334] "Generic (PLEG): container finished" podID="a238c22f-d559-4c69-aeb0-c16f67befc9e" containerID="d99e9e87acc39b2aae4283d979f6e27f6734207a08061f99f0210ba24143eb5f" exitCode=0 Mar 20 09:05:52.904991 master-0 kubenswrapper[18707]: I0320 09:05:52.904956 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3e52-account-create-update-qlzvf" event={"ID":"a238c22f-d559-4c69-aeb0-c16f67befc9e","Type":"ContainerDied","Data":"d99e9e87acc39b2aae4283d979f6e27f6734207a08061f99f0210ba24143eb5f"} Mar 20 09:05:53.034462 master-0 kubenswrapper[18707]: I0320 09:05:53.030503 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b5b845b79-xn624" podStartSLOduration=5.03047973 podStartE2EDuration="5.03047973s" podCreationTimestamp="2026-03-20 09:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:53.011765305 +0000 UTC m=+1498.167945661" watchObservedRunningTime="2026-03-20 09:05:53.03047973 +0000 UTC m=+1498.186660106" Mar 20 09:05:55.472995 master-0 kubenswrapper[18707]: I0320 09:05:55.472938 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3e52-account-create-update-qlzvf" Mar 20 09:05:55.482775 master-0 kubenswrapper[18707]: I0320 09:05:55.480313 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-vtvmv" Mar 20 09:05:55.488695 master-0 kubenswrapper[18707]: I0320 09:05:55.488310 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3574-account-create-update-mw5s6" Mar 20 09:05:55.497081 master-0 kubenswrapper[18707]: I0320 09:05:55.497021 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-bbvbz" Mar 20 09:05:55.569008 master-0 kubenswrapper[18707]: I0320 09:05:55.568943 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rbg6\" (UniqueName: \"kubernetes.io/projected/3fd5b27d-6c79-43fb-aa80-d104a32c95c1-kube-api-access-2rbg6\") pod \"3fd5b27d-6c79-43fb-aa80-d104a32c95c1\" (UID: \"3fd5b27d-6c79-43fb-aa80-d104a32c95c1\") " Mar 20 09:05:55.569008 master-0 kubenswrapper[18707]: I0320 09:05:55.569016 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tc8n\" (UniqueName: \"kubernetes.io/projected/c87d5ca7-5cc3-4027-8725-0c42afdef9e9-kube-api-access-4tc8n\") pod \"c87d5ca7-5cc3-4027-8725-0c42afdef9e9\" (UID: \"c87d5ca7-5cc3-4027-8725-0c42afdef9e9\") " Mar 20 09:05:55.569319 master-0 kubenswrapper[18707]: I0320 09:05:55.569118 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c87d5ca7-5cc3-4027-8725-0c42afdef9e9-operator-scripts\") pod \"c87d5ca7-5cc3-4027-8725-0c42afdef9e9\" (UID: \"c87d5ca7-5cc3-4027-8725-0c42afdef9e9\") " Mar 20 09:05:55.569319 master-0 kubenswrapper[18707]: I0320 09:05:55.569146 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a238c22f-d559-4c69-aeb0-c16f67befc9e-operator-scripts\") pod \"a238c22f-d559-4c69-aeb0-c16f67befc9e\" (UID: \"a238c22f-d559-4c69-aeb0-c16f67befc9e\") " Mar 20 09:05:55.569393 master-0 kubenswrapper[18707]: I0320 09:05:55.569344 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/52457bb1-e081-4589-bbe2-3aaadfb92b31-operator-scripts\") pod \"52457bb1-e081-4589-bbe2-3aaadfb92b31\" (UID: \"52457bb1-e081-4589-bbe2-3aaadfb92b31\") " Mar 20 09:05:55.569505 master-0 kubenswrapper[18707]: I0320 09:05:55.569448 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd5b27d-6c79-43fb-aa80-d104a32c95c1-operator-scripts\") pod \"3fd5b27d-6c79-43fb-aa80-d104a32c95c1\" (UID: \"3fd5b27d-6c79-43fb-aa80-d104a32c95c1\") " Mar 20 09:05:55.569562 master-0 kubenswrapper[18707]: I0320 09:05:55.569525 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cnd7\" (UniqueName: \"kubernetes.io/projected/a238c22f-d559-4c69-aeb0-c16f67befc9e-kube-api-access-8cnd7\") pod \"a238c22f-d559-4c69-aeb0-c16f67befc9e\" (UID: \"a238c22f-d559-4c69-aeb0-c16f67befc9e\") " Mar 20 09:05:55.569597 master-0 kubenswrapper[18707]: I0320 09:05:55.569563 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zfjg\" (UniqueName: \"kubernetes.io/projected/52457bb1-e081-4589-bbe2-3aaadfb92b31-kube-api-access-2zfjg\") pod \"52457bb1-e081-4589-bbe2-3aaadfb92b31\" (UID: \"52457bb1-e081-4589-bbe2-3aaadfb92b31\") " Mar 20 09:05:55.571174 master-0 kubenswrapper[18707]: I0320 09:05:55.571113 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a238c22f-d559-4c69-aeb0-c16f67befc9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a238c22f-d559-4c69-aeb0-c16f67befc9e" (UID: "a238c22f-d559-4c69-aeb0-c16f67befc9e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:55.572082 master-0 kubenswrapper[18707]: I0320 09:05:55.572024 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52457bb1-e081-4589-bbe2-3aaadfb92b31-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "52457bb1-e081-4589-bbe2-3aaadfb92b31" (UID: "52457bb1-e081-4589-bbe2-3aaadfb92b31"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:55.572751 master-0 kubenswrapper[18707]: I0320 09:05:55.572259 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87d5ca7-5cc3-4027-8725-0c42afdef9e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c87d5ca7-5cc3-4027-8725-0c42afdef9e9" (UID: "c87d5ca7-5cc3-4027-8725-0c42afdef9e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:55.573043 master-0 kubenswrapper[18707]: I0320 09:05:55.572985 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd5b27d-6c79-43fb-aa80-d104a32c95c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3fd5b27d-6c79-43fb-aa80-d104a32c95c1" (UID: "3fd5b27d-6c79-43fb-aa80-d104a32c95c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:55.586860 master-0 kubenswrapper[18707]: I0320 09:05:55.583005 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a238c22f-d559-4c69-aeb0-c16f67befc9e-kube-api-access-8cnd7" (OuterVolumeSpecName: "kube-api-access-8cnd7") pod "a238c22f-d559-4c69-aeb0-c16f67befc9e" (UID: "a238c22f-d559-4c69-aeb0-c16f67befc9e"). InnerVolumeSpecName "kube-api-access-8cnd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:55.586860 master-0 kubenswrapper[18707]: I0320 09:05:55.583238 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd5b27d-6c79-43fb-aa80-d104a32c95c1-kube-api-access-2rbg6" (OuterVolumeSpecName: "kube-api-access-2rbg6") pod "3fd5b27d-6c79-43fb-aa80-d104a32c95c1" (UID: "3fd5b27d-6c79-43fb-aa80-d104a32c95c1"). InnerVolumeSpecName "kube-api-access-2rbg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:55.586860 master-0 kubenswrapper[18707]: I0320 09:05:55.583805 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52457bb1-e081-4589-bbe2-3aaadfb92b31-kube-api-access-2zfjg" (OuterVolumeSpecName: "kube-api-access-2zfjg") pod "52457bb1-e081-4589-bbe2-3aaadfb92b31" (UID: "52457bb1-e081-4589-bbe2-3aaadfb92b31"). InnerVolumeSpecName "kube-api-access-2zfjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:55.586860 master-0 kubenswrapper[18707]: I0320 09:05:55.584537 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c87d5ca7-5cc3-4027-8725-0c42afdef9e9-kube-api-access-4tc8n" (OuterVolumeSpecName: "kube-api-access-4tc8n") pod "c87d5ca7-5cc3-4027-8725-0c42afdef9e9" (UID: "c87d5ca7-5cc3-4027-8725-0c42afdef9e9"). InnerVolumeSpecName "kube-api-access-4tc8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:55.671486 master-0 kubenswrapper[18707]: I0320 09:05:55.671411 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52457bb1-e081-4589-bbe2-3aaadfb92b31-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:55.671486 master-0 kubenswrapper[18707]: I0320 09:05:55.671458 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd5b27d-6c79-43fb-aa80-d104a32c95c1-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:55.671486 master-0 kubenswrapper[18707]: I0320 09:05:55.671469 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cnd7\" (UniqueName: \"kubernetes.io/projected/a238c22f-d559-4c69-aeb0-c16f67befc9e-kube-api-access-8cnd7\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:55.671486 master-0 kubenswrapper[18707]: I0320 09:05:55.671482 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zfjg\" (UniqueName: \"kubernetes.io/projected/52457bb1-e081-4589-bbe2-3aaadfb92b31-kube-api-access-2zfjg\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:55.671486 master-0 kubenswrapper[18707]: I0320 09:05:55.671492 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rbg6\" (UniqueName: \"kubernetes.io/projected/3fd5b27d-6c79-43fb-aa80-d104a32c95c1-kube-api-access-2rbg6\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:55.671486 master-0 kubenswrapper[18707]: I0320 09:05:55.671502 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tc8n\" (UniqueName: \"kubernetes.io/projected/c87d5ca7-5cc3-4027-8725-0c42afdef9e9-kube-api-access-4tc8n\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:55.671486 master-0 kubenswrapper[18707]: I0320 09:05:55.671512 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/c87d5ca7-5cc3-4027-8725-0c42afdef9e9-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:55.672128 master-0 kubenswrapper[18707]: I0320 09:05:55.671522 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a238c22f-d559-4c69-aeb0-c16f67befc9e-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:05:55.953222 master-0 kubenswrapper[18707]: I0320 09:05:55.953056 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3574-account-create-update-mw5s6" event={"ID":"c87d5ca7-5cc3-4027-8725-0c42afdef9e9","Type":"ContainerDied","Data":"dd4bf7ffba95c34ea4c55cce49e083869243ef172e3feb457415d87ff262f530"} Mar 20 09:05:55.953222 master-0 kubenswrapper[18707]: I0320 09:05:55.953111 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd4bf7ffba95c34ea4c55cce49e083869243ef172e3feb457415d87ff262f530" Mar 20 09:05:55.953708 master-0 kubenswrapper[18707]: I0320 09:05:55.953668 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3574-account-create-update-mw5s6" Mar 20 09:05:55.956528 master-0 kubenswrapper[18707]: I0320 09:05:55.955415 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-vtvmv" event={"ID":"3fd5b27d-6c79-43fb-aa80-d104a32c95c1","Type":"ContainerDied","Data":"b2833f0e3ef26c7eadf6146626e2d5e925d17c4edeea3c06cb43eee13b6aac76"} Mar 20 09:05:55.956528 master-0 kubenswrapper[18707]: I0320 09:05:55.955447 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2833f0e3ef26c7eadf6146626e2d5e925d17c4edeea3c06cb43eee13b6aac76" Mar 20 09:05:55.956528 master-0 kubenswrapper[18707]: I0320 09:05:55.955454 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-vtvmv" Mar 20 09:05:55.959006 master-0 kubenswrapper[18707]: I0320 09:05:55.958132 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3e52-account-create-update-qlzvf" event={"ID":"a238c22f-d559-4c69-aeb0-c16f67befc9e","Type":"ContainerDied","Data":"448399627f0b3bf4c7f1d94431243c55c10df6e29d7ac4c5c2f9f7df50b293a0"} Mar 20 09:05:55.959006 master-0 kubenswrapper[18707]: I0320 09:05:55.958201 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="448399627f0b3bf4c7f1d94431243c55c10df6e29d7ac4c5c2f9f7df50b293a0" Mar 20 09:05:55.959006 master-0 kubenswrapper[18707]: I0320 09:05:55.958265 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3e52-account-create-update-qlzvf" Mar 20 09:05:55.963067 master-0 kubenswrapper[18707]: I0320 09:05:55.962938 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-bbvbz" event={"ID":"52457bb1-e081-4589-bbe2-3aaadfb92b31","Type":"ContainerDied","Data":"ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e"} Mar 20 09:05:55.963067 master-0 kubenswrapper[18707]: I0320 09:05:55.962998 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e" Mar 20 09:05:55.963067 master-0 kubenswrapper[18707]: I0320 09:05:55.963012 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-bbvbz" Mar 20 09:05:56.612030 master-0 kubenswrapper[18707]: E0320 09:05:56.611862 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice/crio-ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice\": RecentStats: unable to find data in memory cache]" Mar 20 09:05:56.617456 master-0 kubenswrapper[18707]: E0320 09:05:56.615098 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice/crio-ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e\": RecentStats: unable to find data in memory cache]" Mar 20 09:05:59.163258 master-0 kubenswrapper[18707]: I0320 09:05:59.161418 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:05:59.333114 master-0 kubenswrapper[18707]: I0320 09:05:59.332448 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-dhd96"] Mar 20 09:05:59.333114 master-0 kubenswrapper[18707]: I0320 09:05:59.332911 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" podUID="5e4940cf-9604-4c97-b847-f928f2dadaa9" containerName="dnsmasq-dns" containerID="cri-o://5c99fd3f403bbd68a04ab23e8e2717779b23a1758a6b5c1b6557a2e7daa2f890" gracePeriod=10 Mar 20 
09:05:59.530605 master-0 kubenswrapper[18707]: I0320 09:05:59.530518 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" podUID="5e4940cf-9604-4c97-b847-f928f2dadaa9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.186:5353: connect: connection refused" Mar 20 09:05:59.927812 master-0 kubenswrapper[18707]: I0320 09:05:59.927758 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:06:00.063222 master-0 kubenswrapper[18707]: I0320 09:06:00.057714 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-dns-svc\") pod \"5e4940cf-9604-4c97-b847-f928f2dadaa9\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " Mar 20 09:06:00.063222 master-0 kubenswrapper[18707]: I0320 09:06:00.057794 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-ovsdbserver-sb\") pod \"5e4940cf-9604-4c97-b847-f928f2dadaa9\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " Mar 20 09:06:00.063222 master-0 kubenswrapper[18707]: I0320 09:06:00.057790 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jg5hc" event={"ID":"23011402-5090-4dcc-a36e-f9dcb3db8946","Type":"ContainerStarted","Data":"8abfaa08ff527dda1a412f16cc815c17755a014e017c686cc1bf1fc69b773326"} Mar 20 09:06:00.063222 master-0 kubenswrapper[18707]: I0320 09:06:00.057849 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-config\") pod \"5e4940cf-9604-4c97-b847-f928f2dadaa9\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " Mar 20 09:06:00.063222 master-0 kubenswrapper[18707]: I0320 
09:06:00.057931 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7x8c\" (UniqueName: \"kubernetes.io/projected/5e4940cf-9604-4c97-b847-f928f2dadaa9-kube-api-access-s7x8c\") pod \"5e4940cf-9604-4c97-b847-f928f2dadaa9\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " Mar 20 09:06:00.063222 master-0 kubenswrapper[18707]: I0320 09:06:00.058041 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-ovsdbserver-nb\") pod \"5e4940cf-9604-4c97-b847-f928f2dadaa9\" (UID: \"5e4940cf-9604-4c97-b847-f928f2dadaa9\") " Mar 20 09:06:00.071223 master-0 kubenswrapper[18707]: I0320 09:06:00.067204 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e4940cf-9604-4c97-b847-f928f2dadaa9-kube-api-access-s7x8c" (OuterVolumeSpecName: "kube-api-access-s7x8c") pod "5e4940cf-9604-4c97-b847-f928f2dadaa9" (UID: "5e4940cf-9604-4c97-b847-f928f2dadaa9"). InnerVolumeSpecName "kube-api-access-s7x8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:00.071223 master-0 kubenswrapper[18707]: I0320 09:06:00.067552 18707 generic.go:334] "Generic (PLEG): container finished" podID="5e4940cf-9604-4c97-b847-f928f2dadaa9" containerID="5c99fd3f403bbd68a04ab23e8e2717779b23a1758a6b5c1b6557a2e7daa2f890" exitCode=0 Mar 20 09:06:00.071223 master-0 kubenswrapper[18707]: I0320 09:06:00.067586 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" event={"ID":"5e4940cf-9604-4c97-b847-f928f2dadaa9","Type":"ContainerDied","Data":"5c99fd3f403bbd68a04ab23e8e2717779b23a1758a6b5c1b6557a2e7daa2f890"} Mar 20 09:06:00.071223 master-0 kubenswrapper[18707]: I0320 09:06:00.067615 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" event={"ID":"5e4940cf-9604-4c97-b847-f928f2dadaa9","Type":"ContainerDied","Data":"54219d7425d1e1460191032517f026f3f1cc97cbe8d487a1f9b2c8d326f408a0"} Mar 20 09:06:00.071223 master-0 kubenswrapper[18707]: I0320 09:06:00.067635 18707 scope.go:117] "RemoveContainer" containerID="5c99fd3f403bbd68a04ab23e8e2717779b23a1758a6b5c1b6557a2e7daa2f890" Mar 20 09:06:00.071223 master-0 kubenswrapper[18707]: I0320 09:06:00.067774 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-dhd96" Mar 20 09:06:00.095282 master-0 kubenswrapper[18707]: I0320 09:06:00.094603 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jg5hc" podStartSLOduration=2.826426873 podStartE2EDuration="11.094579563s" podCreationTimestamp="2026-03-20 09:05:49 +0000 UTC" firstStartedPulling="2026-03-20 09:05:50.770441987 +0000 UTC m=+1495.926622343" lastFinishedPulling="2026-03-20 09:05:59.038594667 +0000 UTC m=+1504.194775033" observedRunningTime="2026-03-20 09:06:00.088954952 +0000 UTC m=+1505.245135328" watchObservedRunningTime="2026-03-20 09:06:00.094579563 +0000 UTC m=+1505.250759919" Mar 20 09:06:00.119213 master-0 kubenswrapper[18707]: I0320 09:06:00.118919 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e4940cf-9604-4c97-b847-f928f2dadaa9" (UID: "5e4940cf-9604-4c97-b847-f928f2dadaa9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:00.161608 master-0 kubenswrapper[18707]: I0320 09:06:00.161440 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:00.161608 master-0 kubenswrapper[18707]: I0320 09:06:00.161613 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7x8c\" (UniqueName: \"kubernetes.io/projected/5e4940cf-9604-4c97-b847-f928f2dadaa9-kube-api-access-s7x8c\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:00.170143 master-0 kubenswrapper[18707]: I0320 09:06:00.170037 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-config" (OuterVolumeSpecName: "config") pod "5e4940cf-9604-4c97-b847-f928f2dadaa9" (UID: "5e4940cf-9604-4c97-b847-f928f2dadaa9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:00.178229 master-0 kubenswrapper[18707]: I0320 09:06:00.175904 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e4940cf-9604-4c97-b847-f928f2dadaa9" (UID: "5e4940cf-9604-4c97-b847-f928f2dadaa9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:00.178483 master-0 kubenswrapper[18707]: I0320 09:06:00.178350 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e4940cf-9604-4c97-b847-f928f2dadaa9" (UID: "5e4940cf-9604-4c97-b847-f928f2dadaa9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:00.265113 master-0 kubenswrapper[18707]: I0320 09:06:00.263876 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:00.265113 master-0 kubenswrapper[18707]: I0320 09:06:00.263939 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:00.265113 master-0 kubenswrapper[18707]: I0320 09:06:00.263954 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e4940cf-9604-4c97-b847-f928f2dadaa9-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:00.266786 master-0 kubenswrapper[18707]: I0320 09:06:00.266723 18707 scope.go:117] "RemoveContainer" containerID="1ca630b2bcd6c9372c7740fe93dd9f588683360db12802e43e148f06eb2be378" Mar 20 09:06:00.298465 master-0 kubenswrapper[18707]: I0320 09:06:00.298414 18707 scope.go:117] "RemoveContainer" containerID="5c99fd3f403bbd68a04ab23e8e2717779b23a1758a6b5c1b6557a2e7daa2f890" Mar 20 09:06:00.299209 master-0 kubenswrapper[18707]: E0320 09:06:00.299079 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c99fd3f403bbd68a04ab23e8e2717779b23a1758a6b5c1b6557a2e7daa2f890\": container with ID starting with 5c99fd3f403bbd68a04ab23e8e2717779b23a1758a6b5c1b6557a2e7daa2f890 not found: ID does not exist" containerID="5c99fd3f403bbd68a04ab23e8e2717779b23a1758a6b5c1b6557a2e7daa2f890" Mar 20 09:06:00.299274 master-0 kubenswrapper[18707]: I0320 09:06:00.299206 18707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5c99fd3f403bbd68a04ab23e8e2717779b23a1758a6b5c1b6557a2e7daa2f890"} err="failed to get container status \"5c99fd3f403bbd68a04ab23e8e2717779b23a1758a6b5c1b6557a2e7daa2f890\": rpc error: code = NotFound desc = could not find container \"5c99fd3f403bbd68a04ab23e8e2717779b23a1758a6b5c1b6557a2e7daa2f890\": container with ID starting with 5c99fd3f403bbd68a04ab23e8e2717779b23a1758a6b5c1b6557a2e7daa2f890 not found: ID does not exist" Mar 20 09:06:00.299274 master-0 kubenswrapper[18707]: I0320 09:06:00.299247 18707 scope.go:117] "RemoveContainer" containerID="1ca630b2bcd6c9372c7740fe93dd9f588683360db12802e43e148f06eb2be378" Mar 20 09:06:00.300340 master-0 kubenswrapper[18707]: E0320 09:06:00.300305 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ca630b2bcd6c9372c7740fe93dd9f588683360db12802e43e148f06eb2be378\": container with ID starting with 1ca630b2bcd6c9372c7740fe93dd9f588683360db12802e43e148f06eb2be378 not found: ID does not exist" containerID="1ca630b2bcd6c9372c7740fe93dd9f588683360db12802e43e148f06eb2be378" Mar 20 09:06:00.300402 master-0 kubenswrapper[18707]: I0320 09:06:00.300350 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca630b2bcd6c9372c7740fe93dd9f588683360db12802e43e148f06eb2be378"} err="failed to get container status \"1ca630b2bcd6c9372c7740fe93dd9f588683360db12802e43e148f06eb2be378\": rpc error: code = NotFound desc = could not find container \"1ca630b2bcd6c9372c7740fe93dd9f588683360db12802e43e148f06eb2be378\": container with ID starting with 1ca630b2bcd6c9372c7740fe93dd9f588683360db12802e43e148f06eb2be378 not found: ID does not exist" Mar 20 09:06:00.432215 master-0 kubenswrapper[18707]: I0320 09:06:00.430412 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-dhd96"] Mar 20 09:06:00.449259 master-0 kubenswrapper[18707]: I0320 09:06:00.446997 
18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-dhd96"] Mar 20 09:06:01.130230 master-0 kubenswrapper[18707]: I0320 09:06:01.130152 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e4940cf-9604-4c97-b847-f928f2dadaa9" path="/var/lib/kubelet/pods/5e4940cf-9604-4c97-b847-f928f2dadaa9/volumes" Mar 20 09:06:05.187306 master-0 kubenswrapper[18707]: I0320 09:06:05.182538 18707 generic.go:334] "Generic (PLEG): container finished" podID="23011402-5090-4dcc-a36e-f9dcb3db8946" containerID="8abfaa08ff527dda1a412f16cc815c17755a014e017c686cc1bf1fc69b773326" exitCode=0 Mar 20 09:06:05.187306 master-0 kubenswrapper[18707]: I0320 09:06:05.182595 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jg5hc" event={"ID":"23011402-5090-4dcc-a36e-f9dcb3db8946","Type":"ContainerDied","Data":"8abfaa08ff527dda1a412f16cc815c17755a014e017c686cc1bf1fc69b773326"} Mar 20 09:06:06.628615 master-0 kubenswrapper[18707]: I0320 09:06:06.628557 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jg5hc" Mar 20 09:06:06.662281 master-0 kubenswrapper[18707]: I0320 09:06:06.658670 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkgfc\" (UniqueName: \"kubernetes.io/projected/23011402-5090-4dcc-a36e-f9dcb3db8946-kube-api-access-tkgfc\") pod \"23011402-5090-4dcc-a36e-f9dcb3db8946\" (UID: \"23011402-5090-4dcc-a36e-f9dcb3db8946\") " Mar 20 09:06:06.662281 master-0 kubenswrapper[18707]: I0320 09:06:06.658871 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23011402-5090-4dcc-a36e-f9dcb3db8946-combined-ca-bundle\") pod \"23011402-5090-4dcc-a36e-f9dcb3db8946\" (UID: \"23011402-5090-4dcc-a36e-f9dcb3db8946\") " Mar 20 09:06:06.662281 master-0 kubenswrapper[18707]: I0320 09:06:06.658924 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23011402-5090-4dcc-a36e-f9dcb3db8946-config-data\") pod \"23011402-5090-4dcc-a36e-f9dcb3db8946\" (UID: \"23011402-5090-4dcc-a36e-f9dcb3db8946\") " Mar 20 09:06:06.694103 master-0 kubenswrapper[18707]: I0320 09:06:06.693809 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23011402-5090-4dcc-a36e-f9dcb3db8946-kube-api-access-tkgfc" (OuterVolumeSpecName: "kube-api-access-tkgfc") pod "23011402-5090-4dcc-a36e-f9dcb3db8946" (UID: "23011402-5090-4dcc-a36e-f9dcb3db8946"). InnerVolumeSpecName "kube-api-access-tkgfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:06.721496 master-0 kubenswrapper[18707]: I0320 09:06:06.721355 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23011402-5090-4dcc-a36e-f9dcb3db8946-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23011402-5090-4dcc-a36e-f9dcb3db8946" (UID: "23011402-5090-4dcc-a36e-f9dcb3db8946"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:06.727090 master-0 kubenswrapper[18707]: I0320 09:06:06.727024 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23011402-5090-4dcc-a36e-f9dcb3db8946-config-data" (OuterVolumeSpecName: "config-data") pod "23011402-5090-4dcc-a36e-f9dcb3db8946" (UID: "23011402-5090-4dcc-a36e-f9dcb3db8946"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:06.761243 master-0 kubenswrapper[18707]: I0320 09:06:06.761154 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkgfc\" (UniqueName: \"kubernetes.io/projected/23011402-5090-4dcc-a36e-f9dcb3db8946-kube-api-access-tkgfc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:06.761470 master-0 kubenswrapper[18707]: I0320 09:06:06.761452 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23011402-5090-4dcc-a36e-f9dcb3db8946-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:06.761604 master-0 kubenswrapper[18707]: I0320 09:06:06.761472 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23011402-5090-4dcc-a36e-f9dcb3db8946-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:06.967921 master-0 kubenswrapper[18707]: E0320 09:06:06.967343 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice/crio-ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice\": RecentStats: unable to find data in memory cache]" Mar 20 09:06:07.210953 master-0 kubenswrapper[18707]: I0320 09:06:07.210843 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jg5hc" event={"ID":"23011402-5090-4dcc-a36e-f9dcb3db8946","Type":"ContainerDied","Data":"4d7907588de1e2a76a98a7608fc999222f8b019500b781314ed5124cbc89ab94"} Mar 20 09:06:07.210953 master-0 kubenswrapper[18707]: I0320 09:06:07.210906 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d7907588de1e2a76a98a7608fc999222f8b019500b781314ed5124cbc89ab94" Mar 20 09:06:07.211269 master-0 kubenswrapper[18707]: I0320 09:06:07.210969 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jg5hc" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: I0320 09:06:07.546266 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xz2l7"] Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: E0320 09:06:07.546957 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4940cf-9604-4c97-b847-f928f2dadaa9" containerName="dnsmasq-dns" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: I0320 09:06:07.546979 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4940cf-9604-4c97-b847-f928f2dadaa9" containerName="dnsmasq-dns" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: E0320 09:06:07.547003 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd5b27d-6c79-43fb-aa80-d104a32c95c1" containerName="mariadb-database-create" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: I0320 09:06:07.547012 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd5b27d-6c79-43fb-aa80-d104a32c95c1" containerName="mariadb-database-create" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: E0320 09:06:07.547039 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a238c22f-d559-4c69-aeb0-c16f67befc9e" containerName="mariadb-account-create-update" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: I0320 09:06:07.547049 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a238c22f-d559-4c69-aeb0-c16f67befc9e" containerName="mariadb-account-create-update" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: E0320 09:06:07.547060 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87d5ca7-5cc3-4027-8725-0c42afdef9e9" containerName="mariadb-account-create-update" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: I0320 09:06:07.547068 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87d5ca7-5cc3-4027-8725-0c42afdef9e9" containerName="mariadb-account-create-update" 
Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: E0320 09:06:07.547094 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52457bb1-e081-4589-bbe2-3aaadfb92b31" containerName="mariadb-database-create" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: I0320 09:06:07.547102 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="52457bb1-e081-4589-bbe2-3aaadfb92b31" containerName="mariadb-database-create" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: E0320 09:06:07.547124 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4940cf-9604-4c97-b847-f928f2dadaa9" containerName="init" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: I0320 09:06:07.547133 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4940cf-9604-4c97-b847-f928f2dadaa9" containerName="init" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: E0320 09:06:07.547157 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23011402-5090-4dcc-a36e-f9dcb3db8946" containerName="keystone-db-sync" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: I0320 09:06:07.547166 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="23011402-5090-4dcc-a36e-f9dcb3db8946" containerName="keystone-db-sync" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: I0320 09:06:07.547459 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd5b27d-6c79-43fb-aa80-d104a32c95c1" containerName="mariadb-database-create" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: I0320 09:06:07.547496 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a238c22f-d559-4c69-aeb0-c16f67befc9e" containerName="mariadb-account-create-update" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: I0320 09:06:07.547520 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87d5ca7-5cc3-4027-8725-0c42afdef9e9" containerName="mariadb-account-create-update" Mar 20 09:06:07.551256 master-0 
kubenswrapper[18707]: I0320 09:06:07.547541 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e4940cf-9604-4c97-b847-f928f2dadaa9" containerName="dnsmasq-dns" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: I0320 09:06:07.547594 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="52457bb1-e081-4589-bbe2-3aaadfb92b31" containerName="mariadb-database-create" Mar 20 09:06:07.551256 master-0 kubenswrapper[18707]: I0320 09:06:07.547621 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="23011402-5090-4dcc-a36e-f9dcb3db8946" containerName="keystone-db-sync" Mar 20 09:06:07.556061 master-0 kubenswrapper[18707]: I0320 09:06:07.554322 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.584318 master-0 kubenswrapper[18707]: I0320 09:06:07.580692 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xz2l7"] Mar 20 09:06:07.588595 master-0 kubenswrapper[18707]: I0320 09:06:07.588544 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 09:06:07.588837 master-0 kubenswrapper[18707]: I0320 09:06:07.588661 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 09:06:07.596073 master-0 kubenswrapper[18707]: I0320 09:06:07.596022 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 09:06:07.606758 master-0 kubenswrapper[18707]: I0320 09:06:07.606710 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 09:06:07.656805 master-0 kubenswrapper[18707]: I0320 09:06:07.656722 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bd59cc7c7-jcbnm"] Mar 20 09:06:07.672958 master-0 kubenswrapper[18707]: I0320 09:06:07.672888 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.700151 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-config\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.700238 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-edpm-a\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.700446 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-scripts\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.700681 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-ovsdbserver-sb\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.700766 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-credential-keys\") pod 
\"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.700795 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-edpm-b\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.700827 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-dns-svc\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.700859 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-config-data\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.700896 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-dns-swift-storage-0\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.700948 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbqbg\" (UniqueName: 
\"kubernetes.io/projected/971889eb-1084-4735-b4c7-82b73df6877c-kube-api-access-bbqbg\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.701028 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m44c\" (UniqueName: \"kubernetes.io/projected/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-kube-api-access-4m44c\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.701079 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-combined-ca-bundle\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.701110 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-ovsdbserver-nb\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.702114 master-0 kubenswrapper[18707]: I0320 09:06:07.701162 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-fernet-keys\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.706250 master-0 kubenswrapper[18707]: I0320 09:06:07.706102 18707 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bd59cc7c7-jcbnm"] Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.823565 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-scripts\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.823719 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-ovsdbserver-sb\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.823757 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-credential-keys\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.823787 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-edpm-b\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.823822 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-dns-svc\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " 
pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.823852 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-config-data\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.823887 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-dns-swift-storage-0\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.823938 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbqbg\" (UniqueName: \"kubernetes.io/projected/971889eb-1084-4735-b4c7-82b73df6877c-kube-api-access-bbqbg\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.824013 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m44c\" (UniqueName: \"kubernetes.io/projected/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-kube-api-access-4m44c\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.824053 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-combined-ca-bundle\") pod \"keystone-bootstrap-xz2l7\" (UID: 
\"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.824075 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-ovsdbserver-nb\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.824121 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-fernet-keys\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.824178 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-config\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.824230 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-edpm-a\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.824866 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-ovsdbserver-sb\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " 
pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.827274 master-0 kubenswrapper[18707]: I0320 09:06:07.825060 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-edpm-a\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.829142 master-0 kubenswrapper[18707]: I0320 09:06:07.828781 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-scripts\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.831307 master-0 kubenswrapper[18707]: I0320 09:06:07.830451 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-edpm-b\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.832552 master-0 kubenswrapper[18707]: I0320 09:06:07.832509 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-dns-svc\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.833455 master-0 kubenswrapper[18707]: I0320 09:06:07.833392 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-credential-keys\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.836709 master-0 kubenswrapper[18707]: I0320 
09:06:07.836666 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-config-data\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.837631 master-0 kubenswrapper[18707]: I0320 09:06:07.837599 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-dns-swift-storage-0\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.837759 master-0 kubenswrapper[18707]: I0320 09:06:07.837733 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-combined-ca-bundle\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.839062 master-0 kubenswrapper[18707]: I0320 09:06:07.838993 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-config\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.839895 master-0 kubenswrapper[18707]: I0320 09:06:07.839868 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-ovsdbserver-nb\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.841843 master-0 kubenswrapper[18707]: I0320 09:06:07.841807 18707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-fernet-keys\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.902573 master-0 kubenswrapper[18707]: I0320 09:06:07.886255 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbqbg\" (UniqueName: \"kubernetes.io/projected/971889eb-1084-4735-b4c7-82b73df6877c-kube-api-access-bbqbg\") pod \"dnsmasq-dns-5bd59cc7c7-jcbnm\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") " pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:07.921013 master-0 kubenswrapper[18707]: I0320 09:06:07.920952 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m44c\" (UniqueName: \"kubernetes.io/projected/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-kube-api-access-4m44c\") pod \"keystone-bootstrap-xz2l7\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:07.964653 master-0 kubenswrapper[18707]: I0320 09:06:07.964484 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c920a-db-sync-5sx9b"] Mar 20 09:06:07.969770 master-0 kubenswrapper[18707]: I0320 09:06:07.968357 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:07.977888 master-0 kubenswrapper[18707]: I0320 09:06:07.977830 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-c920a-scripts" Mar 20 09:06:07.982395 master-0 kubenswrapper[18707]: I0320 09:06:07.982334 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-c920a-config-data" Mar 20 09:06:08.014821 master-0 kubenswrapper[18707]: I0320 09:06:08.012967 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" Mar 20 09:06:08.031642 master-0 kubenswrapper[18707]: I0320 09:06:08.031524 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-db-sync-5sx9b"] Mar 20 09:06:08.057332 master-0 kubenswrapper[18707]: I0320 09:06:08.057170 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-combined-ca-bundle\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.058126 master-0 kubenswrapper[18707]: I0320 09:06:08.058086 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm786\" (UniqueName: \"kubernetes.io/projected/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-kube-api-access-cm786\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.058303 master-0 kubenswrapper[18707]: I0320 09:06:08.058258 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-db-sync-config-data\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.062195 master-0 kubenswrapper[18707]: I0320 09:06:08.058657 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-scripts\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.062195 master-0 kubenswrapper[18707]: I0320 
09:06:08.061654 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-etc-machine-id\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.062195 master-0 kubenswrapper[18707]: I0320 09:06:08.061810 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-config-data\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.071388 master-0 kubenswrapper[18707]: I0320 09:06:08.069045 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-q6j55"] Mar 20 09:06:08.072548 master-0 kubenswrapper[18707]: I0320 09:06:08.072503 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-q6j55" Mar 20 09:06:08.083092 master-0 kubenswrapper[18707]: I0320 09:06:08.078690 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 09:06:08.084341 master-0 kubenswrapper[18707]: I0320 09:06:08.083730 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 09:06:08.179150 master-0 kubenswrapper[18707]: I0320 09:06:08.167688 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-etc-machine-id\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.179150 master-0 kubenswrapper[18707]: I0320 09:06:08.167820 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-config-data\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.179150 master-0 kubenswrapper[18707]: I0320 09:06:08.167901 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-config\") pod \"neutron-db-sync-q6j55\" (UID: \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\") " pod="openstack/neutron-db-sync-q6j55" Mar 20 09:06:08.179150 master-0 kubenswrapper[18707]: I0320 09:06:08.168033 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-combined-ca-bundle\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" 
Mar 20 09:06:08.179150 master-0 kubenswrapper[18707]: I0320 09:06:08.168094 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm786\" (UniqueName: \"kubernetes.io/projected/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-kube-api-access-cm786\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.179150 master-0 kubenswrapper[18707]: I0320 09:06:08.168127 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-db-sync-config-data\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.179150 master-0 kubenswrapper[18707]: I0320 09:06:08.168210 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-scripts\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.179150 master-0 kubenswrapper[18707]: I0320 09:06:08.168261 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8697\" (UniqueName: \"kubernetes.io/projected/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-kube-api-access-x8697\") pod \"neutron-db-sync-q6j55\" (UID: \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\") " pod="openstack/neutron-db-sync-q6j55" Mar 20 09:06:08.179150 master-0 kubenswrapper[18707]: I0320 09:06:08.168365 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-combined-ca-bundle\") pod \"neutron-db-sync-q6j55\" (UID: \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\") " 
pod="openstack/neutron-db-sync-q6j55" Mar 20 09:06:08.181359 master-0 kubenswrapper[18707]: I0320 09:06:08.181229 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-etc-machine-id\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.186266 master-0 kubenswrapper[18707]: I0320 09:06:08.186054 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-config-data\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.192967 master-0 kubenswrapper[18707]: I0320 09:06:08.191902 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-scripts\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.195440 master-0 kubenswrapper[18707]: I0320 09:06:08.195388 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-combined-ca-bundle\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.195675 master-0 kubenswrapper[18707]: I0320 09:06:08.195505 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-q6j55"] Mar 20 09:06:08.205120 master-0 kubenswrapper[18707]: I0320 09:06:08.200685 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-db-sync-config-data\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.230937 master-0 kubenswrapper[18707]: I0320 09:06:08.227515 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm786\" (UniqueName: \"kubernetes.io/projected/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-kube-api-access-cm786\") pod \"cinder-c920a-db-sync-5sx9b\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") " pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.232672 master-0 kubenswrapper[18707]: I0320 09:06:08.232619 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:08.281612 master-0 kubenswrapper[18707]: I0320 09:06:08.274218 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8697\" (UniqueName: \"kubernetes.io/projected/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-kube-api-access-x8697\") pod \"neutron-db-sync-q6j55\" (UID: \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\") " pod="openstack/neutron-db-sync-q6j55" Mar 20 09:06:08.281612 master-0 kubenswrapper[18707]: I0320 09:06:08.274380 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-combined-ca-bundle\") pod \"neutron-db-sync-q6j55\" (UID: \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\") " pod="openstack/neutron-db-sync-q6j55" Mar 20 09:06:08.284743 master-0 kubenswrapper[18707]: I0320 09:06:08.284670 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-config\") pod \"neutron-db-sync-q6j55\" (UID: \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\") " pod="openstack/neutron-db-sync-q6j55" Mar 20 09:06:08.314268 
master-0 kubenswrapper[18707]: I0320 09:06:08.307607 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:08.314268 master-0 kubenswrapper[18707]: I0320 09:06:08.308865 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-config\") pod \"neutron-db-sync-q6j55\" (UID: \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\") " pod="openstack/neutron-db-sync-q6j55" Mar 20 09:06:08.314268 master-0 kubenswrapper[18707]: I0320 09:06:08.314162 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-combined-ca-bundle\") pod \"neutron-db-sync-q6j55\" (UID: \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\") " pod="openstack/neutron-db-sync-q6j55" Mar 20 09:06:08.333851 master-0 kubenswrapper[18707]: I0320 09:06:08.327059 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8697\" (UniqueName: \"kubernetes.io/projected/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-kube-api-access-x8697\") pod \"neutron-db-sync-q6j55\" (UID: \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\") " pod="openstack/neutron-db-sync-q6j55" Mar 20 09:06:08.333851 master-0 kubenswrapper[18707]: I0320 09:06:08.330628 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bd59cc7c7-jcbnm"] Mar 20 09:06:08.375386 master-0 kubenswrapper[18707]: I0320 09:06:08.374395 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-8jqss"] Mar 20 09:06:08.382084 master-0 kubenswrapper[18707]: I0320 09:06:08.380171 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.397253 master-0 kubenswrapper[18707]: I0320 09:06:08.388290 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 09:06:08.397253 master-0 kubenswrapper[18707]: I0320 09:06:08.388967 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 09:06:08.418312 master-0 kubenswrapper[18707]: I0320 09:06:08.417606 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8jqss"] Mar 20 09:06:08.431460 master-0 kubenswrapper[18707]: I0320 09:06:08.431128 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74db7c7d5f-t4vkw"] Mar 20 09:06:08.451706 master-0 kubenswrapper[18707]: I0320 09:06:08.435000 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.513064 master-0 kubenswrapper[18707]: I0320 09:06:08.512883 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-config\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.513347 master-0 kubenswrapper[18707]: I0320 09:06:08.513127 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdt7l\" (UniqueName: \"kubernetes.io/projected/01502368-38fe-40cc-9325-a7b83996fea1-kube-api-access-jdt7l\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.513347 master-0 kubenswrapper[18707]: I0320 09:06:08.513199 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-dns-swift-storage-0\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.513347 master-0 kubenswrapper[18707]: I0320 09:06:08.513329 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-edpm-a\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.513517 master-0 kubenswrapper[18707]: I0320 09:06:08.513360 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-combined-ca-bundle\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.513517 master-0 kubenswrapper[18707]: I0320 09:06:08.513451 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-dns-svc\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.514524 master-0 kubenswrapper[18707]: I0320 09:06:08.513845 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc64x\" (UniqueName: \"kubernetes.io/projected/6f5fb021-477c-4a7e-8f92-224e08645060-kube-api-access-lc64x\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.514524 master-0 kubenswrapper[18707]: I0320 09:06:08.513951 18707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-edpm-b\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.514524 master-0 kubenswrapper[18707]: I0320 09:06:08.513989 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-ovsdbserver-sb\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.514524 master-0 kubenswrapper[18707]: I0320 09:06:08.514123 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-scripts\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.514524 master-0 kubenswrapper[18707]: I0320 09:06:08.514340 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-config-data\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.514524 master-0 kubenswrapper[18707]: I0320 09:06:08.514452 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-ovsdbserver-nb\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.514524 master-0 
kubenswrapper[18707]: I0320 09:06:08.514510 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01502368-38fe-40cc-9325-a7b83996fea1-logs\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.526353 master-0 kubenswrapper[18707]: I0320 09:06:08.525198 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q6j55" Mar 20 09:06:08.551048 master-0 kubenswrapper[18707]: I0320 09:06:08.550957 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74db7c7d5f-t4vkw"] Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.618981 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-ovsdbserver-sb\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.619092 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-scripts\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.619162 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-config-data\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.619435 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-ovsdbserver-nb\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.619484 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01502368-38fe-40cc-9325-a7b83996fea1-logs\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.619554 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-config\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.619577 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdt7l\" (UniqueName: \"kubernetes.io/projected/01502368-38fe-40cc-9325-a7b83996fea1-kube-api-access-jdt7l\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.619598 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-dns-swift-storage-0\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.619640 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-edpm-a\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.619681 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-combined-ca-bundle\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.619731 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-dns-svc\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.619768 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc64x\" (UniqueName: \"kubernetes.io/projected/6f5fb021-477c-4a7e-8f92-224e08645060-kube-api-access-lc64x\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.619803 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-edpm-b\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.620125 18707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-ovsdbserver-sb\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.621475 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-edpm-a\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.621709 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-edpm-b\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.621767 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-config\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.621812 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01502368-38fe-40cc-9325-a7b83996fea1-logs\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.622489 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-ovsdbserver-nb\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.622786 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-dns-swift-storage-0\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.625598 master-0 kubenswrapper[18707]: I0320 09:06:08.622851 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-dns-svc\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.631071 master-0 kubenswrapper[18707]: I0320 09:06:08.629984 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-combined-ca-bundle\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.638081 master-0 kubenswrapper[18707]: I0320 09:06:08.637359 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-scripts\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.638081 master-0 kubenswrapper[18707]: I0320 09:06:08.637653 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-config-data\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.640845 master-0 kubenswrapper[18707]: I0320 09:06:08.640811 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdt7l\" (UniqueName: \"kubernetes.io/projected/01502368-38fe-40cc-9325-a7b83996fea1-kube-api-access-jdt7l\") pod \"placement-db-sync-8jqss\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.655966 master-0 kubenswrapper[18707]: I0320 09:06:08.655912 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc64x\" (UniqueName: \"kubernetes.io/projected/6f5fb021-477c-4a7e-8f92-224e08645060-kube-api-access-lc64x\") pod \"dnsmasq-dns-74db7c7d5f-t4vkw\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:08.749207 master-0 kubenswrapper[18707]: I0320 09:06:08.745569 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:08.812681 master-0 kubenswrapper[18707]: I0320 09:06:08.812624 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bd59cc7c7-jcbnm"] Mar 20 09:06:08.830444 master-0 kubenswrapper[18707]: I0320 09:06:08.827450 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:09.176682 master-0 kubenswrapper[18707]: I0320 09:06:09.176615 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-db-sync-5sx9b"] Mar 20 09:06:09.188440 master-0 kubenswrapper[18707]: W0320 09:06:09.187358 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f0eaaa4_197c_49d2_a7a7_1d8c2edee22a.slice/crio-cf4797b833da7e672e7881bbcc8a3ff2ffe3bd7e56643e6f780811d2dee23c09 WatchSource:0}: Error finding container cf4797b833da7e672e7881bbcc8a3ff2ffe3bd7e56643e6f780811d2dee23c09: Status 404 returned error can't find the container with id cf4797b833da7e672e7881bbcc8a3ff2ffe3bd7e56643e6f780811d2dee23c09 Mar 20 09:06:09.206940 master-0 kubenswrapper[18707]: I0320 09:06:09.206140 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xz2l7"] Mar 20 09:06:09.343434 master-0 kubenswrapper[18707]: I0320 09:06:09.343241 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-db-sync-5sx9b" event={"ID":"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a","Type":"ContainerStarted","Data":"cf4797b833da7e672e7881bbcc8a3ff2ffe3bd7e56643e6f780811d2dee23c09"} Mar 20 09:06:09.346809 master-0 kubenswrapper[18707]: I0320 09:06:09.346744 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xz2l7" event={"ID":"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b","Type":"ContainerStarted","Data":"78d833b9f076360723ee8674e22408e36b02eb8aa0a469527accc357b62f3e52"} Mar 20 09:06:09.349221 master-0 kubenswrapper[18707]: I0320 09:06:09.349152 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" event={"ID":"971889eb-1084-4735-b4c7-82b73df6877c","Type":"ContainerStarted","Data":"a70aac2a460d259a1bc82744126e69be55bf434fe69383ce41be9c209582f5da"} Mar 20 09:06:09.418723 master-0 
kubenswrapper[18707]: I0320 09:06:09.418683 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-q6j55"] Mar 20 09:06:09.559248 master-0 kubenswrapper[18707]: I0320 09:06:09.559066 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-8jqss"] Mar 20 09:06:09.691434 master-0 kubenswrapper[18707]: I0320 09:06:09.686552 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-27086-default-external-api-0"] Mar 20 09:06:09.691893 master-0 kubenswrapper[18707]: I0320 09:06:09.691837 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.695704 master-0 kubenswrapper[18707]: I0320 09:06:09.695620 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 09:06:09.696328 master-0 kubenswrapper[18707]: I0320 09:06:09.696113 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 09:06:09.738560 master-0 kubenswrapper[18707]: I0320 09:06:09.697473 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-27086-default-external-config-data" Mar 20 09:06:09.763360 master-0 kubenswrapper[18707]: I0320 09:06:09.763291 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-27086-default-external-api-0"] Mar 20 09:06:09.781419 master-0 kubenswrapper[18707]: W0320 09:06:09.781372 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f5fb021_477c_4a7e_8f92_224e08645060.slice/crio-976df8f52f81ed6312e330867f8f9fc51cabb338452c293ab99e773c53477a90 WatchSource:0}: Error finding container 976df8f52f81ed6312e330867f8f9fc51cabb338452c293ab99e773c53477a90: Status 404 returned error can't find the container with id 
976df8f52f81ed6312e330867f8f9fc51cabb338452c293ab99e773c53477a90 Mar 20 09:06:09.788067 master-0 kubenswrapper[18707]: I0320 09:06:09.787961 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a81f0821-ea21-442c-955a-94d2a715be34-httpd-run\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.788273 master-0 kubenswrapper[18707]: I0320 09:06:09.788165 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-public-tls-certs\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.788627 master-0 kubenswrapper[18707]: I0320 09:06:09.788354 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-combined-ca-bundle\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.788627 master-0 kubenswrapper[18707]: I0320 09:06:09.788410 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a81f0821-ea21-442c-955a-94d2a715be34-logs\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.788719 master-0 kubenswrapper[18707]: I0320 09:06:09.788661 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.788753 master-0 kubenswrapper[18707]: I0320 09:06:09.788714 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-scripts\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.788782 master-0 kubenswrapper[18707]: I0320 09:06:09.788756 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-config-data\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.788907 master-0 kubenswrapper[18707]: I0320 09:06:09.788889 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66gcs\" (UniqueName: \"kubernetes.io/projected/a81f0821-ea21-442c-955a-94d2a715be34-kube-api-access-66gcs\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.840889 master-0 kubenswrapper[18707]: I0320 09:06:09.840834 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74db7c7d5f-t4vkw"] Mar 20 09:06:09.893672 master-0 kubenswrapper[18707]: I0320 09:06:09.893384 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.893672 master-0 kubenswrapper[18707]: I0320 09:06:09.893439 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-scripts\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.893672 master-0 kubenswrapper[18707]: I0320 09:06:09.893483 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-config-data\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.893672 master-0 kubenswrapper[18707]: I0320 09:06:09.893598 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66gcs\" (UniqueName: \"kubernetes.io/projected/a81f0821-ea21-442c-955a-94d2a715be34-kube-api-access-66gcs\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.894168 master-0 kubenswrapper[18707]: I0320 09:06:09.893695 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a81f0821-ea21-442c-955a-94d2a715be34-httpd-run\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.894168 master-0 kubenswrapper[18707]: I0320 09:06:09.893773 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-public-tls-certs\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.894778 master-0 kubenswrapper[18707]: I0320 09:06:09.894750 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-combined-ca-bundle\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.894843 master-0 kubenswrapper[18707]: I0320 09:06:09.894791 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a81f0821-ea21-442c-955a-94d2a715be34-logs\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.895960 master-0 kubenswrapper[18707]: I0320 09:06:09.895480 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a81f0821-ea21-442c-955a-94d2a715be34-logs\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.897624 master-0 kubenswrapper[18707]: I0320 09:06:09.897566 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a81f0821-ea21-442c-955a-94d2a715be34-httpd-run\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.905760 master-0 kubenswrapper[18707]: I0320 09:06:09.905721 18707 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 09:06:09.905978 master-0 kubenswrapper[18707]: I0320 09:06:09.905768 18707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/83e46ef83a145054ad4570ca2d941bb2fcebc41478e778d97e5839befd9dd6f3/globalmount\"" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.906358 master-0 kubenswrapper[18707]: I0320 09:06:09.906166 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-combined-ca-bundle\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.908851 master-0 kubenswrapper[18707]: I0320 09:06:09.908748 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-config-data\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.911562 master-0 kubenswrapper[18707]: I0320 09:06:09.911490 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-public-tls-certs\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.914489 master-0 kubenswrapper[18707]: I0320 
09:06:09.914443 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-scripts\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:09.933542 master-0 kubenswrapper[18707]: I0320 09:06:09.931524 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66gcs\" (UniqueName: \"kubernetes.io/projected/a81f0821-ea21-442c-955a-94d2a715be34-kube-api-access-66gcs\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:10.373911 master-0 kubenswrapper[18707]: I0320 09:06:10.373844 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q6j55" event={"ID":"d739ce18-cf6c-4eeb-a609-7ab1acac00d2","Type":"ContainerStarted","Data":"f8f53fce1724ee6b0341e8dfcfa88d2d2109f71ae5fa72bfb98e12e6f054c67a"} Mar 20 09:06:10.373911 master-0 kubenswrapper[18707]: I0320 09:06:10.373916 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q6j55" event={"ID":"d739ce18-cf6c-4eeb-a609-7ab1acac00d2","Type":"ContainerStarted","Data":"756e8e4d38a2c55a80de8ad3850f0273efdbef21ea7b88d617e793e0fb3de942"} Mar 20 09:06:10.378314 master-0 kubenswrapper[18707]: I0320 09:06:10.378236 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8jqss" event={"ID":"01502368-38fe-40cc-9325-a7b83996fea1","Type":"ContainerStarted","Data":"d7c796e585567a87c335c002450b5ee63a8a432c7044e0ba9e36fce09555a231"} Mar 20 09:06:10.380659 master-0 kubenswrapper[18707]: I0320 09:06:10.380545 18707 generic.go:334] "Generic (PLEG): container finished" podID="6f5fb021-477c-4a7e-8f92-224e08645060" containerID="99c340d8e66ad7aa183d45533d1b40835ba835715c029a0d1a60cee94fce5093" exitCode=0 Mar 20 
09:06:10.380659 master-0 kubenswrapper[18707]: I0320 09:06:10.380624 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" event={"ID":"6f5fb021-477c-4a7e-8f92-224e08645060","Type":"ContainerDied","Data":"99c340d8e66ad7aa183d45533d1b40835ba835715c029a0d1a60cee94fce5093"}
Mar 20 09:06:10.380944 master-0 kubenswrapper[18707]: I0320 09:06:10.380680 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" event={"ID":"6f5fb021-477c-4a7e-8f92-224e08645060","Type":"ContainerStarted","Data":"976df8f52f81ed6312e330867f8f9fc51cabb338452c293ab99e773c53477a90"}
Mar 20 09:06:10.388517 master-0 kubenswrapper[18707]: I0320 09:06:10.387657 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xz2l7" event={"ID":"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b","Type":"ContainerStarted","Data":"61ce944c1e7a2281812d8c4fcb8f666a853916bb2320df168db63548e092bf3a"}
Mar 20 09:06:10.391761 master-0 kubenswrapper[18707]: I0320 09:06:10.391569 18707 generic.go:334] "Generic (PLEG): container finished" podID="971889eb-1084-4735-b4c7-82b73df6877c" containerID="c02fd363c33b2c90d280588afe54535a6d07bad5209421c6082835382c150b98" exitCode=0
Mar 20 09:06:10.391761 master-0 kubenswrapper[18707]: I0320 09:06:10.391619 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" event={"ID":"971889eb-1084-4735-b4c7-82b73df6877c","Type":"ContainerDied","Data":"c02fd363c33b2c90d280588afe54535a6d07bad5209421c6082835382c150b98"}
Mar 20 09:06:10.940720 master-0 kubenswrapper[18707]: I0320 09:06:10.940641 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm"
Mar 20 09:06:11.026292 master-0 kubenswrapper[18707]: I0320 09:06:11.026236 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-config\") pod \"971889eb-1084-4735-b4c7-82b73df6877c\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") "
Mar 20 09:06:11.026292 master-0 kubenswrapper[18707]: I0320 09:06:11.026291 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-edpm-b\") pod \"971889eb-1084-4735-b4c7-82b73df6877c\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") "
Mar 20 09:06:11.026577 master-0 kubenswrapper[18707]: I0320 09:06:11.026309 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-dns-svc\") pod \"971889eb-1084-4735-b4c7-82b73df6877c\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") "
Mar 20 09:06:11.026577 master-0 kubenswrapper[18707]: I0320 09:06:11.026358 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-ovsdbserver-nb\") pod \"971889eb-1084-4735-b4c7-82b73df6877c\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") "
Mar 20 09:06:11.026577 master-0 kubenswrapper[18707]: I0320 09:06:11.026510 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-ovsdbserver-sb\") pod \"971889eb-1084-4735-b4c7-82b73df6877c\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") "
Mar 20 09:06:11.026577 master-0 kubenswrapper[18707]: I0320 09:06:11.026534 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-edpm-a\") pod \"971889eb-1084-4735-b4c7-82b73df6877c\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") "
Mar 20 09:06:11.026704 master-0 kubenswrapper[18707]: I0320 09:06:11.026610 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-dns-swift-storage-0\") pod \"971889eb-1084-4735-b4c7-82b73df6877c\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") "
Mar 20 09:06:11.026740 master-0 kubenswrapper[18707]: I0320 09:06:11.026702 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbqbg\" (UniqueName: \"kubernetes.io/projected/971889eb-1084-4735-b4c7-82b73df6877c-kube-api-access-bbqbg\") pod \"971889eb-1084-4735-b4c7-82b73df6877c\" (UID: \"971889eb-1084-4735-b4c7-82b73df6877c\") "
Mar 20 09:06:11.031576 master-0 kubenswrapper[18707]: I0320 09:06:11.031488 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/971889eb-1084-4735-b4c7-82b73df6877c-kube-api-access-bbqbg" (OuterVolumeSpecName: "kube-api-access-bbqbg") pod "971889eb-1084-4735-b4c7-82b73df6877c" (UID: "971889eb-1084-4735-b4c7-82b73df6877c"). InnerVolumeSpecName "kube-api-access-bbqbg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:06:11.053750 master-0 kubenswrapper[18707]: I0320 09:06:11.053671 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "971889eb-1084-4735-b4c7-82b73df6877c" (UID: "971889eb-1084-4735-b4c7-82b73df6877c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:11.069766 master-0 kubenswrapper[18707]: I0320 09:06:11.069671 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "971889eb-1084-4735-b4c7-82b73df6877c" (UID: "971889eb-1084-4735-b4c7-82b73df6877c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:11.070231 master-0 kubenswrapper[18707]: I0320 09:06:11.070143 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-config" (OuterVolumeSpecName: "config") pod "971889eb-1084-4735-b4c7-82b73df6877c" (UID: "971889eb-1084-4735-b4c7-82b73df6877c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:11.072678 master-0 kubenswrapper[18707]: I0320 09:06:11.072622 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "971889eb-1084-4735-b4c7-82b73df6877c" (UID: "971889eb-1084-4735-b4c7-82b73df6877c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:11.072768 master-0 kubenswrapper[18707]: I0320 09:06:11.072678 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-edpm-b" (OuterVolumeSpecName: "edpm-b") pod "971889eb-1084-4735-b4c7-82b73df6877c" (UID: "971889eb-1084-4735-b4c7-82b73df6877c"). InnerVolumeSpecName "edpm-b". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:11.079120 master-0 kubenswrapper[18707]: I0320 09:06:11.079050 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "971889eb-1084-4735-b4c7-82b73df6877c" (UID: "971889eb-1084-4735-b4c7-82b73df6877c"). InnerVolumeSpecName "edpm-a". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:11.081206 master-0 kubenswrapper[18707]: I0320 09:06:11.081119 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "971889eb-1084-4735-b4c7-82b73df6877c" (UID: "971889eb-1084-4735-b4c7-82b73df6877c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:11.134441 master-0 kubenswrapper[18707]: I0320 09:06:11.134385 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.134441 master-0 kubenswrapper[18707]: I0320 09:06:11.134441 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-edpm-a\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.134683 master-0 kubenswrapper[18707]: I0320 09:06:11.134461 18707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.134683 master-0 kubenswrapper[18707]: I0320 09:06:11.134482 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbqbg\" (UniqueName: \"kubernetes.io/projected/971889eb-1084-4735-b4c7-82b73df6877c-kube-api-access-bbqbg\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.134683 master-0 kubenswrapper[18707]: I0320 09:06:11.134497 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-config\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.134683 master-0 kubenswrapper[18707]: I0320 09:06:11.134510 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-edpm-b\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.134683 master-0 kubenswrapper[18707]: I0320 09:06:11.134526 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.134683 master-0 kubenswrapper[18707]: I0320 09:06:11.134538 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/971889eb-1084-4735-b4c7-82b73df6877c-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.251414 master-0 kubenswrapper[18707]: I0320 09:06:11.251308 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-27086-default-external-api-0"]
Mar 20 09:06:11.253045 master-0 kubenswrapper[18707]: E0320 09:06:11.253003 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-27086-default-external-api-0" podUID="a81f0821-ea21-442c-955a-94d2a715be34"
Mar 20 09:06:11.312566 master-0 kubenswrapper[18707]: I0320 09:06:11.312515 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod \"glance-27086-default-external-api-0\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") " pod="openstack/glance-27086-default-external-api-0"
Mar 20 09:06:11.391820 master-0 kubenswrapper[18707]: E0320 09:06:11.391684 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice/crio-ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 09:06:11.431515 master-0 kubenswrapper[18707]: I0320 09:06:11.431457 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm" event={"ID":"971889eb-1084-4735-b4c7-82b73df6877c","Type":"ContainerDied","Data":"a70aac2a460d259a1bc82744126e69be55bf434fe69383ce41be9c209582f5da"}
Mar 20 09:06:11.431515 master-0 kubenswrapper[18707]: I0320 09:06:11.431525 18707 scope.go:117] "RemoveContainer" containerID="c02fd363c33b2c90d280588afe54535a6d07bad5209421c6082835382c150b98"
Mar 20 09:06:11.431809 master-0 kubenswrapper[18707]: I0320 09:06:11.431652 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm"
Mar 20 09:06:11.438028 master-0 kubenswrapper[18707]: I0320 09:06:11.437985 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" event={"ID":"6f5fb021-477c-4a7e-8f92-224e08645060","Type":"ContainerStarted","Data":"3280555efd3ec0463eb77d7ddf1f443095161c778750f56af1f757094b933059"}
Mar 20 09:06:11.438168 master-0 kubenswrapper[18707]: I0320 09:06:11.438034 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw"
Mar 20 09:06:11.438561 master-0 kubenswrapper[18707]: I0320 09:06:11.438497 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-27086-default-external-api-0"
Mar 20 09:06:11.449246 master-0 kubenswrapper[18707]: I0320 09:06:11.449204 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-27086-default-external-api-0"
Mar 20 09:06:11.547690 master-0 kubenswrapper[18707]: I0320 09:06:11.547603 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-scripts\") pod \"a81f0821-ea21-442c-955a-94d2a715be34\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") "
Mar 20 09:06:11.548107 master-0 kubenswrapper[18707]: I0320 09:06:11.547799 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-combined-ca-bundle\") pod \"a81f0821-ea21-442c-955a-94d2a715be34\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") "
Mar 20 09:06:11.548107 master-0 kubenswrapper[18707]: I0320 09:06:11.547976 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a81f0821-ea21-442c-955a-94d2a715be34-httpd-run\") pod \"a81f0821-ea21-442c-955a-94d2a715be34\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") "
Mar 20 09:06:11.548907 master-0 kubenswrapper[18707]: I0320 09:06:11.548841 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81f0821-ea21-442c-955a-94d2a715be34-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a81f0821-ea21-442c-955a-94d2a715be34" (UID: "a81f0821-ea21-442c-955a-94d2a715be34"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:06:11.549347 master-0 kubenswrapper[18707]: I0320 09:06:11.549294 18707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a81f0821-ea21-442c-955a-94d2a715be34-httpd-run\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.551215 master-0 kubenswrapper[18707]: I0320 09:06:11.551123 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a81f0821-ea21-442c-955a-94d2a715be34" (UID: "a81f0821-ea21-442c-955a-94d2a715be34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:11.557272 master-0 kubenswrapper[18707]: I0320 09:06:11.556151 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-scripts" (OuterVolumeSpecName: "scripts") pod "a81f0821-ea21-442c-955a-94d2a715be34" (UID: "a81f0821-ea21-442c-955a-94d2a715be34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:11.650743 master-0 kubenswrapper[18707]: I0320 09:06:11.650044 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-config-data\") pod \"a81f0821-ea21-442c-955a-94d2a715be34\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") "
Mar 20 09:06:11.650743 master-0 kubenswrapper[18707]: I0320 09:06:11.650155 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a81f0821-ea21-442c-955a-94d2a715be34-logs\") pod \"a81f0821-ea21-442c-955a-94d2a715be34\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") "
Mar 20 09:06:11.650743 master-0 kubenswrapper[18707]: I0320 09:06:11.650233 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66gcs\" (UniqueName: \"kubernetes.io/projected/a81f0821-ea21-442c-955a-94d2a715be34-kube-api-access-66gcs\") pod \"a81f0821-ea21-442c-955a-94d2a715be34\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") "
Mar 20 09:06:11.650743 master-0 kubenswrapper[18707]: I0320 09:06:11.650264 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-public-tls-certs\") pod \"a81f0821-ea21-442c-955a-94d2a715be34\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") "
Mar 20 09:06:11.650743 master-0 kubenswrapper[18707]: I0320 09:06:11.650712 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-scripts\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.650743 master-0 kubenswrapper[18707]: I0320 09:06:11.650730 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.651272 master-0 kubenswrapper[18707]: I0320 09:06:11.651132 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81f0821-ea21-442c-955a-94d2a715be34-logs" (OuterVolumeSpecName: "logs") pod "a81f0821-ea21-442c-955a-94d2a715be34" (UID: "a81f0821-ea21-442c-955a-94d2a715be34"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:06:11.653972 master-0 kubenswrapper[18707]: I0320 09:06:11.653837 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-config-data" (OuterVolumeSpecName: "config-data") pod "a81f0821-ea21-442c-955a-94d2a715be34" (UID: "a81f0821-ea21-442c-955a-94d2a715be34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:11.654460 master-0 kubenswrapper[18707]: I0320 09:06:11.654115 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81f0821-ea21-442c-955a-94d2a715be34-kube-api-access-66gcs" (OuterVolumeSpecName: "kube-api-access-66gcs") pod "a81f0821-ea21-442c-955a-94d2a715be34" (UID: "a81f0821-ea21-442c-955a-94d2a715be34"). InnerVolumeSpecName "kube-api-access-66gcs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:06:11.657761 master-0 kubenswrapper[18707]: I0320 09:06:11.657695 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a81f0821-ea21-442c-955a-94d2a715be34" (UID: "a81f0821-ea21-442c-955a-94d2a715be34"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:11.753264 master-0 kubenswrapper[18707]: I0320 09:06:11.753158 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-config-data\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.753264 master-0 kubenswrapper[18707]: I0320 09:06:11.753246 18707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a81f0821-ea21-442c-955a-94d2a715be34-logs\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.753264 master-0 kubenswrapper[18707]: I0320 09:06:11.753259 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66gcs\" (UniqueName: \"kubernetes.io/projected/a81f0821-ea21-442c-955a-94d2a715be34-kube-api-access-66gcs\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:11.753264 master-0 kubenswrapper[18707]: I0320 09:06:11.753271 18707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81f0821-ea21-442c-955a-94d2a715be34-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:12.250791 master-0 kubenswrapper[18707]: I0320 09:06:12.241599 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-q6j55" podStartSLOduration=5.241574166 podStartE2EDuration="5.241574166s" podCreationTimestamp="2026-03-20 09:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:12.20601696 +0000 UTC m=+1517.362197376" watchObservedRunningTime="2026-03-20 09:06:12.241574166 +0000 UTC m=+1517.397754532"
Mar 20 09:06:12.268413 master-0 kubenswrapper[18707]: I0320 09:06:12.268340 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-27086-default-internal-api-0"]
Mar 20 09:06:12.268890 master-0 kubenswrapper[18707]: E0320 09:06:12.268864 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="971889eb-1084-4735-b4c7-82b73df6877c" containerName="init"
Mar 20 09:06:12.268890 master-0 kubenswrapper[18707]: I0320 09:06:12.268883 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="971889eb-1084-4735-b4c7-82b73df6877c" containerName="init"
Mar 20 09:06:12.269149 master-0 kubenswrapper[18707]: I0320 09:06:12.269126 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="971889eb-1084-4735-b4c7-82b73df6877c" containerName="init"
Mar 20 09:06:12.270614 master-0 kubenswrapper[18707]: I0320 09:06:12.270589 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:12.280333 master-0 kubenswrapper[18707]: I0320 09:06:12.275282 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 09:06:12.280333 master-0 kubenswrapper[18707]: I0320 09:06:12.275573 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-27086-default-internal-config-data"
Mar 20 09:06:12.392563 master-0 kubenswrapper[18707]: I0320 09:06:12.392492 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-27086-default-internal-api-0"]
Mar 20 09:06:12.452334 master-0 kubenswrapper[18707]: I0320 09:06:12.452268 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-27086-default-external-api-0"
Mar 20 09:06:12.724467 master-0 kubenswrapper[18707]: I0320 09:06:12.724364 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xz2l7" podStartSLOduration=5.724339072 podStartE2EDuration="5.724339072s" podCreationTimestamp="2026-03-20 09:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:12.694502059 +0000 UTC m=+1517.850682425" watchObservedRunningTime="2026-03-20 09:06:12.724339072 +0000 UTC m=+1517.880519428"
Mar 20 09:06:13.000492 master-0 kubenswrapper[18707]: I0320 09:06:13.000389 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod \"a81f0821-ea21-442c-955a-94d2a715be34\" (UID: \"a81f0821-ea21-442c-955a-94d2a715be34\") "
Mar 20 09:06:13.000819 master-0 kubenswrapper[18707]: I0320 09:06:13.000793 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-logs\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.000937 master-0 kubenswrapper[18707]: I0320 09:06:13.000916 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-scripts\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.000972 master-0 kubenswrapper[18707]: I0320 09:06:13.000958 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj9gx\" (UniqueName: \"kubernetes.io/projected/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-kube-api-access-cj9gx\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.001029 master-0 kubenswrapper[18707]: I0320 09:06:13.001013 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-config-data\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.001094 master-0 kubenswrapper[18707]: I0320 09:06:13.001076 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ed7fc861-4795-473d-84e6-66068cd18122\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.001135 master-0 kubenswrapper[18707]: I0320 09:06:13.001103 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-combined-ca-bundle\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.001226 master-0 kubenswrapper[18707]: I0320 09:06:13.001203 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-httpd-run\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.001319 master-0 kubenswrapper[18707]: I0320 09:06:13.001302 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-internal-tls-certs\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.041282 master-0 kubenswrapper[18707]: I0320 09:06:13.041211 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e" (OuterVolumeSpecName: "glance") pod "a81f0821-ea21-442c-955a-94d2a715be34" (UID: "a81f0821-ea21-442c-955a-94d2a715be34"). InnerVolumeSpecName "pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 09:06:13.102960 master-0 kubenswrapper[18707]: I0320 09:06:13.102868 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-scripts\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.104805 master-0 kubenswrapper[18707]: I0320 09:06:13.103294 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj9gx\" (UniqueName: \"kubernetes.io/projected/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-kube-api-access-cj9gx\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.104805 master-0 kubenswrapper[18707]: I0320 09:06:13.103448 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-config-data\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.104805 master-0 kubenswrapper[18707]: I0320 09:06:13.103561 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-combined-ca-bundle\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.104805 master-0 kubenswrapper[18707]: I0320 09:06:13.103681 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-httpd-run\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.104805 master-0 kubenswrapper[18707]: I0320 09:06:13.103872 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-internal-tls-certs\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.104805 master-0 kubenswrapper[18707]: I0320 09:06:13.103904 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-logs\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.104805 master-0 kubenswrapper[18707]: I0320 09:06:13.104027 18707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") on node \"master-0\" "
Mar 20 09:06:13.104805 master-0 kubenswrapper[18707]: I0320 09:06:13.104423 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-logs\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.104805 master-0 kubenswrapper[18707]: I0320 09:06:13.104644 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-httpd-run\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.107733 master-0 kubenswrapper[18707]: I0320 09:06:13.107699 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-scripts\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.109110 master-0 kubenswrapper[18707]: I0320 09:06:13.108175 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-internal-tls-certs\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.109110 master-0 kubenswrapper[18707]: I0320 09:06:13.108524 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-combined-ca-bundle\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.111208 master-0 kubenswrapper[18707]: I0320 09:06:13.110442 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-config-data\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.139711 master-0 kubenswrapper[18707]: I0320 09:06:13.139668 18707 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 20 09:06:13.139912 master-0 kubenswrapper[18707]: I0320 09:06:13.139861 18707 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647" (UniqueName: "kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e") on node "master-0"
Mar 20 09:06:13.193222 master-0 kubenswrapper[18707]: I0320 09:06:13.191130 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj9gx\" (UniqueName: \"kubernetes.io/projected/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-kube-api-access-cj9gx\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.206071 master-0 kubenswrapper[18707]: I0320 09:06:13.206011 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ed7fc861-4795-473d-84e6-66068cd18122\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:13.206319 master-0 kubenswrapper[18707]: I0320 09:06:13.206128 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-27086-default-internal-api-0"]
Mar 20 09:06:13.206360 master-0 kubenswrapper[18707]: I0320 09:06:13.206263 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" podStartSLOduration=5.205297895 podStartE2EDuration="5.205297895s" podCreationTimestamp="2026-03-20 09:06:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:13.192348845 +0000 UTC m=+1518.348529201" watchObservedRunningTime="2026-03-20 09:06:13.205297895 +0000 UTC m=+1518.361478251"
Mar 20 09:06:13.207172 master-0 kubenswrapper[18707]: E0320 09:06:13.207134 18707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-27086-default-internal-api-0" podUID="35f5b085-d0e4-44eb-9d5a-bb7c14881a70"
Mar 20 09:06:13.207263 master-0 kubenswrapper[18707]: I0320 09:06:13.207212 18707 reconciler_common.go:293] "Volume detached for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:13.220652 master-0 kubenswrapper[18707]: I0320 09:06:13.220599 18707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
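Each record above follows the journald-plus-klog framing: a journal timestamp, the node name, the unit name with its PID, then klog's own header (severity letter, time, PID, `file.go:line]`) followed by the structured message. A minimal parsing sketch under that assumption (the regex is inferred from this capture, not an official format specification):

```python
import re

# Split one captured kubelet journal line into its fields.
# Field layout is an assumption based on the lines in this capture.
LINE_RE = re.compile(
    r"^(?P<ts>\w{3} \d{2} [\d:.]+) "          # journal timestamp, e.g. "Mar 20 09:06:11.431515"
    r"(?P<host>\S+) "                          # node name, e.g. "master-0"
    r"(?P<unit>\w+)\[(?P<pid>\d+)\]: "         # process[pid], e.g. "kubenswrapper[18707]"
    r"(?P<level>[IWEF])\d{4} [\d:.]+\s+\d+ "   # klog header: severity, time, thread id
    r"(?P<src>[\w.]+:\d+)\] "                  # source location, e.g. "scope.go:117"
    r"(?P<msg>.*)$"                            # structured key=value message
)

def parse(line: str):
    """Return a dict of fields, or None if the line does not match."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None

if __name__ == "__main__":
    line = ('Mar 20 09:06:10.940720 master-0 kubenswrapper[18707]: '
            'I0320 09:06:10.940641 18707 util.go:48] '
            '"No ready sandbox for pod can be found. Need to start a new one" '
            'pod="openstack/dnsmasq-dns-5bd59cc7c7-jcbnm"')
    print(parse(line)["src"])  # util.go:48
```

Splitting on the klog severity letter makes it easy to isolate the `E`-level entries (the `pod_workers.go:1301` sync errors above) from the routine `I`-level reconciler traffic.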
Mar 20 09:06:13.220881 master-0 kubenswrapper[18707]: I0320 09:06:13.220661 18707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ed7fc861-4795-473d-84e6-66068cd18122\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c5eaa71b76f2346f9535597ed8ece2d32bc4c6547a6a7c691a392bf9719026bd/globalmount\"" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:13.322951 master-0 kubenswrapper[18707]: I0320 09:06:13.320664 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bd59cc7c7-jcbnm"] Mar 20 09:06:13.338429 master-0 kubenswrapper[18707]: I0320 09:06:13.338331 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bd59cc7c7-jcbnm"] Mar 20 09:06:13.452343 master-0 kubenswrapper[18707]: I0320 09:06:13.449575 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-27086-default-external-api-0"] Mar 20 09:06:13.484337 master-0 kubenswrapper[18707]: I0320 09:06:13.472586 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:13.484337 master-0 kubenswrapper[18707]: I0320 09:06:13.481133 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-27086-default-external-api-0"] Mar 20 09:06:13.495573 master-0 kubenswrapper[18707]: I0320 09:06:13.495486 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:13.507018 master-0 kubenswrapper[18707]: I0320 09:06:13.506957 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-27086-default-external-api-0"] Mar 20 09:06:13.515745 master-0 kubenswrapper[18707]: I0320 09:06:13.513896 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.519889 master-0 kubenswrapper[18707]: I0320 09:06:13.517615 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 09:06:13.519889 master-0 kubenswrapper[18707]: I0320 09:06:13.518456 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-27086-default-external-config-data" Mar 20 09:06:13.528210 master-0 kubenswrapper[18707]: I0320 09:06:13.524107 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-27086-default-external-api-0"] Mar 20 09:06:13.530961 master-0 kubenswrapper[18707]: I0320 09:06:13.529379 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-httpd-run\") pod \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " Mar 20 09:06:13.530961 master-0 kubenswrapper[18707]: I0320 09:06:13.529587 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-logs\") pod \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " Mar 20 09:06:13.530961 master-0 kubenswrapper[18707]: I0320 09:06:13.529668 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj9gx\" (UniqueName: 
\"kubernetes.io/projected/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-kube-api-access-cj9gx\") pod \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " Mar 20 09:06:13.530961 master-0 kubenswrapper[18707]: I0320 09:06:13.530060 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "35f5b085-d0e4-44eb-9d5a-bb7c14881a70" (UID: "35f5b085-d0e4-44eb-9d5a-bb7c14881a70"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:06:13.530961 master-0 kubenswrapper[18707]: I0320 09:06:13.530378 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-logs" (OuterVolumeSpecName: "logs") pod "35f5b085-d0e4-44eb-9d5a-bb7c14881a70" (UID: "35f5b085-d0e4-44eb-9d5a-bb7c14881a70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:06:13.531321 master-0 kubenswrapper[18707]: I0320 09:06:13.531055 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-combined-ca-bundle\") pod \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " Mar 20 09:06:13.531321 master-0 kubenswrapper[18707]: I0320 09:06:13.531136 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-config-data\") pod \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " Mar 20 09:06:13.531321 master-0 kubenswrapper[18707]: I0320 09:06:13.531286 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-scripts\") pod \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " Mar 20 09:06:13.531499 master-0 kubenswrapper[18707]: I0320 09:06:13.531349 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-internal-tls-certs\") pod \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " Mar 20 09:06:13.532143 master-0 kubenswrapper[18707]: I0320 09:06:13.531969 18707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:13.532143 master-0 kubenswrapper[18707]: I0320 09:06:13.531990 18707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-logs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:13.536840 master-0 kubenswrapper[18707]: I0320 09:06:13.535494 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35f5b085-d0e4-44eb-9d5a-bb7c14881a70" (UID: "35f5b085-d0e4-44eb-9d5a-bb7c14881a70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:13.536840 master-0 kubenswrapper[18707]: I0320 09:06:13.536585 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-kube-api-access-cj9gx" (OuterVolumeSpecName: "kube-api-access-cj9gx") pod "35f5b085-d0e4-44eb-9d5a-bb7c14881a70" (UID: "35f5b085-d0e4-44eb-9d5a-bb7c14881a70"). InnerVolumeSpecName "kube-api-access-cj9gx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:13.550327 master-0 kubenswrapper[18707]: I0320 09:06:13.548399 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "35f5b085-d0e4-44eb-9d5a-bb7c14881a70" (UID: "35f5b085-d0e4-44eb-9d5a-bb7c14881a70"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:13.550553 master-0 kubenswrapper[18707]: I0320 09:06:13.550394 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-scripts" (OuterVolumeSpecName: "scripts") pod "35f5b085-d0e4-44eb-9d5a-bb7c14881a70" (UID: "35f5b085-d0e4-44eb-9d5a-bb7c14881a70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:13.550553 master-0 kubenswrapper[18707]: I0320 09:06:13.550435 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-config-data" (OuterVolumeSpecName: "config-data") pod "35f5b085-d0e4-44eb-9d5a-bb7c14881a70" (UID: "35f5b085-d0e4-44eb-9d5a-bb7c14881a70"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:13.634705 master-0 kubenswrapper[18707]: I0320 09:06:13.634619 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-scripts\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.635004 master-0 kubenswrapper[18707]: I0320 09:06:13.634724 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-combined-ca-bundle\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.635004 master-0 kubenswrapper[18707]: I0320 09:06:13.634750 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-public-tls-certs\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.635004 master-0 kubenswrapper[18707]: I0320 09:06:13.634792 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-config-data\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.635004 master-0 kubenswrapper[18707]: I0320 09:06:13.634819 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zlh8\" (UniqueName: 
\"kubernetes.io/projected/dbd0581a-055a-4728-8979-e0cfd24dd897-kube-api-access-5zlh8\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.635004 master-0 kubenswrapper[18707]: I0320 09:06:13.634876 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbd0581a-055a-4728-8979-e0cfd24dd897-logs\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.635004 master-0 kubenswrapper[18707]: I0320 09:06:13.634917 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.635004 master-0 kubenswrapper[18707]: I0320 09:06:13.634954 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbd0581a-055a-4728-8979-e0cfd24dd897-httpd-run\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.635501 master-0 kubenswrapper[18707]: I0320 09:06:13.635017 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:13.635501 master-0 kubenswrapper[18707]: I0320 09:06:13.635030 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:13.635501 master-0 kubenswrapper[18707]: I0320 09:06:13.635040 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:13.635501 master-0 kubenswrapper[18707]: I0320 09:06:13.635050 18707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:13.635501 master-0 kubenswrapper[18707]: I0320 09:06:13.635061 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj9gx\" (UniqueName: \"kubernetes.io/projected/35f5b085-d0e4-44eb-9d5a-bb7c14881a70-kube-api-access-cj9gx\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:13.752080 master-0 kubenswrapper[18707]: I0320 09:06:13.739726 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-scripts\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.752080 master-0 kubenswrapper[18707]: I0320 09:06:13.739862 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-combined-ca-bundle\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.752080 master-0 kubenswrapper[18707]: I0320 09:06:13.739898 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-public-tls-certs\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.752080 master-0 kubenswrapper[18707]: I0320 09:06:13.739952 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-config-data\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.752080 master-0 kubenswrapper[18707]: I0320 09:06:13.739991 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zlh8\" (UniqueName: \"kubernetes.io/projected/dbd0581a-055a-4728-8979-e0cfd24dd897-kube-api-access-5zlh8\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.752080 master-0 kubenswrapper[18707]: I0320 09:06:13.740079 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbd0581a-055a-4728-8979-e0cfd24dd897-logs\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.752080 master-0 kubenswrapper[18707]: I0320 09:06:13.740155 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.752080 master-0 kubenswrapper[18707]: I0320 09:06:13.740222 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbd0581a-055a-4728-8979-e0cfd24dd897-httpd-run\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.752080 master-0 kubenswrapper[18707]: I0320 09:06:13.740845 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbd0581a-055a-4728-8979-e0cfd24dd897-httpd-run\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.752080 master-0 kubenswrapper[18707]: I0320 09:06:13.743221 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbd0581a-055a-4728-8979-e0cfd24dd897-logs\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.752080 master-0 kubenswrapper[18707]: I0320 09:06:13.750757 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-combined-ca-bundle\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.752080 master-0 kubenswrapper[18707]: I0320 09:06:13.752010 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-scripts\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.760529 master-0 kubenswrapper[18707]: I0320 09:06:13.759900 
18707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 09:06:13.760529 master-0 kubenswrapper[18707]: I0320 09:06:13.759959 18707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/83e46ef83a145054ad4570ca2d941bb2fcebc41478e778d97e5839befd9dd6f3/globalmount\"" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.760529 master-0 kubenswrapper[18707]: I0320 09:06:13.760131 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-public-tls-certs\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.760529 master-0 kubenswrapper[18707]: I0320 09:06:13.760479 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-config-data\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:13.787312 master-0 kubenswrapper[18707]: I0320 09:06:13.787251 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zlh8\" (UniqueName: \"kubernetes.io/projected/dbd0581a-055a-4728-8979-e0cfd24dd897-kube-api-access-5zlh8\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:14.484530 master-0 
kubenswrapper[18707]: I0320 09:06:14.484374 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:14.756576 master-0 kubenswrapper[18707]: I0320 09:06:14.755802 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ed7fc861-4795-473d-84e6-66068cd18122\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") pod \"glance-27086-default-internal-api-0\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:14.895872 master-0 kubenswrapper[18707]: I0320 09:06:14.895811 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") pod \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\" (UID: \"35f5b085-d0e4-44eb-9d5a-bb7c14881a70\") " Mar 20 09:06:15.111493 master-0 kubenswrapper[18707]: I0320 09:06:15.111419 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="971889eb-1084-4735-b4c7-82b73df6877c" path="/var/lib/kubelet/pods/971889eb-1084-4735-b4c7-82b73df6877c/volumes" Mar 20 09:06:15.112538 master-0 kubenswrapper[18707]: I0320 09:06:15.112492 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81f0821-ea21-442c-955a-94d2a715be34" path="/var/lib/kubelet/pods/a81f0821-ea21-442c-955a-94d2a715be34/volumes" Mar 20 09:06:15.191658 master-0 kubenswrapper[18707]: I0320 09:06:15.191482 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-27086-default-internal-api-0"] Mar 20 09:06:15.221288 master-0 kubenswrapper[18707]: I0320 09:06:15.221220 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-27086-default-internal-api-0"] Mar 20 09:06:15.241117 master-0 kubenswrapper[18707]: I0320 09:06:15.238382 18707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-27086-default-internal-api-0"] Mar 20 09:06:15.270748 master-0 kubenswrapper[18707]: I0320 09:06:15.270686 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-27086-default-internal-api-0"] Mar 20 09:06:15.271099 master-0 kubenswrapper[18707]: I0320 09:06:15.270802 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.273891 master-0 kubenswrapper[18707]: I0320 09:06:15.273803 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 09:06:15.274741 master-0 kubenswrapper[18707]: I0320 09:06:15.273926 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-27086-default-internal-config-data" Mar 20 09:06:15.311882 master-0 kubenswrapper[18707]: I0320 09:06:15.310612 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59x5s\" (UniqueName: \"kubernetes.io/projected/dcee9936-db51-42d9-ada7-279cec63d27e-kube-api-access-59x5s\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.311882 master-0 kubenswrapper[18707]: I0320 09:06:15.310707 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-scripts\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.311882 master-0 kubenswrapper[18707]: I0320 09:06:15.310734 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dcee9936-db51-42d9-ada7-279cec63d27e-httpd-run\") pod 
\"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.311882 master-0 kubenswrapper[18707]: I0320 09:06:15.310773 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-combined-ca-bundle\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.311882 master-0 kubenswrapper[18707]: I0320 09:06:15.310808 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcee9936-db51-42d9-ada7-279cec63d27e-logs\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.311882 master-0 kubenswrapper[18707]: I0320 09:06:15.310853 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-internal-tls-certs\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.311882 master-0 kubenswrapper[18707]: I0320 09:06:15.310977 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-config-data\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.413710 master-0 kubenswrapper[18707]: I0320 09:06:15.413383 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-59x5s\" (UniqueName: \"kubernetes.io/projected/dcee9936-db51-42d9-ada7-279cec63d27e-kube-api-access-59x5s\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.413944 master-0 kubenswrapper[18707]: I0320 09:06:15.413803 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dcee9936-db51-42d9-ada7-279cec63d27e-httpd-run\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.413944 master-0 kubenswrapper[18707]: I0320 09:06:15.413840 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-scripts\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.413944 master-0 kubenswrapper[18707]: I0320 09:06:15.413889 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-combined-ca-bundle\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.413944 master-0 kubenswrapper[18707]: I0320 09:06:15.413936 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcee9936-db51-42d9-ada7-279cec63d27e-logs\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.414146 master-0 
kubenswrapper[18707]: I0320 09:06:15.413985 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-internal-tls-certs\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.414536 master-0 kubenswrapper[18707]: I0320 09:06:15.414506 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-config-data\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.414964 master-0 kubenswrapper[18707]: I0320 09:06:15.414914 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcee9936-db51-42d9-ada7-279cec63d27e-logs\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.415806 master-0 kubenswrapper[18707]: I0320 09:06:15.415783 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dcee9936-db51-42d9-ada7-279cec63d27e-httpd-run\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.422861 master-0 kubenswrapper[18707]: I0320 09:06:15.422789 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-scripts\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 
09:06:15.425619 master-0 kubenswrapper[18707]: I0320 09:06:15.425564 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-internal-tls-certs\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.426362 master-0 kubenswrapper[18707]: I0320 09:06:15.426326 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-config-data\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.434307 master-0 kubenswrapper[18707]: I0320 09:06:15.430087 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-combined-ca-bundle\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.440678 master-0 kubenswrapper[18707]: I0320 09:06:15.440602 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59x5s\" (UniqueName: \"kubernetes.io/projected/dcee9936-db51-42d9-ada7-279cec63d27e-kube-api-access-59x5s\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.709425 master-0 kubenswrapper[18707]: I0320 09:06:15.708845 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526" (OuterVolumeSpecName: "glance") pod "35f5b085-d0e4-44eb-9d5a-bb7c14881a70" (UID: "35f5b085-d0e4-44eb-9d5a-bb7c14881a70"). 
InnerVolumeSpecName "pvc-ed7fc861-4795-473d-84e6-66068cd18122". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 09:06:15.712551 master-0 kubenswrapper[18707]: I0320 09:06:15.712499 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod \"glance-27086-default-external-api-0\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:15.725328 master-0 kubenswrapper[18707]: I0320 09:06:15.725224 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ed7fc861-4795-473d-84e6-66068cd18122\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:15.999092 master-0 kubenswrapper[18707]: I0320 09:06:15.998963 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:16.722427 master-0 kubenswrapper[18707]: I0320 09:06:16.722340 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ed7fc861-4795-473d-84e6-66068cd18122\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") pod \"glance-27086-default-internal-api-0\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:16.839156 master-0 kubenswrapper[18707]: I0320 09:06:16.839001 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:17.107171 master-0 kubenswrapper[18707]: I0320 09:06:17.107053 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35f5b085-d0e4-44eb-9d5a-bb7c14881a70" path="/var/lib/kubelet/pods/35f5b085-d0e4-44eb-9d5a-bb7c14881a70/volumes" Mar 20 09:06:17.309086 master-0 kubenswrapper[18707]: E0320 09:06:17.308840 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice/crio-ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice\": RecentStats: unable to find data in memory cache]" Mar 20 09:06:18.829513 master-0 kubenswrapper[18707]: I0320 09:06:18.829383 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:06:18.917604 master-0 kubenswrapper[18707]: I0320 09:06:18.917546 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b5b845b79-xn624"] Mar 20 09:06:18.918003 master-0 kubenswrapper[18707]: I0320 09:06:18.917964 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b5b845b79-xn624" podUID="28a3f9e1-5276-45c1-b6f3-94d2f09a223e" containerName="dnsmasq-dns" containerID="cri-o://e6b01f4b941175aade692b3f3edd18d0394a369525031c12bfced0cdaa0250f0" gracePeriod=10 Mar 20 09:06:19.159175 master-0 kubenswrapper[18707]: I0320 09:06:19.159013 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b5b845b79-xn624" podUID="28a3f9e1-5276-45c1-b6f3-94d2f09a223e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.203:5353: connect: 
connection refused" Mar 20 09:06:21.589211 master-0 kubenswrapper[18707]: I0320 09:06:21.588405 18707 generic.go:334] "Generic (PLEG): container finished" podID="c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b" containerID="61ce944c1e7a2281812d8c4fcb8f666a853916bb2320df168db63548e092bf3a" exitCode=0 Mar 20 09:06:21.589211 master-0 kubenswrapper[18707]: I0320 09:06:21.588497 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xz2l7" event={"ID":"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b","Type":"ContainerDied","Data":"61ce944c1e7a2281812d8c4fcb8f666a853916bb2320df168db63548e092bf3a"} Mar 20 09:06:24.158403 master-0 kubenswrapper[18707]: I0320 09:06:24.158343 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b5b845b79-xn624" podUID="28a3f9e1-5276-45c1-b6f3-94d2f09a223e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.203:5353: connect: connection refused" Mar 20 09:06:26.645103 master-0 kubenswrapper[18707]: E0320 09:06:26.645031 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice/crio-ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e\": RecentStats: unable to find data in memory cache]" Mar 20 09:06:27.375847 master-0 kubenswrapper[18707]: E0320 09:06:27.375783 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice/crio-ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e\": RecentStats: unable to find data in memory cache]" Mar 20 09:06:28.592101 master-0 kubenswrapper[18707]: I0320 09:06:28.592052 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:28.636058 master-0 kubenswrapper[18707]: I0320 09:06:28.635996 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-config-data\") pod \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " Mar 20 09:06:28.636285 master-0 kubenswrapper[18707]: I0320 09:06:28.636107 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-credential-keys\") pod \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " Mar 20 09:06:28.636285 master-0 kubenswrapper[18707]: I0320 09:06:28.636177 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-fernet-keys\") pod \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " Mar 20 09:06:28.636285 master-0 kubenswrapper[18707]: I0320 09:06:28.636260 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m44c\" (UniqueName: \"kubernetes.io/projected/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-kube-api-access-4m44c\") pod \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " Mar 20 09:06:28.636405 master-0 kubenswrapper[18707]: I0320 09:06:28.636334 18707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-scripts\") pod \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " Mar 20 09:06:28.636405 master-0 kubenswrapper[18707]: I0320 09:06:28.636379 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-combined-ca-bundle\") pod \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\" (UID: \"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b\") " Mar 20 09:06:28.641153 master-0 kubenswrapper[18707]: I0320 09:06:28.641065 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b" (UID: "c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:28.641928 master-0 kubenswrapper[18707]: I0320 09:06:28.641896 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-scripts" (OuterVolumeSpecName: "scripts") pod "c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b" (UID: "c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:28.652384 master-0 kubenswrapper[18707]: I0320 09:06:28.642399 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-kube-api-access-4m44c" (OuterVolumeSpecName: "kube-api-access-4m44c") pod "c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b" (UID: "c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b"). InnerVolumeSpecName "kube-api-access-4m44c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:28.652384 master-0 kubenswrapper[18707]: I0320 09:06:28.643273 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b" (UID: "c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:28.680300 master-0 kubenswrapper[18707]: I0320 09:06:28.680013 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b" (UID: "c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:28.694926 master-0 kubenswrapper[18707]: I0320 09:06:28.694848 18707 generic.go:334] "Generic (PLEG): container finished" podID="28a3f9e1-5276-45c1-b6f3-94d2f09a223e" containerID="e6b01f4b941175aade692b3f3edd18d0394a369525031c12bfced0cdaa0250f0" exitCode=0 Mar 20 09:06:28.695276 master-0 kubenswrapper[18707]: I0320 09:06:28.694932 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b5b845b79-xn624" event={"ID":"28a3f9e1-5276-45c1-b6f3-94d2f09a223e","Type":"ContainerDied","Data":"e6b01f4b941175aade692b3f3edd18d0394a369525031c12bfced0cdaa0250f0"} Mar 20 09:06:28.701418 master-0 kubenswrapper[18707]: I0320 09:06:28.701359 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xz2l7" event={"ID":"c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b","Type":"ContainerDied","Data":"78d833b9f076360723ee8674e22408e36b02eb8aa0a469527accc357b62f3e52"} Mar 20 09:06:28.701418 master-0 kubenswrapper[18707]: I0320 09:06:28.701413 18707 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="78d833b9f076360723ee8674e22408e36b02eb8aa0a469527accc357b62f3e52" Mar 20 09:06:28.701659 master-0 kubenswrapper[18707]: I0320 09:06:28.701446 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xz2l7" Mar 20 09:06:28.734333 master-0 kubenswrapper[18707]: I0320 09:06:28.734261 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-config-data" (OuterVolumeSpecName: "config-data") pod "c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b" (UID: "c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:28.753091 master-0 kubenswrapper[18707]: I0320 09:06:28.753040 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:28.753418 master-0 kubenswrapper[18707]: I0320 09:06:28.753114 18707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-credential-keys\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:28.753418 master-0 kubenswrapper[18707]: I0320 09:06:28.753130 18707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-fernet-keys\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:28.753418 master-0 kubenswrapper[18707]: I0320 09:06:28.753144 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m44c\" (UniqueName: \"kubernetes.io/projected/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-kube-api-access-4m44c\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:28.753418 master-0 kubenswrapper[18707]: I0320 09:06:28.753236 18707 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:28.753418 master-0 kubenswrapper[18707]: I0320 09:06:28.753249 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:29.809214 master-0 kubenswrapper[18707]: I0320 09:06:29.808994 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xz2l7"] Mar 20 09:06:29.822066 master-0 kubenswrapper[18707]: I0320 09:06:29.818521 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xz2l7"] Mar 20 09:06:29.933061 master-0 kubenswrapper[18707]: I0320 09:06:29.927951 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-x7bjf"] Mar 20 09:06:29.933061 master-0 kubenswrapper[18707]: E0320 09:06:29.928510 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b" containerName="keystone-bootstrap" Mar 20 09:06:29.933061 master-0 kubenswrapper[18707]: I0320 09:06:29.928525 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b" containerName="keystone-bootstrap" Mar 20 09:06:29.933061 master-0 kubenswrapper[18707]: I0320 09:06:29.928801 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b" containerName="keystone-bootstrap" Mar 20 09:06:29.933061 master-0 kubenswrapper[18707]: I0320 09:06:29.929636 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:29.933061 master-0 kubenswrapper[18707]: I0320 09:06:29.932503 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 09:06:29.934231 master-0 kubenswrapper[18707]: I0320 09:06:29.934171 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 09:06:29.936683 master-0 kubenswrapper[18707]: I0320 09:06:29.936480 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 09:06:29.936942 master-0 kubenswrapper[18707]: I0320 09:06:29.936809 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 09:06:29.968376 master-0 kubenswrapper[18707]: I0320 09:06:29.964827 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x7bjf"] Mar 20 09:06:29.984873 master-0 kubenswrapper[18707]: I0320 09:06:29.983760 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-credential-keys\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:29.984873 master-0 kubenswrapper[18707]: I0320 09:06:29.983838 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-combined-ca-bundle\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:29.984873 master-0 kubenswrapper[18707]: I0320 09:06:29.983963 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-config-data\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:29.987164 master-0 kubenswrapper[18707]: I0320 09:06:29.986260 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-scripts\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:29.987164 master-0 kubenswrapper[18707]: I0320 09:06:29.986450 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-fernet-keys\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:29.987164 master-0 kubenswrapper[18707]: I0320 09:06:29.986479 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-242s4\" (UniqueName: \"kubernetes.io/projected/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-kube-api-access-242s4\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.075401 master-0 kubenswrapper[18707]: I0320 09:06:30.075327 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:06:30.092865 master-0 kubenswrapper[18707]: I0320 09:06:30.092780 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-scripts\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.093082 master-0 kubenswrapper[18707]: I0320 09:06:30.092998 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-fernet-keys\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.094274 master-0 kubenswrapper[18707]: I0320 09:06:30.093030 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-242s4\" (UniqueName: \"kubernetes.io/projected/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-kube-api-access-242s4\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.094274 master-0 kubenswrapper[18707]: I0320 09:06:30.093345 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-credential-keys\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.094274 master-0 kubenswrapper[18707]: I0320 09:06:30.093386 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-combined-ca-bundle\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " 
pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.095022 master-0 kubenswrapper[18707]: I0320 09:06:30.094700 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-config-data\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.111070 master-0 kubenswrapper[18707]: I0320 09:06:30.111020 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-fernet-keys\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.117297 master-0 kubenswrapper[18707]: I0320 09:06:30.117217 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-combined-ca-bundle\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.119734 master-0 kubenswrapper[18707]: I0320 09:06:30.119669 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-scripts\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.121874 master-0 kubenswrapper[18707]: I0320 09:06:30.121782 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-242s4\" (UniqueName: \"kubernetes.io/projected/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-kube-api-access-242s4\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.135777 master-0 
kubenswrapper[18707]: I0320 09:06:30.135712 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-config-data\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.143871 master-0 kubenswrapper[18707]: I0320 09:06:30.143626 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-credential-keys\") pod \"keystone-bootstrap-x7bjf\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") " pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.196994 master-0 kubenswrapper[18707]: I0320 09:06:30.196934 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-dns-swift-storage-0\") pod \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " Mar 20 09:06:30.197282 master-0 kubenswrapper[18707]: I0320 09:06:30.197239 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-dns-svc\") pod \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " Mar 20 09:06:30.197373 master-0 kubenswrapper[18707]: I0320 09:06:30.197350 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-edpm-a\") pod \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " Mar 20 09:06:30.197415 master-0 kubenswrapper[18707]: I0320 09:06:30.197385 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-ovsdbserver-sb\") pod \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " Mar 20 09:06:30.197522 master-0 kubenswrapper[18707]: I0320 09:06:30.197500 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-ovsdbserver-nb\") pod \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " Mar 20 09:06:30.197613 master-0 kubenswrapper[18707]: I0320 09:06:30.197589 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlcd6\" (UniqueName: \"kubernetes.io/projected/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-kube-api-access-hlcd6\") pod \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " Mar 20 09:06:30.197656 master-0 kubenswrapper[18707]: I0320 09:06:30.197630 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-edpm-b\") pod \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " Mar 20 09:06:30.198241 master-0 kubenswrapper[18707]: I0320 09:06:30.197742 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-config\") pod \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\" (UID: \"28a3f9e1-5276-45c1-b6f3-94d2f09a223e\") " Mar 20 09:06:30.206832 master-0 kubenswrapper[18707]: I0320 09:06:30.206787 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-kube-api-access-hlcd6" (OuterVolumeSpecName: "kube-api-access-hlcd6") pod "28a3f9e1-5276-45c1-b6f3-94d2f09a223e" (UID: 
"28a3f9e1-5276-45c1-b6f3-94d2f09a223e"). InnerVolumeSpecName "kube-api-access-hlcd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:30.254308 master-0 kubenswrapper[18707]: I0320 09:06:30.254169 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "28a3f9e1-5276-45c1-b6f3-94d2f09a223e" (UID: "28a3f9e1-5276-45c1-b6f3-94d2f09a223e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:30.259123 master-0 kubenswrapper[18707]: I0320 09:06:30.258698 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "28a3f9e1-5276-45c1-b6f3-94d2f09a223e" (UID: "28a3f9e1-5276-45c1-b6f3-94d2f09a223e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:30.259123 master-0 kubenswrapper[18707]: I0320 09:06:30.258916 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "28a3f9e1-5276-45c1-b6f3-94d2f09a223e" (UID: "28a3f9e1-5276-45c1-b6f3-94d2f09a223e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:30.260205 master-0 kubenswrapper[18707]: I0320 09:06:30.260165 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-config" (OuterVolumeSpecName: "config") pod "28a3f9e1-5276-45c1-b6f3-94d2f09a223e" (UID: "28a3f9e1-5276-45c1-b6f3-94d2f09a223e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:30.273224 master-0 kubenswrapper[18707]: I0320 09:06:30.273160 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-edpm-b" (OuterVolumeSpecName: "edpm-b") pod "28a3f9e1-5276-45c1-b6f3-94d2f09a223e" (UID: "28a3f9e1-5276-45c1-b6f3-94d2f09a223e"). InnerVolumeSpecName "edpm-b". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:30.277412 master-0 kubenswrapper[18707]: I0320 09:06:30.276912 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "28a3f9e1-5276-45c1-b6f3-94d2f09a223e" (UID: "28a3f9e1-5276-45c1-b6f3-94d2f09a223e"). InnerVolumeSpecName "edpm-a". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:30.280041 master-0 kubenswrapper[18707]: I0320 09:06:30.280001 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "28a3f9e1-5276-45c1-b6f3-94d2f09a223e" (UID: "28a3f9e1-5276-45c1-b6f3-94d2f09a223e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:30.286712 master-0 kubenswrapper[18707]: I0320 09:06:30.284982 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x7bjf" Mar 20 09:06:30.303008 master-0 kubenswrapper[18707]: I0320 09:06:30.302957 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:30.303231 master-0 kubenswrapper[18707]: I0320 09:06:30.303010 18707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:30.303231 master-0 kubenswrapper[18707]: I0320 09:06:30.303028 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:30.303231 master-0 kubenswrapper[18707]: I0320 09:06:30.303040 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:30.303231 master-0 kubenswrapper[18707]: I0320 09:06:30.303054 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:30.303231 master-0 kubenswrapper[18707]: I0320 09:06:30.303068 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:30.303231 master-0 kubenswrapper[18707]: I0320 09:06:30.303079 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlcd6\" (UniqueName: 
\"kubernetes.io/projected/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-kube-api-access-hlcd6\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:30.303231 master-0 kubenswrapper[18707]: I0320 09:06:30.303092 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/28a3f9e1-5276-45c1-b6f3-94d2f09a223e-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:30.374233 master-0 kubenswrapper[18707]: I0320 09:06:30.374157 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-27086-default-external-api-0"] Mar 20 09:06:30.376336 master-0 kubenswrapper[18707]: W0320 09:06:30.376257 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbd0581a_055a_4728_8979_e0cfd24dd897.slice/crio-3ed72e9fb703d6eef89fbb39d6abf266470f90b00500eaccc685f161e32630ce WatchSource:0}: Error finding container 3ed72e9fb703d6eef89fbb39d6abf266470f90b00500eaccc685f161e32630ce: Status 404 returned error can't find the container with id 3ed72e9fb703d6eef89fbb39d6abf266470f90b00500eaccc685f161e32630ce Mar 20 09:06:30.627227 master-0 kubenswrapper[18707]: I0320 09:06:30.624459 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-27086-default-internal-api-0"] Mar 20 09:06:30.749210 master-0 kubenswrapper[18707]: I0320 09:06:30.745937 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" event={"ID":"c177a9fe-76d7-4325-8968-c7178ad8c75a","Type":"ContainerStarted","Data":"4b624e139d528f49877f9a2c1de9dba1b5ca7a444de67d80f04ec0832af4b4a9"} Mar 20 09:06:30.767262 master-0 kubenswrapper[18707]: I0320 09:06:30.762889 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b5b845b79-xn624" 
event={"ID":"28a3f9e1-5276-45c1-b6f3-94d2f09a223e","Type":"ContainerDied","Data":"1e0784ea8f392a1d8a0a9db26e7fdf9ef30dc3a25a506f0f23d2aa1de0dfb6ec"} Mar 20 09:06:30.767262 master-0 kubenswrapper[18707]: I0320 09:06:30.762959 18707 scope.go:117] "RemoveContainer" containerID="e6b01f4b941175aade692b3f3edd18d0394a369525031c12bfced0cdaa0250f0" Mar 20 09:06:30.767262 master-0 kubenswrapper[18707]: I0320 09:06:30.763115 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b5b845b79-xn624" Mar 20 09:06:30.819543 master-0 kubenswrapper[18707]: I0320 09:06:30.818424 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" event={"ID":"90e3df0d-6e71-463d-9816-15b12a376333","Type":"ContainerStarted","Data":"e5809075a4e1990ebd15757f73b56351a1a78b444092637d0223fa4594857d85"} Mar 20 09:06:30.831318 master-0 kubenswrapper[18707]: I0320 09:06:30.831176 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-external-api-0" event={"ID":"dbd0581a-055a-4728-8979-e0cfd24dd897","Type":"ContainerStarted","Data":"3ed72e9fb703d6eef89fbb39d6abf266470f90b00500eaccc685f161e32630ce"} Mar 20 09:06:30.856360 master-0 kubenswrapper[18707]: I0320 09:06:30.856283 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8jqss" event={"ID":"01502368-38fe-40cc-9325-a7b83996fea1","Type":"ContainerStarted","Data":"a07bd35b6065a17b0625937319043507f44d30ff119dcba2e53672991382ccf2"} Mar 20 09:06:30.887345 master-0 kubenswrapper[18707]: I0320 09:06:30.887298 18707 scope.go:117] "RemoveContainer" containerID="b4c86c561b572525f81462d1f7a1709baa7ff16c4915810b9218e95aaa207747" Mar 20 09:06:30.890665 master-0 kubenswrapper[18707]: I0320 09:06:30.889157 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-internal-api-0" 
event={"ID":"dcee9936-db51-42d9-ada7-279cec63d27e","Type":"ContainerStarted","Data":"7ec36585b68818943f960e71df365caced8449fdda07a25674d1a668effffffd"} Mar 20 09:06:30.896302 master-0 kubenswrapper[18707]: I0320 09:06:30.896240 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x7bjf"] Mar 20 09:06:30.981765 master-0 kubenswrapper[18707]: I0320 09:06:30.980501 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b5b845b79-xn624"] Mar 20 09:06:30.999266 master-0 kubenswrapper[18707]: I0320 09:06:30.999203 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b5b845b79-xn624"] Mar 20 09:06:31.002368 master-0 kubenswrapper[18707]: I0320 09:06:31.002276 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-8jqss" podStartSLOduration=2.953349903 podStartE2EDuration="23.00224787s" podCreationTimestamp="2026-03-20 09:06:08 +0000 UTC" firstStartedPulling="2026-03-20 09:06:09.675484627 +0000 UTC m=+1514.831664983" lastFinishedPulling="2026-03-20 09:06:29.724382594 +0000 UTC m=+1534.880562950" observedRunningTime="2026-03-20 09:06:30.965414727 +0000 UTC m=+1536.121595083" watchObservedRunningTime="2026-03-20 09:06:31.00224787 +0000 UTC m=+1536.158428236" Mar 20 09:06:31.128990 master-0 kubenswrapper[18707]: I0320 09:06:31.128818 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28a3f9e1-5276-45c1-b6f3-94d2f09a223e" path="/var/lib/kubelet/pods/28a3f9e1-5276-45c1-b6f3-94d2f09a223e/volumes" Mar 20 09:06:31.132317 master-0 kubenswrapper[18707]: I0320 09:06:31.132284 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b" path="/var/lib/kubelet/pods/c5ef3cdb-d9d8-4f24-9920-3e21d56dcd6b/volumes" Mar 20 09:06:31.908212 master-0 kubenswrapper[18707]: I0320 09:06:31.907357 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-c920a-db-sync-5sx9b" event={"ID":"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a","Type":"ContainerStarted","Data":"9c1abde5f519d612e85bab52eac717cd2c5a4a1ca596cc149088edc0fbbbd35d"} Mar 20 09:06:31.914877 master-0 kubenswrapper[18707]: I0320 09:06:31.910716 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7bjf" event={"ID":"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6","Type":"ContainerStarted","Data":"caf385a86308743d6ea997839039856b2382a85322e22d2510752a140219e721"} Mar 20 09:06:31.914877 master-0 kubenswrapper[18707]: I0320 09:06:31.910792 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7bjf" event={"ID":"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6","Type":"ContainerStarted","Data":"b9006497ff3e14acf221d03993381f2283c12f51ee4484efe5a9cd880b17b880"} Mar 20 09:06:31.920212 master-0 kubenswrapper[18707]: I0320 09:06:31.915740 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-external-api-0" event={"ID":"dbd0581a-055a-4728-8979-e0cfd24dd897","Type":"ContainerStarted","Data":"07e92ed41c0a2abbb89f39a0f628c94ad2999142ab7519ad10f78eb258075752"} Mar 20 09:06:31.920212 master-0 kubenswrapper[18707]: I0320 09:06:31.918912 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-internal-api-0" event={"ID":"dcee9936-db51-42d9-ada7-279cec63d27e","Type":"ContainerStarted","Data":"188e328723a831b8c019606c55e41bb77c059e030357cb18f0d149f0569d2036"} Mar 20 09:06:32.273982 master-0 kubenswrapper[18707]: I0320 09:06:32.273195 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-c920a-db-sync-5sx9b" podStartSLOduration=4.662365564 podStartE2EDuration="25.273150507s" podCreationTimestamp="2026-03-20 09:06:07 +0000 UTC" firstStartedPulling="2026-03-20 09:06:09.264637227 +0000 UTC m=+1514.420817583" lastFinishedPulling="2026-03-20 09:06:29.87542217 +0000 UTC m=+1535.031602526" 
observedRunningTime="2026-03-20 09:06:32.251169829 +0000 UTC m=+1537.407350185" watchObservedRunningTime="2026-03-20 09:06:32.273150507 +0000 UTC m=+1537.429330873" Mar 20 09:06:32.831222 master-0 kubenswrapper[18707]: I0320 09:06:32.828269 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-x7bjf" podStartSLOduration=3.82824142 podStartE2EDuration="3.82824142s" podCreationTimestamp="2026-03-20 09:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:32.819677105 +0000 UTC m=+1537.975857461" watchObservedRunningTime="2026-03-20 09:06:32.82824142 +0000 UTC m=+1537.984421806" Mar 20 09:06:32.928902 master-0 kubenswrapper[18707]: I0320 09:06:32.928807 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-internal-api-0" event={"ID":"dcee9936-db51-42d9-ada7-279cec63d27e","Type":"ContainerStarted","Data":"0b3e345b6cd271ceb5c5d53750e1102868ebb3da5bfa88101d7a166c578666dd"} Mar 20 09:06:32.930826 master-0 kubenswrapper[18707]: I0320 09:06:32.930789 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-external-api-0" event={"ID":"dbd0581a-055a-4728-8979-e0cfd24dd897","Type":"ContainerStarted","Data":"52a8b1a6a8e5d9b43c350b60899b2648676b53d28fa03fd9d17da7a7480b7c39"} Mar 20 09:06:33.725541 master-0 kubenswrapper[18707]: I0320 09:06:33.725432 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-27086-default-external-api-0" podStartSLOduration=20.725408918 podStartE2EDuration="20.725408918s" podCreationTimestamp="2026-03-20 09:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:33.704761138 +0000 UTC m=+1538.860941534" watchObservedRunningTime="2026-03-20 09:06:33.725408918 +0000 UTC 
m=+1538.881589264" Mar 20 09:06:34.159281 master-0 kubenswrapper[18707]: I0320 09:06:34.159179 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b5b845b79-xn624" podUID="28a3f9e1-5276-45c1-b6f3-94d2f09a223e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.203:5353: i/o timeout" Mar 20 09:06:35.189213 master-0 kubenswrapper[18707]: I0320 09:06:35.188488 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-27086-default-internal-api-0" podStartSLOduration=20.188464745 podStartE2EDuration="20.188464745s" podCreationTimestamp="2026-03-20 09:06:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:35.170017738 +0000 UTC m=+1540.326198094" watchObservedRunningTime="2026-03-20 09:06:35.188464745 +0000 UTC m=+1540.344645101" Mar 20 09:06:36.000733 master-0 kubenswrapper[18707]: I0320 09:06:36.000633 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:36.000733 master-0 kubenswrapper[18707]: I0320 09:06:36.000720 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:36.043223 master-0 kubenswrapper[18707]: I0320 09:06:36.043110 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:36.080014 master-0 kubenswrapper[18707]: I0320 09:06:36.079940 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:36.839367 master-0 kubenswrapper[18707]: I0320 09:06:36.839291 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:36.839367 master-0 
kubenswrapper[18707]: I0320 09:06:36.839364 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:36.875113 master-0 kubenswrapper[18707]: I0320 09:06:36.875047 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:36.883177 master-0 kubenswrapper[18707]: I0320 09:06:36.883121 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:36.984245 master-0 kubenswrapper[18707]: I0320 09:06:36.982995 18707 generic.go:334] "Generic (PLEG): container finished" podID="01502368-38fe-40cc-9325-a7b83996fea1" containerID="a07bd35b6065a17b0625937319043507f44d30ff119dcba2e53672991382ccf2" exitCode=0 Mar 20 09:06:36.984245 master-0 kubenswrapper[18707]: I0320 09:06:36.983327 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8jqss" event={"ID":"01502368-38fe-40cc-9325-a7b83996fea1","Type":"ContainerDied","Data":"a07bd35b6065a17b0625937319043507f44d30ff119dcba2e53672991382ccf2"} Mar 20 09:06:36.984245 master-0 kubenswrapper[18707]: I0320 09:06:36.983762 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:36.984245 master-0 kubenswrapper[18707]: I0320 09:06:36.983807 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:06:36.984245 master-0 kubenswrapper[18707]: I0320 09:06:36.983823 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:36.984245 master-0 kubenswrapper[18707]: I0320 09:06:36.983834 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:06:37.663269 master-0 
kubenswrapper[18707]: E0320 09:06:37.657724 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice/crio-ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e\": RecentStats: unable to find data in memory cache]" Mar 20 09:06:37.663269 master-0 kubenswrapper[18707]: E0320 09:06:37.657836 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice/crio-ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice\": RecentStats: unable to find data in memory cache]" Mar 20 09:06:37.663269 master-0 kubenswrapper[18707]: E0320 09:06:37.658113 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice/crio-ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e\": RecentStats: unable to find data in memory cache]" Mar 20 09:06:38.006443 master-0 kubenswrapper[18707]: I0320 09:06:38.006285 18707 generic.go:334] "Generic (PLEG): container finished" podID="90e3df0d-6e71-463d-9816-15b12a376333" containerID="e5809075a4e1990ebd15757f73b56351a1a78b444092637d0223fa4594857d85" 
exitCode=0 Mar 20 09:06:38.006443 master-0 kubenswrapper[18707]: I0320 09:06:38.006381 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" event={"ID":"90e3df0d-6e71-463d-9816-15b12a376333","Type":"ContainerDied","Data":"e5809075a4e1990ebd15757f73b56351a1a78b444092637d0223fa4594857d85"} Mar 20 09:06:38.012672 master-0 kubenswrapper[18707]: I0320 09:06:38.012612 18707 generic.go:334] "Generic (PLEG): container finished" podID="c177a9fe-76d7-4325-8968-c7178ad8c75a" containerID="4b624e139d528f49877f9a2c1de9dba1b5ca7a444de67d80f04ec0832af4b4a9" exitCode=0 Mar 20 09:06:38.013118 master-0 kubenswrapper[18707]: I0320 09:06:38.012855 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" event={"ID":"c177a9fe-76d7-4325-8968-c7178ad8c75a","Type":"ContainerDied","Data":"4b624e139d528f49877f9a2c1de9dba1b5ca7a444de67d80f04ec0832af4b4a9"} Mar 20 09:06:38.653271 master-0 kubenswrapper[18707]: I0320 09:06:38.653216 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:38.949287 master-0 kubenswrapper[18707]: I0320 09:06:38.949129 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-config-data\") pod \"01502368-38fe-40cc-9325-a7b83996fea1\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " Mar 20 09:06:38.949287 master-0 kubenswrapper[18707]: I0320 09:06:38.949256 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdt7l\" (UniqueName: \"kubernetes.io/projected/01502368-38fe-40cc-9325-a7b83996fea1-kube-api-access-jdt7l\") pod \"01502368-38fe-40cc-9325-a7b83996fea1\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " Mar 20 09:06:38.949516 master-0 kubenswrapper[18707]: I0320 09:06:38.949352 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-scripts\") pod \"01502368-38fe-40cc-9325-a7b83996fea1\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " Mar 20 09:06:38.949516 master-0 kubenswrapper[18707]: I0320 09:06:38.949447 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-combined-ca-bundle\") pod \"01502368-38fe-40cc-9325-a7b83996fea1\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " Mar 20 09:06:38.949516 master-0 kubenswrapper[18707]: I0320 09:06:38.949477 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01502368-38fe-40cc-9325-a7b83996fea1-logs\") pod \"01502368-38fe-40cc-9325-a7b83996fea1\" (UID: \"01502368-38fe-40cc-9325-a7b83996fea1\") " Mar 20 09:06:38.950925 master-0 kubenswrapper[18707]: I0320 09:06:38.950871 18707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01502368-38fe-40cc-9325-a7b83996fea1-logs" (OuterVolumeSpecName: "logs") pod "01502368-38fe-40cc-9325-a7b83996fea1" (UID: "01502368-38fe-40cc-9325-a7b83996fea1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:06:38.952587 master-0 kubenswrapper[18707]: I0320 09:06:38.952539 18707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01502368-38fe-40cc-9325-a7b83996fea1-logs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:38.979871 master-0 kubenswrapper[18707]: I0320 09:06:38.979805 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-scripts" (OuterVolumeSpecName: "scripts") pod "01502368-38fe-40cc-9325-a7b83996fea1" (UID: "01502368-38fe-40cc-9325-a7b83996fea1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:38.984176 master-0 kubenswrapper[18707]: I0320 09:06:38.980415 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01502368-38fe-40cc-9325-a7b83996fea1-kube-api-access-jdt7l" (OuterVolumeSpecName: "kube-api-access-jdt7l") pod "01502368-38fe-40cc-9325-a7b83996fea1" (UID: "01502368-38fe-40cc-9325-a7b83996fea1"). InnerVolumeSpecName "kube-api-access-jdt7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:39.011308 master-0 kubenswrapper[18707]: I0320 09:06:39.011232 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-config-data" (OuterVolumeSpecName: "config-data") pod "01502368-38fe-40cc-9325-a7b83996fea1" (UID: "01502368-38fe-40cc-9325-a7b83996fea1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:39.013036 master-0 kubenswrapper[18707]: I0320 09:06:39.012981 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01502368-38fe-40cc-9325-a7b83996fea1" (UID: "01502368-38fe-40cc-9325-a7b83996fea1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:39.036572 master-0 kubenswrapper[18707]: I0320 09:06:39.036494 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-8jqss" event={"ID":"01502368-38fe-40cc-9325-a7b83996fea1","Type":"ContainerDied","Data":"d7c796e585567a87c335c002450b5ee63a8a432c7044e0ba9e36fce09555a231"} Mar 20 09:06:39.036572 master-0 kubenswrapper[18707]: I0320 09:06:39.036571 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7c796e585567a87c335c002450b5ee63a8a432c7044e0ba9e36fce09555a231" Mar 20 09:06:39.036950 master-0 kubenswrapper[18707]: I0320 09:06:39.036597 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-8jqss" Mar 20 09:06:39.054620 master-0 kubenswrapper[18707]: I0320 09:06:39.054534 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:39.054620 master-0 kubenswrapper[18707]: I0320 09:06:39.054616 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdt7l\" (UniqueName: \"kubernetes.io/projected/01502368-38fe-40cc-9325-a7b83996fea1-kube-api-access-jdt7l\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:39.054620 master-0 kubenswrapper[18707]: I0320 09:06:39.054632 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:39.054948 master-0 kubenswrapper[18707]: I0320 09:06:39.054647 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01502368-38fe-40cc-9325-a7b83996fea1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:40.773150 master-0 kubenswrapper[18707]: I0320 09:06:40.772950 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8b47bf878-2ftwn"] Mar 20 09:06:40.777215 master-0 kubenswrapper[18707]: E0320 09:06:40.774083 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a3f9e1-5276-45c1-b6f3-94d2f09a223e" containerName="init" Mar 20 09:06:40.777215 master-0 kubenswrapper[18707]: I0320 09:06:40.774106 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a3f9e1-5276-45c1-b6f3-94d2f09a223e" containerName="init" Mar 20 09:06:40.777215 master-0 kubenswrapper[18707]: E0320 09:06:40.774120 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01502368-38fe-40cc-9325-a7b83996fea1" 
containerName="placement-db-sync" Mar 20 09:06:40.777215 master-0 kubenswrapper[18707]: I0320 09:06:40.774126 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="01502368-38fe-40cc-9325-a7b83996fea1" containerName="placement-db-sync" Mar 20 09:06:40.777215 master-0 kubenswrapper[18707]: E0320 09:06:40.774179 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a3f9e1-5276-45c1-b6f3-94d2f09a223e" containerName="dnsmasq-dns" Mar 20 09:06:40.777215 master-0 kubenswrapper[18707]: I0320 09:06:40.774201 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a3f9e1-5276-45c1-b6f3-94d2f09a223e" containerName="dnsmasq-dns" Mar 20 09:06:40.777215 master-0 kubenswrapper[18707]: I0320 09:06:40.774447 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="01502368-38fe-40cc-9325-a7b83996fea1" containerName="placement-db-sync" Mar 20 09:06:40.777215 master-0 kubenswrapper[18707]: I0320 09:06:40.774459 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a3f9e1-5276-45c1-b6f3-94d2f09a223e" containerName="dnsmasq-dns" Mar 20 09:06:40.783213 master-0 kubenswrapper[18707]: I0320 09:06:40.782303 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:06:40.785697 master-0 kubenswrapper[18707]: I0320 09:06:40.784688 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 09:06:40.785697 master-0 kubenswrapper[18707]: I0320 09:06:40.784990 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 09:06:40.785697 master-0 kubenswrapper[18707]: I0320 09:06:40.785659 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 09:06:40.789954 master-0 kubenswrapper[18707]: I0320 09:06:40.789913 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 09:06:40.902635 master-0 kubenswrapper[18707]: I0320 09:06:40.902549 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8b47bf878-2ftwn"] Mar 20 09:06:40.946013 master-0 kubenswrapper[18707]: I0320 09:06:40.944810 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-scripts\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:06:40.946013 master-0 kubenswrapper[18707]: I0320 09:06:40.945099 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-public-tls-certs\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:06:40.947092 master-0 kubenswrapper[18707]: I0320 09:06:40.946018 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2dafeade-6a38-4134-ac1c-7331ea038aec-logs\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:06:40.947092 master-0 kubenswrapper[18707]: I0320 09:06:40.946252 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-internal-tls-certs\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:06:40.947092 master-0 kubenswrapper[18707]: I0320 09:06:40.946293 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m99fb\" (UniqueName: \"kubernetes.io/projected/2dafeade-6a38-4134-ac1c-7331ea038aec-kube-api-access-m99fb\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:06:40.947092 master-0 kubenswrapper[18707]: I0320 09:06:40.946504 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-config-data\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:06:40.947092 master-0 kubenswrapper[18707]: I0320 09:06:40.946552 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-combined-ca-bundle\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:06:41.052290 master-0 kubenswrapper[18707]: I0320 09:06:41.051877 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dafeade-6a38-4134-ac1c-7331ea038aec-logs\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:06:41.052290 master-0 kubenswrapper[18707]: I0320 09:06:41.051966 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-internal-tls-certs\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:06:41.052290 master-0 kubenswrapper[18707]: I0320 09:06:41.052010 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m99fb\" (UniqueName: \"kubernetes.io/projected/2dafeade-6a38-4134-ac1c-7331ea038aec-kube-api-access-m99fb\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:06:41.052290 master-0 kubenswrapper[18707]: I0320 09:06:41.052138 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-config-data\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:06:41.052290 master-0 kubenswrapper[18707]: I0320 09:06:41.052212 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-combined-ca-bundle\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:06:41.052290 master-0 kubenswrapper[18707]: I0320 09:06:41.052256 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-scripts\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn"
Mar 20 09:06:41.052780 master-0 kubenswrapper[18707]: I0320 09:06:41.052422 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-public-tls-certs\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn"
Mar 20 09:06:41.058232 master-0 kubenswrapper[18707]: I0320 09:06:41.054881 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dafeade-6a38-4134-ac1c-7331ea038aec-logs\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn"
Mar 20 09:06:41.065920 master-0 kubenswrapper[18707]: I0320 09:06:41.065369 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-internal-tls-certs\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn"
Mar 20 09:06:41.065920 master-0 kubenswrapper[18707]: I0320 09:06:41.065596 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-config-data\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn"
Mar 20 09:06:41.066230 master-0 kubenswrapper[18707]: I0320 09:06:41.065951 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-combined-ca-bundle\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn"
Mar 20 09:06:41.071214 master-0 kubenswrapper[18707]: I0320 09:06:41.068427 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-scripts\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn"
Mar 20 09:06:41.083225 master-0 kubenswrapper[18707]: I0320 09:06:41.079065 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-public-tls-certs\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn"
Mar 20 09:06:41.083225 master-0 kubenswrapper[18707]: I0320 09:06:41.082960 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m99fb\" (UniqueName: \"kubernetes.io/projected/2dafeade-6a38-4134-ac1c-7331ea038aec-kube-api-access-m99fb\") pod \"placement-8b47bf878-2ftwn\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " pod="openstack/placement-8b47bf878-2ftwn"
Mar 20 09:06:41.167212 master-0 kubenswrapper[18707]: I0320 09:06:41.166499 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8b47bf878-2ftwn"
Mar 20 09:06:41.392867 master-0 kubenswrapper[18707]: E0320 09:06:41.389336 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice/crio-ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e\": RecentStats: unable to find data in memory cache]"
Mar 20 09:06:41.768529 master-0 kubenswrapper[18707]: I0320 09:06:41.768485 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8b47bf878-2ftwn"]
Mar 20 09:06:42.179139 master-0 kubenswrapper[18707]: I0320 09:06:42.178761 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8b47bf878-2ftwn" event={"ID":"2dafeade-6a38-4134-ac1c-7331ea038aec","Type":"ContainerStarted","Data":"3a775266802c1a940d94e0ee69c727654606b52f10dd599b4ae5f931674df38d"}
Mar 20 09:06:43.192822 master-0 kubenswrapper[18707]: I0320 09:06:43.192713 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8b47bf878-2ftwn" event={"ID":"2dafeade-6a38-4134-ac1c-7331ea038aec","Type":"ContainerStarted","Data":"1dda1c268f511fb2abda022887e5169f2df07d603ed922eb6f445725a0b163af"}
Mar 20 09:06:43.192822 master-0 kubenswrapper[18707]: I0320 09:06:43.192776 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8b47bf878-2ftwn" event={"ID":"2dafeade-6a38-4134-ac1c-7331ea038aec","Type":"ContainerStarted","Data":"89b2d0395c21db068f91d1a8531bc530fddf5e8897d2499384549e1ac365ed68"}
Mar 20 09:06:43.194443 master-0 kubenswrapper[18707]: I0320 09:06:43.193017 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8b47bf878-2ftwn"
Mar 20 09:06:43.365174 master-0 kubenswrapper[18707]: I0320 09:06:43.364081 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8b47bf878-2ftwn" podStartSLOduration=3.364060992 podStartE2EDuration="3.364060992s" podCreationTimestamp="2026-03-20 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:43.355777875 +0000 UTC m=+1548.511958241" watchObservedRunningTime="2026-03-20 09:06:43.364060992 +0000 UTC m=+1548.520241348"
Mar 20 09:06:43.446212 master-0 kubenswrapper[18707]: I0320 09:06:43.445485 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:43.448093 master-0 kubenswrapper[18707]: I0320 09:06:43.448059 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-27086-default-external-api-0"
Mar 20 09:06:43.451326 master-0 kubenswrapper[18707]: I0320 09:06:43.451293 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-27086-default-internal-api-0"
Mar 20 09:06:43.480204 master-0 kubenswrapper[18707]: I0320 09:06:43.479460 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-27086-default-external-api-0"
Mar 20 09:06:44.211665 master-0 kubenswrapper[18707]: I0320 09:06:44.211211 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8b47bf878-2ftwn"
Mar 20 09:06:45.225634 master-0 kubenswrapper[18707]: I0320 09:06:45.225566 18707 generic.go:334] "Generic (PLEG): container finished" podID="eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6" containerID="caf385a86308743d6ea997839039856b2382a85322e22d2510752a140219e721" exitCode=0
Mar 20 09:06:45.226312 master-0 kubenswrapper[18707]: I0320 09:06:45.226063 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7bjf" event={"ID":"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6","Type":"ContainerDied","Data":"caf385a86308743d6ea997839039856b2382a85322e22d2510752a140219e721"}
Mar 20 09:06:47.992300 master-0 kubenswrapper[18707]: E0320 09:06:47.992223 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52457bb1_e081_4589_bbe2_3aaadfb92b31.slice/crio-ae51d12f8922637aab7b4445f260664c311d5c0f9de93fef49cda0fcae20c42e\": RecentStats: unable to find data in memory cache]"
Mar 20 09:06:49.271225 master-0 kubenswrapper[18707]: I0320 09:06:49.271141 18707 generic.go:334] "Generic (PLEG): container finished" podID="d739ce18-cf6c-4eeb-a609-7ab1acac00d2" containerID="f8f53fce1724ee6b0341e8dfcfa88d2d2109f71ae5fa72bfb98e12e6f054c67a" exitCode=0
Mar 20 09:06:49.272552 master-0 kubenswrapper[18707]: I0320 09:06:49.271281 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q6j55" event={"ID":"d739ce18-cf6c-4eeb-a609-7ab1acac00d2","Type":"ContainerDied","Data":"f8f53fce1724ee6b0341e8dfcfa88d2d2109f71ae5fa72bfb98e12e6f054c67a"}
Mar 20 09:06:49.277164 master-0 kubenswrapper[18707]: I0320 09:06:49.277109 18707 generic.go:334] "Generic (PLEG): container finished" podID="7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a" containerID="9c1abde5f519d612e85bab52eac717cd2c5a4a1ca596cc149088edc0fbbbd35d" exitCode=0
Mar 20 09:06:49.277164 master-0 kubenswrapper[18707]: I0320 09:06:49.277150 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-db-sync-5sx9b" event={"ID":"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a","Type":"ContainerDied","Data":"9c1abde5f519d612e85bab52eac717cd2c5a4a1ca596cc149088edc0fbbbd35d"}
Mar 20 09:06:49.279783 master-0 kubenswrapper[18707]: I0320 09:06:49.279719 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x7bjf" event={"ID":"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6","Type":"ContainerDied","Data":"b9006497ff3e14acf221d03993381f2283c12f51ee4484efe5a9cd880b17b880"}
Mar 20 09:06:49.279783 master-0 kubenswrapper[18707]: I0320 09:06:49.279781 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9006497ff3e14acf221d03993381f2283c12f51ee4484efe5a9cd880b17b880"
Mar 20 09:06:49.331585 master-0 kubenswrapper[18707]: I0320 09:06:49.331536 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x7bjf"
Mar 20 09:06:49.474960 master-0 kubenswrapper[18707]: I0320 09:06:49.421910 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-scripts\") pod \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") "
Mar 20 09:06:49.474960 master-0 kubenswrapper[18707]: I0320 09:06:49.421977 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-config-data\") pod \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") "
Mar 20 09:06:49.474960 master-0 kubenswrapper[18707]: I0320 09:06:49.422020 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-242s4\" (UniqueName: \"kubernetes.io/projected/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-kube-api-access-242s4\") pod \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") "
Mar 20 09:06:49.474960 master-0 kubenswrapper[18707]: I0320 09:06:49.422081 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-fernet-keys\") pod \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") "
Mar 20 09:06:49.474960 master-0 kubenswrapper[18707]: I0320 09:06:49.422102 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-combined-ca-bundle\") pod \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") "
Mar 20 09:06:49.474960 master-0 kubenswrapper[18707]: I0320 09:06:49.422134 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-credential-keys\") pod \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\" (UID: \"eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6\") "
Mar 20 09:06:49.476076 master-0 kubenswrapper[18707]: I0320 09:06:49.476027 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6" (UID: "eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:49.478555 master-0 kubenswrapper[18707]: I0320 09:06:49.478505 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-scripts" (OuterVolumeSpecName: "scripts") pod "eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6" (UID: "eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:49.480577 master-0 kubenswrapper[18707]: I0320 09:06:49.480087 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6" (UID: "eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:49.498478 master-0 kubenswrapper[18707]: I0320 09:06:49.498417 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-kube-api-access-242s4" (OuterVolumeSpecName: "kube-api-access-242s4") pod "eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6" (UID: "eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6"). InnerVolumeSpecName "kube-api-access-242s4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:06:49.526061 master-0 kubenswrapper[18707]: I0320 09:06:49.526018 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-242s4\" (UniqueName: \"kubernetes.io/projected/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-kube-api-access-242s4\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:49.527206 master-0 kubenswrapper[18707]: I0320 09:06:49.527177 18707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-fernet-keys\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:49.527373 master-0 kubenswrapper[18707]: I0320 09:06:49.527308 18707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-credential-keys\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:49.527373 master-0 kubenswrapper[18707]: I0320 09:06:49.527323 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-scripts\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:49.551314 master-0 kubenswrapper[18707]: I0320 09:06:49.551241 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6" (UID: "eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:49.552361 master-0 kubenswrapper[18707]: I0320 09:06:49.552261 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-config-data" (OuterVolumeSpecName: "config-data") pod "eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6" (UID: "eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:49.629746 master-0 kubenswrapper[18707]: I0320 09:06:49.629623 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-config-data\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:49.629746 master-0 kubenswrapper[18707]: I0320 09:06:49.629676 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:50.292834 master-0 kubenswrapper[18707]: I0320 09:06:50.292765 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" event={"ID":"90e3df0d-6e71-463d-9816-15b12a376333","Type":"ContainerStarted","Data":"b2f1fc7b0a9182c031a1517463f8b12b39da59b394d4a30eb45bafb47e23dc7f"}
Mar 20 09:06:50.299758 master-0 kubenswrapper[18707]: I0320 09:06:50.299686 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x7bjf"
Mar 20 09:06:50.300810 master-0 kubenswrapper[18707]: I0320 09:06:50.300743 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" event={"ID":"c177a9fe-76d7-4325-8968-c7178ad8c75a","Type":"ContainerStarted","Data":"8773a8b62968e3019505e9621087d14f1b18b88064c100244f3fa1ae81fc5856"}
Mar 20 09:06:50.348476 master-0 kubenswrapper[18707]: I0320 09:06:50.348311 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j" podStartSLOduration=3.306426393 podStartE2EDuration="1m2.348292013s" podCreationTimestamp="2026-03-20 09:05:48 +0000 UTC" firstStartedPulling="2026-03-20 09:05:50.214214412 +0000 UTC m=+1495.370394768" lastFinishedPulling="2026-03-20 09:06:49.256079992 +0000 UTC m=+1554.412260388" observedRunningTime="2026-03-20 09:06:50.345994397 +0000 UTC m=+1555.502174753" watchObservedRunningTime="2026-03-20 09:06:50.348292013 +0000 UTC m=+1555.504472369"
Mar 20 09:06:50.392597 master-0 kubenswrapper[18707]: I0320 09:06:50.392235 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" podStartSLOduration=2.520273001 podStartE2EDuration="1m0.392155426s" podCreationTimestamp="2026-03-20 09:05:50 +0000 UTC" firstStartedPulling="2026-03-20 09:05:51.43507025 +0000 UTC m=+1496.591250606" lastFinishedPulling="2026-03-20 09:06:49.306952665 +0000 UTC m=+1554.463133031" observedRunningTime="2026-03-20 09:06:50.389386877 +0000 UTC m=+1555.545567233" watchObservedRunningTime="2026-03-20 09:06:50.392155426 +0000 UTC m=+1555.548335782"
Mar 20 09:06:50.621954 master-0 kubenswrapper[18707]: I0320 09:06:50.619303 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5987d8f9fd-q4m4b"]
Mar 20 09:06:50.621954 master-0 kubenswrapper[18707]: E0320 09:06:50.620068 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6" containerName="keystone-bootstrap"
Mar 20 09:06:50.621954 master-0 kubenswrapper[18707]: I0320 09:06:50.620089 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6" containerName="keystone-bootstrap"
Mar 20 09:06:50.621954 master-0 kubenswrapper[18707]: I0320 09:06:50.620474 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6" containerName="keystone-bootstrap"
Mar 20 09:06:50.621954 master-0 kubenswrapper[18707]: I0320 09:06:50.621743 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.627999 master-0 kubenswrapper[18707]: I0320 09:06:50.627697 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 09:06:50.627999 master-0 kubenswrapper[18707]: I0320 09:06:50.628000 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 09:06:50.628253 master-0 kubenswrapper[18707]: I0320 09:06:50.628208 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 09:06:50.633810 master-0 kubenswrapper[18707]: I0320 09:06:50.632448 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 20 09:06:50.634774 master-0 kubenswrapper[18707]: I0320 09:06:50.634350 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 20 09:06:50.660422 master-0 kubenswrapper[18707]: I0320 09:06:50.660344 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-scripts\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.660895 master-0 kubenswrapper[18707]: I0320 09:06:50.660785 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-combined-ca-bundle\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.661052 master-0 kubenswrapper[18707]: I0320 09:06:50.661010 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2qtn\" (UniqueName: \"kubernetes.io/projected/60a95766-91c5-4eff-bb31-ade533ae6a4a-kube-api-access-m2qtn\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.661204 master-0 kubenswrapper[18707]: I0320 09:06:50.661167 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-public-tls-certs\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.661362 master-0 kubenswrapper[18707]: I0320 09:06:50.661313 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-credential-keys\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.661522 master-0 kubenswrapper[18707]: I0320 09:06:50.661467 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-config-data\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.661720 master-0 kubenswrapper[18707]: I0320 09:06:50.661577 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-internal-tls-certs\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.661720 master-0 kubenswrapper[18707]: I0320 09:06:50.661606 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-fernet-keys\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.701984 master-0 kubenswrapper[18707]: I0320 09:06:50.701890 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5987d8f9fd-q4m4b"]
Mar 20 09:06:50.744296 master-0 kubenswrapper[18707]: I0320 09:06:50.744205 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c920a-db-sync-5sx9b"
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.762755 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm786\" (UniqueName: \"kubernetes.io/projected/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-kube-api-access-cm786\") pod \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") "
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.762946 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-combined-ca-bundle\") pod \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") "
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.762994 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-db-sync-config-data\") pod \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") "
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.763019 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-etc-machine-id\") pod \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") "
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.763070 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-scripts\") pod \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") "
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.763155 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-config-data\") pod \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\" (UID: \"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a\") "
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.763428 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2qtn\" (UniqueName: \"kubernetes.io/projected/60a95766-91c5-4eff-bb31-ade533ae6a4a-kube-api-access-m2qtn\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.763454 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a" (UID: "7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.763497 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-public-tls-certs\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.763538 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-credential-keys\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.763570 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-config-data\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.763605 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-internal-tls-certs\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.763630 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-fernet-keys\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.763725 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-scripts\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.763820 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-combined-ca-bundle\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.765093 master-0 kubenswrapper[18707]: I0320 09:06:50.763913 18707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:50.771390 master-0 kubenswrapper[18707]: I0320 09:06:50.766986 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a" (UID: "7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:50.771390 master-0 kubenswrapper[18707]: I0320 09:06:50.767245 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-kube-api-access-cm786" (OuterVolumeSpecName: "kube-api-access-cm786") pod "7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a" (UID: "7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a"). InnerVolumeSpecName "kube-api-access-cm786". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:06:50.771390 master-0 kubenswrapper[18707]: I0320 09:06:50.767753 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-combined-ca-bundle\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.771390 master-0 kubenswrapper[18707]: I0320 09:06:50.770343 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-credential-keys\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.771390 master-0 kubenswrapper[18707]: I0320 09:06:50.771359 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-config-data\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.773438 master-0 kubenswrapper[18707]: I0320 09:06:50.773379 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-public-tls-certs\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.774985 master-0 kubenswrapper[18707]: I0320 09:06:50.774762 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-internal-tls-certs\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.775656 master-0 kubenswrapper[18707]: I0320 09:06:50.775620 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-scripts" (OuterVolumeSpecName: "scripts") pod "7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a" (UID: "7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:50.775863 master-0 kubenswrapper[18707]: I0320 09:06:50.775821 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-fernet-keys\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.775921 master-0 kubenswrapper[18707]: I0320 09:06:50.775903 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60a95766-91c5-4eff-bb31-ade533ae6a4a-scripts\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.801552 master-0 kubenswrapper[18707]: I0320 09:06:50.801336 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2qtn\" (UniqueName: \"kubernetes.io/projected/60a95766-91c5-4eff-bb31-ade533ae6a4a-kube-api-access-m2qtn\") pod \"keystone-5987d8f9fd-q4m4b\" (UID: \"60a95766-91c5-4eff-bb31-ade533ae6a4a\") " pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:50.844882 master-0 kubenswrapper[18707]: I0320 09:06:50.844809 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a" (UID: "7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:50.858528 master-0 kubenswrapper[18707]: I0320 09:06:50.858451 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-config-data" (OuterVolumeSpecName: "config-data") pod "7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a" (UID: "7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:50.866244 master-0 kubenswrapper[18707]: I0320 09:06:50.865924 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-config-data\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:50.866244 master-0 kubenswrapper[18707]: I0320 09:06:50.865976 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm786\" (UniqueName: \"kubernetes.io/projected/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-kube-api-access-cm786\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:50.866244 master-0 kubenswrapper[18707]: I0320 09:06:50.865991 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:50.866244 master-0 kubenswrapper[18707]: I0320 09:06:50.866004 18707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-db-sync-config-data\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:50.866244 master-0 kubenswrapper[18707]: I0320 09:06:50.866015 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a-scripts\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:50.902152 master-0 kubenswrapper[18707]: I0320 09:06:50.900493 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q6j55"
Mar 20 09:06:51.038925 master-0 kubenswrapper[18707]: I0320 09:06:51.038839 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5987d8f9fd-q4m4b"
Mar 20 09:06:51.070506 master-0 kubenswrapper[18707]: I0320 09:06:51.070446 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8697\" (UniqueName: \"kubernetes.io/projected/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-kube-api-access-x8697\") pod \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\" (UID: \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\") "
Mar 20 09:06:51.070630 master-0 kubenswrapper[18707]: I0320 09:06:51.070573 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-combined-ca-bundle\") pod \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\" (UID: \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\") "
Mar 20 09:06:51.071150 master-0 kubenswrapper[18707]: I0320 09:06:51.070772 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-config\") pod \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\" (UID: \"d739ce18-cf6c-4eeb-a609-7ab1acac00d2\") "
Mar 20 09:06:51.074072 master-0 kubenswrapper[18707]: I0320 09:06:51.073647 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-kube-api-access-x8697" (OuterVolumeSpecName: "kube-api-access-x8697") pod "d739ce18-cf6c-4eeb-a609-7ab1acac00d2" (UID: "d739ce18-cf6c-4eeb-a609-7ab1acac00d2"). InnerVolumeSpecName "kube-api-access-x8697".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:51.107424 master-0 kubenswrapper[18707]: I0320 09:06:51.107373 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-config" (OuterVolumeSpecName: "config") pod "d739ce18-cf6c-4eeb-a609-7ab1acac00d2" (UID: "d739ce18-cf6c-4eeb-a609-7ab1acac00d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:51.117411 master-0 kubenswrapper[18707]: I0320 09:06:51.117080 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d739ce18-cf6c-4eeb-a609-7ab1acac00d2" (UID: "d739ce18-cf6c-4eeb-a609-7ab1acac00d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:51.179269 master-0 kubenswrapper[18707]: I0320 09:06:51.173221 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:51.179269 master-0 kubenswrapper[18707]: I0320 09:06:51.173265 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8697\" (UniqueName: \"kubernetes.io/projected/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-kube-api-access-x8697\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:51.179269 master-0 kubenswrapper[18707]: I0320 09:06:51.173279 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d739ce18-cf6c-4eeb-a609-7ab1acac00d2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:06:51.329486 master-0 kubenswrapper[18707]: I0320 09:06:51.329421 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-q6j55" 
event={"ID":"d739ce18-cf6c-4eeb-a609-7ab1acac00d2","Type":"ContainerDied","Data":"756e8e4d38a2c55a80de8ad3850f0273efdbef21ea7b88d617e793e0fb3de942"} Mar 20 09:06:51.329486 master-0 kubenswrapper[18707]: I0320 09:06:51.329478 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="756e8e4d38a2c55a80de8ad3850f0273efdbef21ea7b88d617e793e0fb3de942" Mar 20 09:06:51.330118 master-0 kubenswrapper[18707]: I0320 09:06:51.329541 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-q6j55" Mar 20 09:06:51.338147 master-0 kubenswrapper[18707]: I0320 09:06:51.338066 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-db-sync-5sx9b" event={"ID":"7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a","Type":"ContainerDied","Data":"cf4797b833da7e672e7881bbcc8a3ff2ffe3bd7e56643e6f780811d2dee23c09"} Mar 20 09:06:51.338147 master-0 kubenswrapper[18707]: I0320 09:06:51.338134 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf4797b833da7e672e7881bbcc8a3ff2ffe3bd7e56643e6f780811d2dee23c09" Mar 20 09:06:51.338466 master-0 kubenswrapper[18707]: I0320 09:06:51.338249 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-db-sync-5sx9b" Mar 20 09:06:51.382331 master-0 kubenswrapper[18707]: I0320 09:06:51.381378 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 09:06:51.382331 master-0 kubenswrapper[18707]: I0320 09:06:51.381474 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 09:06:51.392780 master-0 kubenswrapper[18707]: I0320 09:06:51.392697 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr" Mar 20 09:06:51.484533 master-0 kubenswrapper[18707]: I0320 09:06:51.484490 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5987d8f9fd-q4m4b"] Mar 20 09:06:51.724626 master-0 kubenswrapper[18707]: I0320 09:06:51.724505 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bcb7479f5-z74kg"] Mar 20 09:06:51.725371 master-0 kubenswrapper[18707]: E0320 09:06:51.725352 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d739ce18-cf6c-4eeb-a609-7ab1acac00d2" containerName="neutron-db-sync" Mar 20 09:06:51.725472 master-0 kubenswrapper[18707]: I0320 09:06:51.725461 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d739ce18-cf6c-4eeb-a609-7ab1acac00d2" containerName="neutron-db-sync" Mar 20 09:06:51.725545 master-0 kubenswrapper[18707]: E0320 09:06:51.725534 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a" containerName="cinder-c920a-db-sync" Mar 20 09:06:51.725601 master-0 kubenswrapper[18707]: I0320 09:06:51.725591 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a" containerName="cinder-c920a-db-sync" Mar 20 09:06:51.726151 master-0 
kubenswrapper[18707]: I0320 09:06:51.726136 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d739ce18-cf6c-4eeb-a609-7ab1acac00d2" containerName="neutron-db-sync" Mar 20 09:06:51.726249 master-0 kubenswrapper[18707]: I0320 09:06:51.726238 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a" containerName="cinder-c920a-db-sync" Mar 20 09:06:51.739401 master-0 kubenswrapper[18707]: I0320 09:06:51.735034 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.791728 master-0 kubenswrapper[18707]: I0320 09:06:51.791465 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bcb7479f5-z74kg"] Mar 20 09:06:51.815343 master-0 kubenswrapper[18707]: I0320 09:06:51.813845 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-edpm-a\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.815343 master-0 kubenswrapper[18707]: I0320 09:06:51.814090 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-dns-swift-storage-0\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.815343 master-0 kubenswrapper[18707]: I0320 09:06:51.814257 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-ovsdbserver-nb\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " 
pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.815343 master-0 kubenswrapper[18707]: I0320 09:06:51.814368 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm75l\" (UniqueName: \"kubernetes.io/projected/1019781e-d4ae-4e36-8e6d-761b752b1aeb-kube-api-access-jm75l\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.815343 master-0 kubenswrapper[18707]: I0320 09:06:51.814583 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-config\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.815343 master-0 kubenswrapper[18707]: I0320 09:06:51.814645 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-ovsdbserver-sb\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.815343 master-0 kubenswrapper[18707]: I0320 09:06:51.814674 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-dns-svc\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.815343 master-0 kubenswrapper[18707]: I0320 09:06:51.814758 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-edpm-b\") pod 
\"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.917924 master-0 kubenswrapper[18707]: I0320 09:06:51.917862 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-edpm-b\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.918096 master-0 kubenswrapper[18707]: I0320 09:06:51.918021 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-edpm-a\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.918096 master-0 kubenswrapper[18707]: I0320 09:06:51.918058 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-dns-swift-storage-0\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.918167 master-0 kubenswrapper[18707]: I0320 09:06:51.918095 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-ovsdbserver-nb\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.918167 master-0 kubenswrapper[18707]: I0320 09:06:51.918137 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm75l\" (UniqueName: 
\"kubernetes.io/projected/1019781e-d4ae-4e36-8e6d-761b752b1aeb-kube-api-access-jm75l\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.918307 master-0 kubenswrapper[18707]: I0320 09:06:51.918254 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-config\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.918307 master-0 kubenswrapper[18707]: I0320 09:06:51.918289 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-ovsdbserver-sb\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.918390 master-0 kubenswrapper[18707]: I0320 09:06:51.918307 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-dns-svc\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.919385 master-0 kubenswrapper[18707]: I0320 09:06:51.919352 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-dns-svc\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.920371 master-0 kubenswrapper[18707]: I0320 09:06:51.920345 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b\" (UniqueName: 
\"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-edpm-b\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.922377 master-0 kubenswrapper[18707]: I0320 09:06:51.922331 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-config\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.923171 master-0 kubenswrapper[18707]: I0320 09:06:51.923137 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-ovsdbserver-sb\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.924104 master-0 kubenswrapper[18707]: I0320 09:06:51.924052 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-ovsdbserver-nb\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.924285 master-0 kubenswrapper[18707]: I0320 09:06:51.924256 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-edpm-a\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:51.925218 master-0 kubenswrapper[18707]: I0320 09:06:51.925172 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-dns-swift-storage-0\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:52.067888 master-0 kubenswrapper[18707]: I0320 09:06:52.067736 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm75l\" (UniqueName: \"kubernetes.io/projected/1019781e-d4ae-4e36-8e6d-761b752b1aeb-kube-api-access-jm75l\") pod \"dnsmasq-dns-7bcb7479f5-z74kg\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") " pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:52.162408 master-0 kubenswrapper[18707]: I0320 09:06:52.161000 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c920a-volume-lvm-iscsi-0"] Mar 20 09:06:52.173370 master-0 kubenswrapper[18707]: I0320 09:06:52.165450 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.180214 master-0 kubenswrapper[18707]: I0320 09:06:52.176623 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-c920a-config-data" Mar 20 09:06:52.180214 master-0 kubenswrapper[18707]: I0320 09:06:52.176843 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-c920a-scripts" Mar 20 09:06:52.180214 master-0 kubenswrapper[18707]: I0320 09:06:52.177007 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-c920a-volume-lvm-iscsi-config-data" Mar 20 09:06:52.218701 master-0 kubenswrapper[18707]: I0320 09:06:52.215822 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c920a-scheduler-0"] Mar 20 09:06:52.218902 master-0 kubenswrapper[18707]: I0320 09:06:52.218733 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.242899 master-0 kubenswrapper[18707]: I0320 09:06:52.241358 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-scheduler-0"] Mar 20 09:06:52.257541 master-0 kubenswrapper[18707]: I0320 09:06:52.245494 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-c920a-scheduler-config-data" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.261542 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-sys\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.261616 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-config-data\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.261649 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-config-data-custom\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.261863 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-nvme\") pod 
\"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.261927 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-878x5\" (UniqueName: \"kubernetes.io/projected/dd73e9d3-234d-4470-b5a4-9abf382a12d7-kube-api-access-878x5\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262162 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-config-data-custom\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262375 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-combined-ca-bundle\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262412 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-lib-cinder\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262468 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-scripts\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262516 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-config-data\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262538 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-iscsi\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262594 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-lib-modules\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262632 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-run\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262670 18707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-locks-cinder\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262735 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-locks-brick\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262767 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-machine-id\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262789 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-dev\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262841 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-etc-machine-id\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " 
pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.262900 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-scripts\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.263039 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-combined-ca-bundle\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.265385 master-0 kubenswrapper[18707]: I0320 09:06:52.263068 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pz6d\" (UniqueName: \"kubernetes.io/projected/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-kube-api-access-7pz6d\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.302595 master-0 kubenswrapper[18707]: I0320 09:06:52.300244 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-volume-lvm-iscsi-0"] Mar 20 09:06:52.358660 master-0 kubenswrapper[18707]: I0320 09:06:52.358477 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c920a-backup-0"] Mar 20 09:06:52.374542 master-0 kubenswrapper[18707]: I0320 09:06:52.368957 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" Mar 20 09:06:52.374542 master-0 kubenswrapper[18707]: I0320 09:06:52.373599 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-combined-ca-bundle\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.374542 master-0 kubenswrapper[18707]: I0320 09:06:52.373661 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-lib-cinder\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.380211 master-0 kubenswrapper[18707]: I0320 09:06:52.379250 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-lib-cinder\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.381434 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-scripts\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.381540 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-config-data\") pod \"cinder-c920a-scheduler-0\" (UID: 
\"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.381585 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-iscsi\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.381676 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-lib-modules\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.381710 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-run\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.381758 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-locks-cinder\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.381822 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-locks-brick\") pod \"cinder-c920a-volume-lvm-iscsi-0\" 
(UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.381855 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-machine-id\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.381881 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-dev\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.381943 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-lib-modules\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.381964 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-dev\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.381982 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-etc-machine-id\") pod \"cinder-c920a-scheduler-0\" (UID: 
\"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.382037 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-etc-machine-id\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.382078 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-run\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.382378 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-locks-cinder\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.382853 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-locks-brick\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.383131 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-machine-id\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: 
\"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.383702 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-scripts\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.384661 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pz6d\" (UniqueName: \"kubernetes.io/projected/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-kube-api-access-7pz6d\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.384879 master-0 kubenswrapper[18707]: I0320 09:06:52.384728 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-combined-ca-bundle\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.385709 master-0 kubenswrapper[18707]: I0320 09:06:52.385215 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-sys\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.385709 master-0 kubenswrapper[18707]: I0320 09:06:52.385339 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-config-data\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: 
\"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.385709 master-0 kubenswrapper[18707]: I0320 09:06:52.385413 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-config-data-custom\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.393879 master-0 kubenswrapper[18707]: I0320 09:06:52.385867 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-nvme\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.393879 master-0 kubenswrapper[18707]: I0320 09:06:52.385911 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-878x5\" (UniqueName: \"kubernetes.io/projected/dd73e9d3-234d-4470-b5a4-9abf382a12d7-kube-api-access-878x5\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.393879 master-0 kubenswrapper[18707]: I0320 09:06:52.386424 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-config-data-custom\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.393879 master-0 kubenswrapper[18707]: I0320 09:06:52.387811 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.400642 master-0 kubenswrapper[18707]: I0320 09:06:52.396605 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-iscsi\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.400642 master-0 kubenswrapper[18707]: I0320 09:06:52.397393 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-sys\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.401219 master-0 kubenswrapper[18707]: I0320 09:06:52.401103 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-nvme\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.447405 master-0 kubenswrapper[18707]: I0320 09:06:52.402742 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-combined-ca-bundle\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.447405 master-0 kubenswrapper[18707]: I0320 09:06:52.409245 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-c920a-backup-config-data" Mar 20 09:06:52.447405 master-0 kubenswrapper[18707]: I0320 09:06:52.409296 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-555d58bf7b-nplsv"] Mar 20 09:06:52.447405 
master-0 kubenswrapper[18707]: I0320 09:06:52.410471 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-scripts\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.447405 master-0 kubenswrapper[18707]: I0320 09:06:52.428027 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.447405 master-0 kubenswrapper[18707]: I0320 09:06:52.437830 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 09:06:52.447405 master-0 kubenswrapper[18707]: I0320 09:06:52.438094 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 09:06:52.447405 master-0 kubenswrapper[18707]: I0320 09:06:52.438276 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-backup-0"] Mar 20 09:06:52.447405 master-0 kubenswrapper[18707]: I0320 09:06:52.438430 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 09:06:52.447405 master-0 kubenswrapper[18707]: I0320 09:06:52.440505 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pz6d\" (UniqueName: \"kubernetes.io/projected/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-kube-api-access-7pz6d\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.447405 master-0 kubenswrapper[18707]: I0320 09:06:52.440559 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5987d8f9fd-q4m4b" event={"ID":"60a95766-91c5-4eff-bb31-ade533ae6a4a","Type":"ContainerStarted","Data":"b282e5e79c4d227a02f4edb890669f53f556d4f907a21d4996d9f2816efed1f7"} Mar 20 
09:06:52.447405 master-0 kubenswrapper[18707]: I0320 09:06:52.440590 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5987d8f9fd-q4m4b" event={"ID":"60a95766-91c5-4eff-bb31-ade533ae6a4a","Type":"ContainerStarted","Data":"642f012b144da60a3370cfe92074002405ae375faa03df5ec3a43d5ba912f3ad"} Mar 20 09:06:52.447405 master-0 kubenswrapper[18707]: I0320 09:06:52.441085 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5987d8f9fd-q4m4b" Mar 20 09:06:52.466251 master-0 kubenswrapper[18707]: I0320 09:06:52.455346 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-878x5\" (UniqueName: \"kubernetes.io/projected/dd73e9d3-234d-4470-b5a4-9abf382a12d7-kube-api-access-878x5\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.466251 master-0 kubenswrapper[18707]: I0320 09:06:52.459897 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-config-data-custom\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.466251 master-0 kubenswrapper[18707]: I0320 09:06:52.460216 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-combined-ca-bundle\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.466251 master-0 kubenswrapper[18707]: I0320 09:06:52.462569 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-config-data-custom\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.466251 master-0 kubenswrapper[18707]: I0320 09:06:52.464432 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-config-data\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.466251 master-0 kubenswrapper[18707]: I0320 09:06:52.464593 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-scripts\") pod \"cinder-c920a-scheduler-0\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.491350 master-0 kubenswrapper[18707]: I0320 09:06:52.485388 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-config-data\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.491350 master-0 kubenswrapper[18707]: I0320 09:06:52.486947 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-555d58bf7b-nplsv"] Mar 20 09:06:52.491350 master-0 kubenswrapper[18707]: I0320 09:06:52.489096 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-config\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.491350 master-0 kubenswrapper[18707]: I0320 09:06:52.489149 18707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-dev\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.503437 master-0 kubenswrapper[18707]: I0320 09:06:52.502437 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-machine-id\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.503437 master-0 kubenswrapper[18707]: I0320 09:06:52.502575 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-config-data\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.503437 master-0 kubenswrapper[18707]: I0320 09:06:52.502638 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-lib-modules\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.503437 master-0 kubenswrapper[18707]: I0320 09:06:52.502666 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-lib-cinder\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.503437 master-0 kubenswrapper[18707]: I0320 09:06:52.502753 
18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stl6s\" (UniqueName: \"kubernetes.io/projected/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-kube-api-access-stl6s\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.503437 master-0 kubenswrapper[18707]: I0320 09:06:52.502793 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-locks-brick\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.503437 master-0 kubenswrapper[18707]: I0320 09:06:52.503218 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-combined-ca-bundle\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.503437 master-0 kubenswrapper[18707]: I0320 09:06:52.503298 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-scripts\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.503863 master-0 kubenswrapper[18707]: I0320 09:06:52.503651 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-locks-cinder\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.516221 
master-0 kubenswrapper[18707]: I0320 09:06:52.504608 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-httpd-config\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.516221 master-0 kubenswrapper[18707]: I0320 09:06:52.504971 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-nvme\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.516221 master-0 kubenswrapper[18707]: I0320 09:06:52.505307 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkskk\" (UniqueName: \"kubernetes.io/projected/48c5811a-c534-461a-9cc1-2f35f6b6a43b-kube-api-access-pkskk\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.516221 master-0 kubenswrapper[18707]: I0320 09:06:52.505399 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-iscsi\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.516221 master-0 kubenswrapper[18707]: I0320 09:06:52.505427 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-combined-ca-bundle\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " 
pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.516221 master-0 kubenswrapper[18707]: I0320 09:06:52.505585 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-run\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.516221 master-0 kubenswrapper[18707]: I0320 09:06:52.505644 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-sys\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.516221 master-0 kubenswrapper[18707]: I0320 09:06:52.505709 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-ovndb-tls-certs\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.516221 master-0 kubenswrapper[18707]: I0320 09:06:52.505808 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-config-data-custom\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.565239 master-0 kubenswrapper[18707]: I0320 09:06:52.563664 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bcb7479f5-z74kg"] Mar 20 09:06:52.588350 master-0 kubenswrapper[18707]: I0320 09:06:52.586242 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cd5f68b57-cbttm"] Mar 20 
09:06:52.588350 master-0 kubenswrapper[18707]: I0320 09:06:52.588162 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.593476 master-0 kubenswrapper[18707]: I0320 09:06:52.592889 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608584 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkskk\" (UniqueName: \"kubernetes.io/projected/48c5811a-c534-461a-9cc1-2f35f6b6a43b-kube-api-access-pkskk\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608643 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-iscsi\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608665 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-combined-ca-bundle\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608698 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-run\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 
kubenswrapper[18707]: I0320 09:06:52.608719 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-sys\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608746 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-ovndb-tls-certs\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608781 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-config-data-custom\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608834 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-config\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608856 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-dev\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608895 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-machine-id\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608917 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-config-data\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608940 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-lib-modules\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608959 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-lib-cinder\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608982 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stl6s\" (UniqueName: \"kubernetes.io/projected/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-kube-api-access-stl6s\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.608998 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-locks-brick\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.609026 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-combined-ca-bundle\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.609043 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-scripts\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.609082 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-locks-cinder\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.609100 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-httpd-config\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.609116 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-nvme\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.609246 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-nvme\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.610259 master-0 kubenswrapper[18707]: I0320 09:06:52.609523 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-iscsi\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.621205 master-0 kubenswrapper[18707]: I0320 09:06:52.611247 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-sys\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.621205 master-0 kubenswrapper[18707]: I0320 09:06:52.611345 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-run\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.621205 master-0 kubenswrapper[18707]: I0320 09:06:52.611414 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-locks-brick\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " 
pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.621205 master-0 kubenswrapper[18707]: I0320 09:06:52.611452 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-lib-modules\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.621205 master-0 kubenswrapper[18707]: I0320 09:06:52.611507 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-lib-cinder\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.621205 master-0 kubenswrapper[18707]: I0320 09:06:52.612304 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-dev\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.621205 master-0 kubenswrapper[18707]: I0320 09:06:52.617422 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-machine-id\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.621205 master-0 kubenswrapper[18707]: I0320 09:06:52.617543 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-locks-cinder\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.622799 master-0 kubenswrapper[18707]: I0320 09:06:52.621821 
18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cd5f68b57-cbttm"] Mar 20 09:06:52.622799 master-0 kubenswrapper[18707]: I0320 09:06:52.622738 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-combined-ca-bundle\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.623591 master-0 kubenswrapper[18707]: I0320 09:06:52.623382 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-httpd-config\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.623591 master-0 kubenswrapper[18707]: I0320 09:06:52.623559 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-config-data-custom\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.637694 master-0 kubenswrapper[18707]: I0320 09:06:52.624075 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-config\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.637694 master-0 kubenswrapper[18707]: I0320 09:06:52.627833 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-combined-ca-bundle\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " 
pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.637694 master-0 kubenswrapper[18707]: I0320 09:06:52.635136 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-scripts\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.637694 master-0 kubenswrapper[18707]: I0320 09:06:52.636054 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-ovndb-tls-certs\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.643631 master-0 kubenswrapper[18707]: I0320 09:06:52.643577 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkskk\" (UniqueName: \"kubernetes.io/projected/48c5811a-c534-461a-9cc1-2f35f6b6a43b-kube-api-access-pkskk\") pod \"neutron-555d58bf7b-nplsv\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.657220 master-0 kubenswrapper[18707]: I0320 09:06:52.649788 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:06:52.657220 master-0 kubenswrapper[18707]: I0320 09:06:52.653474 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-config-data\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.657220 master-0 kubenswrapper[18707]: I0320 09:06:52.653935 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5987d8f9fd-q4m4b" podStartSLOduration=2.653907308 podStartE2EDuration="2.653907308s" podCreationTimestamp="2026-03-20 09:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:52.651071507 +0000 UTC m=+1557.807251873" watchObservedRunningTime="2026-03-20 09:06:52.653907308 +0000 UTC m=+1557.810087664" Mar 20 09:06:52.677941 master-0 kubenswrapper[18707]: I0320 09:06:52.674451 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stl6s\" (UniqueName: \"kubernetes.io/projected/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-kube-api-access-stl6s\") pod \"cinder-c920a-backup-0\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.722355 master-0 kubenswrapper[18707]: I0320 09:06:52.712404 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c920a-api-0"] Mar 20 09:06:52.722355 master-0 kubenswrapper[18707]: I0320 09:06:52.712762 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-ovsdbserver-sb\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " 
pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.722355 master-0 kubenswrapper[18707]: I0320 09:06:52.712892 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz429\" (UniqueName: \"kubernetes.io/projected/4428e0d9-0da9-4aa4-8422-8baa68054f53-kube-api-access-lz429\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.722355 master-0 kubenswrapper[18707]: I0320 09:06:52.712959 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-edpm-a\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.722355 master-0 kubenswrapper[18707]: I0320 09:06:52.712999 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-dns-swift-storage-0\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.722355 master-0 kubenswrapper[18707]: I0320 09:06:52.713034 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-ovsdbserver-nb\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.722355 master-0 kubenswrapper[18707]: I0320 09:06:52.713062 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b\" (UniqueName: 
\"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-edpm-b\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.722355 master-0 kubenswrapper[18707]: I0320 09:06:52.713100 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-config\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.722355 master-0 kubenswrapper[18707]: I0320 09:06:52.713714 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-dns-svc\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.725765 master-0 kubenswrapper[18707]: I0320 09:06:52.724406 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:06:52.746212 master-0 kubenswrapper[18707]: I0320 09:06:52.726307 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.746212 master-0 kubenswrapper[18707]: I0320 09:06:52.738071 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-c920a-api-config-data" Mar 20 09:06:52.809512 master-0 kubenswrapper[18707]: I0320 09:06:52.807718 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-api-0"] Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.817650 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-config-data-custom\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.817802 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmncv\" (UniqueName: \"kubernetes.io/projected/166568a4-b9dc-4ebc-899f-f5ef2ede8598-kube-api-access-bmncv\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.817838 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-dns-svc\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.817865 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/166568a4-b9dc-4ebc-899f-f5ef2ede8598-etc-machine-id\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " 
pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.817899 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-scripts\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.817952 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-combined-ca-bundle\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.817976 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-ovsdbserver-sb\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.818002 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/166568a4-b9dc-4ebc-899f-f5ef2ede8598-logs\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.818038 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz429\" (UniqueName: \"kubernetes.io/projected/4428e0d9-0da9-4aa4-8422-8baa68054f53-kube-api-access-lz429\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " 
pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.818065 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-edpm-a\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.818083 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-dns-swift-storage-0\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.818106 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-ovsdbserver-nb\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.818123 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-edpm-b\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.818143 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-config-data\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " 
pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.818160 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-config\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.819572 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-dns-svc\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.820609 master-0 kubenswrapper[18707]: I0320 09:06:52.820345 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-ovsdbserver-sb\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.827254 master-0 kubenswrapper[18707]: I0320 09:06:52.825431 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-edpm-a\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.848648 master-0 kubenswrapper[18707]: I0320 09:06:52.845964 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-edpm-b\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.848648 master-0 
kubenswrapper[18707]: I0320 09:06:52.846680 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-ovsdbserver-nb\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.848648 master-0 kubenswrapper[18707]: I0320 09:06:52.848548 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-dns-swift-storage-0\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.849830 master-0 kubenswrapper[18707]: I0320 09:06:52.848929 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-config\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:52.935248 master-0 kubenswrapper[18707]: I0320 09:06:52.932287 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmncv\" (UniqueName: \"kubernetes.io/projected/166568a4-b9dc-4ebc-899f-f5ef2ede8598-kube-api-access-bmncv\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.935248 master-0 kubenswrapper[18707]: I0320 09:06:52.932463 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/166568a4-b9dc-4ebc-899f-f5ef2ede8598-etc-machine-id\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.935248 master-0 kubenswrapper[18707]: I0320 
09:06:52.932516 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-scripts\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.935248 master-0 kubenswrapper[18707]: I0320 09:06:52.932619 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-combined-ca-bundle\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.935248 master-0 kubenswrapper[18707]: I0320 09:06:52.932721 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/166568a4-b9dc-4ebc-899f-f5ef2ede8598-logs\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.935248 master-0 kubenswrapper[18707]: I0320 09:06:52.932965 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-config-data\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.935248 master-0 kubenswrapper[18707]: I0320 09:06:52.933142 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-config-data-custom\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.936131 master-0 kubenswrapper[18707]: I0320 09:06:52.936098 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/166568a4-b9dc-4ebc-899f-f5ef2ede8598-logs\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.936808 master-0 kubenswrapper[18707]: I0320 09:06:52.936791 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/166568a4-b9dc-4ebc-899f-f5ef2ede8598-etc-machine-id\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.982730 master-0 kubenswrapper[18707]: I0320 09:06:52.970858 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c920a-backup-0" Mar 20 09:06:52.983086 master-0 kubenswrapper[18707]: I0320 09:06:52.973865 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-combined-ca-bundle\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.983235 master-0 kubenswrapper[18707]: I0320 09:06:52.977465 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-config-data\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:52.998250 master-0 kubenswrapper[18707]: I0320 09:06:52.990106 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz429\" (UniqueName: \"kubernetes.io/projected/4428e0d9-0da9-4aa4-8422-8baa68054f53-kube-api-access-lz429\") pod \"dnsmasq-dns-7cd5f68b57-cbttm\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:53.019067 master-0 kubenswrapper[18707]: I0320 09:06:53.018997 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-config-data-custom\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:53.023739 master-0 kubenswrapper[18707]: I0320 09:06:53.023704 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-scripts\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:53.104448 master-0 kubenswrapper[18707]: I0320 09:06:53.104143 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:06:53.123252 master-0 kubenswrapper[18707]: I0320 09:06:53.122870 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmncv\" (UniqueName: \"kubernetes.io/projected/166568a4-b9dc-4ebc-899f-f5ef2ede8598-kube-api-access-bmncv\") pod \"cinder-c920a-api-0\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:06:53.149206 master-0 kubenswrapper[18707]: I0320 09:06:53.148576 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-api-0" Mar 20 09:06:53.216004 master-0 kubenswrapper[18707]: I0320 09:06:53.215818 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bcb7479f5-z74kg"] Mar 20 09:06:53.453817 master-0 kubenswrapper[18707]: I0320 09:06:53.453745 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" event={"ID":"1019781e-d4ae-4e36-8e6d-761b752b1aeb","Type":"ContainerStarted","Data":"26f99201bdb884e0419158dd5c806db5277595598d280ad97e65f2ea4a2e0d09"} Mar 20 09:06:53.513524 master-0 kubenswrapper[18707]: W0320 09:06:53.513333 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd73e9d3_234d_4470_b5a4_9abf382a12d7.slice/crio-6ae22b9f66da8d403dd2fa33209f3211afecbc48a42e679878c7895133fd3b4f WatchSource:0}: Error finding container 6ae22b9f66da8d403dd2fa33209f3211afecbc48a42e679878c7895133fd3b4f: Status 404 returned error can't find the container with id 6ae22b9f66da8d403dd2fa33209f3211afecbc48a42e679878c7895133fd3b4f Mar 20 09:06:53.518700 master-0 kubenswrapper[18707]: I0320 09:06:53.517250 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-volume-lvm-iscsi-0"] Mar 20 09:06:53.692543 master-0 kubenswrapper[18707]: I0320 09:06:53.692485 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-555d58bf7b-nplsv"] Mar 20 09:06:53.714790 master-0 kubenswrapper[18707]: W0320 09:06:53.714737 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48c5811a_c534_461a_9cc1_2f35f6b6a43b.slice/crio-a9d9d13423ecb8693cb1c4426f89b105f400046abcfd59780ef658cdcc1afc48 WatchSource:0}: Error finding container a9d9d13423ecb8693cb1c4426f89b105f400046abcfd59780ef658cdcc1afc48: Status 404 returned error can't find the container with id 
a9d9d13423ecb8693cb1c4426f89b105f400046abcfd59780ef658cdcc1afc48
Mar 20 09:06:53.820988 master-0 kubenswrapper[18707]: I0320 09:06:53.820947 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-scheduler-0"]
Mar 20 09:06:53.917250 master-0 kubenswrapper[18707]: I0320 09:06:53.916999 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-api-0"]
Mar 20 09:06:53.943440 master-0 kubenswrapper[18707]: W0320 09:06:53.943387 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod166568a4_b9dc_4ebc_899f_f5ef2ede8598.slice/crio-ac0040764854ebb9e2035ac069bcc511a323c35a8d7e9703fc6fecf3fddaae23 WatchSource:0}: Error finding container ac0040764854ebb9e2035ac069bcc511a323c35a8d7e9703fc6fecf3fddaae23: Status 404 returned error can't find the container with id ac0040764854ebb9e2035ac069bcc511a323c35a8d7e9703fc6fecf3fddaae23
Mar 20 09:06:54.037464 master-0 kubenswrapper[18707]: I0320 09:06:54.037409 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cd5f68b57-cbttm"]
Mar 20 09:06:54.141613 master-0 kubenswrapper[18707]: I0320 09:06:54.141509 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-backup-0"]
Mar 20 09:06:54.469150 master-0 kubenswrapper[18707]: I0320 09:06:54.469096 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-backup-0" event={"ID":"e20cdef0-3262-4c1b-b699-e5a4ff8270dd","Type":"ContainerStarted","Data":"e8840dce705213d366ec3d9594ab15da5694a6c1abd5ef298d1a06ea5704d17b"}
Mar 20 09:06:54.475684 master-0 kubenswrapper[18707]: I0320 09:06:54.475554 18707 generic.go:334] "Generic (PLEG): container finished" podID="1019781e-d4ae-4e36-8e6d-761b752b1aeb" containerID="7940124f16d35ec2013a206a3b4959a531ca35f682063c296b20ab9f061cd94b" exitCode=0
Mar 20 09:06:54.475739 master-0 kubenswrapper[18707]: I0320 09:06:54.475678 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" event={"ID":"1019781e-d4ae-4e36-8e6d-761b752b1aeb","Type":"ContainerDied","Data":"7940124f16d35ec2013a206a3b4959a531ca35f682063c296b20ab9f061cd94b"}
Mar 20 09:06:54.479605 master-0 kubenswrapper[18707]: I0320 09:06:54.479543 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" event={"ID":"dd73e9d3-234d-4470-b5a4-9abf382a12d7","Type":"ContainerStarted","Data":"6ae22b9f66da8d403dd2fa33209f3211afecbc48a42e679878c7895133fd3b4f"}
Mar 20 09:06:54.480790 master-0 kubenswrapper[18707]: I0320 09:06:54.480762 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-scheduler-0" event={"ID":"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf","Type":"ContainerStarted","Data":"a8fe075204159a2e6dec36a1eb418825fdadf6828c962fb82e5614e792d9d2c7"}
Mar 20 09:06:54.487612 master-0 kubenswrapper[18707]: I0320 09:06:54.486831 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-555d58bf7b-nplsv" event={"ID":"48c5811a-c534-461a-9cc1-2f35f6b6a43b","Type":"ContainerStarted","Data":"92aef4a1e7ac42f70a425304674348620192c74ae72ba60e80e61452607b81ec"}
Mar 20 09:06:54.487612 master-0 kubenswrapper[18707]: I0320 09:06:54.486903 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-555d58bf7b-nplsv" event={"ID":"48c5811a-c534-461a-9cc1-2f35f6b6a43b","Type":"ContainerStarted","Data":"46f83be4dcad0c9ec2646ca57056a179e12d5be75f45fb0f9d9f3db320d4583c"}
Mar 20 09:06:54.487612 master-0 kubenswrapper[18707]: I0320 09:06:54.486918 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-555d58bf7b-nplsv" event={"ID":"48c5811a-c534-461a-9cc1-2f35f6b6a43b","Type":"ContainerStarted","Data":"a9d9d13423ecb8693cb1c4426f89b105f400046abcfd59780ef658cdcc1afc48"}
Mar 20 09:06:54.487612 master-0 kubenswrapper[18707]: I0320 09:06:54.487073 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-555d58bf7b-nplsv"
Mar 20 09:06:54.490406 master-0 kubenswrapper[18707]: I0320 09:06:54.489484 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-api-0" event={"ID":"166568a4-b9dc-4ebc-899f-f5ef2ede8598","Type":"ContainerStarted","Data":"ac0040764854ebb9e2035ac069bcc511a323c35a8d7e9703fc6fecf3fddaae23"}
Mar 20 09:06:54.492481 master-0 kubenswrapper[18707]: I0320 09:06:54.492338 18707 generic.go:334] "Generic (PLEG): container finished" podID="4428e0d9-0da9-4aa4-8422-8baa68054f53" containerID="83d1b48c509ce4f31cd1091e9d10c0ec9cca32614658750f6cffebdf56b92619" exitCode=0
Mar 20 09:06:54.494352 master-0 kubenswrapper[18707]: I0320 09:06:54.494316 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" event={"ID":"4428e0d9-0da9-4aa4-8422-8baa68054f53","Type":"ContainerDied","Data":"83d1b48c509ce4f31cd1091e9d10c0ec9cca32614658750f6cffebdf56b92619"}
Mar 20 09:06:54.494499 master-0 kubenswrapper[18707]: I0320 09:06:54.494354 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" event={"ID":"4428e0d9-0da9-4aa4-8422-8baa68054f53","Type":"ContainerStarted","Data":"064a3ef553ed7694a421ad2695f971db9b8daecf098d7155dfed831c8060c5a9"}
Mar 20 09:06:54.941255 master-0 kubenswrapper[18707]: I0320 09:06:54.941163 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j"
Mar 20 09:06:54.954339 master-0 kubenswrapper[18707]: I0320 09:06:54.954222 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg"
Mar 20 09:06:54.999985 master-0 kubenswrapper[18707]: I0320 09:06:54.999900 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-555d58bf7b-nplsv" podStartSLOduration=2.999876326 podStartE2EDuration="2.999876326s" podCreationTimestamp="2026-03-20 09:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:54.587923645 +0000 UTC m=+1559.744104021" watchObservedRunningTime="2026-03-20 09:06:54.999876326 +0000 UTC m=+1560.156056702"
Mar 20 09:06:55.056689 master-0 kubenswrapper[18707]: I0320 09:06:55.052583 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-ovsdbserver-sb\") pod \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") "
Mar 20 09:06:55.056689 master-0 kubenswrapper[18707]: I0320 09:06:55.052644 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-dns-swift-storage-0\") pod \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") "
Mar 20 09:06:55.056689 master-0 kubenswrapper[18707]: I0320 09:06:55.052716 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-edpm-a\") pod \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") "
Mar 20 09:06:55.056689 master-0 kubenswrapper[18707]: I0320 09:06:55.053142 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-config\") pod \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") "
Mar 20 09:06:55.056689 master-0 kubenswrapper[18707]: I0320 09:06:55.053238 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-ovsdbserver-nb\") pod \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") "
Mar 20 09:06:55.056689 master-0 kubenswrapper[18707]: I0320 09:06:55.053351 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm75l\" (UniqueName: \"kubernetes.io/projected/1019781e-d4ae-4e36-8e6d-761b752b1aeb-kube-api-access-jm75l\") pod \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") "
Mar 20 09:06:55.056689 master-0 kubenswrapper[18707]: I0320 09:06:55.053423 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-edpm-b\") pod \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") "
Mar 20 09:06:55.056689 master-0 kubenswrapper[18707]: I0320 09:06:55.053463 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-dns-svc\") pod \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\" (UID: \"1019781e-d4ae-4e36-8e6d-761b752b1aeb\") "
Mar 20 09:06:55.080667 master-0 kubenswrapper[18707]: I0320 09:06:55.080508 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1019781e-d4ae-4e36-8e6d-761b752b1aeb-kube-api-access-jm75l" (OuterVolumeSpecName: "kube-api-access-jm75l") pod "1019781e-d4ae-4e36-8e6d-761b752b1aeb" (UID: "1019781e-d4ae-4e36-8e6d-761b752b1aeb"). InnerVolumeSpecName "kube-api-access-jm75l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:06:55.080896 master-0 kubenswrapper[18707]: I0320 09:06:55.080759 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-edpm-b" (OuterVolumeSpecName: "edpm-b") pod "1019781e-d4ae-4e36-8e6d-761b752b1aeb" (UID: "1019781e-d4ae-4e36-8e6d-761b752b1aeb"). InnerVolumeSpecName "edpm-b". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:55.093112 master-0 kubenswrapper[18707]: I0320 09:06:55.090174 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-config" (OuterVolumeSpecName: "config") pod "1019781e-d4ae-4e36-8e6d-761b752b1aeb" (UID: "1019781e-d4ae-4e36-8e6d-761b752b1aeb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:55.093112 master-0 kubenswrapper[18707]: I0320 09:06:55.091815 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1019781e-d4ae-4e36-8e6d-761b752b1aeb" (UID: "1019781e-d4ae-4e36-8e6d-761b752b1aeb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:55.095719 master-0 kubenswrapper[18707]: I0320 09:06:55.094676 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "1019781e-d4ae-4e36-8e6d-761b752b1aeb" (UID: "1019781e-d4ae-4e36-8e6d-761b752b1aeb"). InnerVolumeSpecName "edpm-a". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:55.106584 master-0 kubenswrapper[18707]: I0320 09:06:55.106543 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1019781e-d4ae-4e36-8e6d-761b752b1aeb" (UID: "1019781e-d4ae-4e36-8e6d-761b752b1aeb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:55.174075 master-0 kubenswrapper[18707]: I0320 09:06:55.138653 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1019781e-d4ae-4e36-8e6d-761b752b1aeb" (UID: "1019781e-d4ae-4e36-8e6d-761b752b1aeb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:55.174075 master-0 kubenswrapper[18707]: I0320 09:06:55.157839 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1019781e-d4ae-4e36-8e6d-761b752b1aeb" (UID: "1019781e-d4ae-4e36-8e6d-761b752b1aeb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:06:55.174075 master-0 kubenswrapper[18707]: I0320 09:06:55.163216 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-config\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:55.174075 master-0 kubenswrapper[18707]: I0320 09:06:55.163256 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:55.174075 master-0 kubenswrapper[18707]: I0320 09:06:55.163273 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm75l\" (UniqueName: \"kubernetes.io/projected/1019781e-d4ae-4e36-8e6d-761b752b1aeb-kube-api-access-jm75l\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:55.174075 master-0 kubenswrapper[18707]: I0320 09:06:55.163285 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-edpm-b\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:55.174075 master-0 kubenswrapper[18707]: I0320 09:06:55.163293 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:55.174075 master-0 kubenswrapper[18707]: I0320 09:06:55.163302 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:55.174075 master-0 kubenswrapper[18707]: I0320 09:06:55.163314 18707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:55.174075 master-0 kubenswrapper[18707]: I0320 09:06:55.163325 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/1019781e-d4ae-4e36-8e6d-761b752b1aeb-edpm-a\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:55.528661 master-0 kubenswrapper[18707]: I0320 09:06:55.519797 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg"
Mar 20 09:06:55.528661 master-0 kubenswrapper[18707]: I0320 09:06:55.520899 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bcb7479f5-z74kg" event={"ID":"1019781e-d4ae-4e36-8e6d-761b752b1aeb","Type":"ContainerDied","Data":"26f99201bdb884e0419158dd5c806db5277595598d280ad97e65f2ea4a2e0d09"}
Mar 20 09:06:55.528661 master-0 kubenswrapper[18707]: I0320 09:06:55.520942 18707 scope.go:117] "RemoveContainer" containerID="7940124f16d35ec2013a206a3b4959a531ca35f682063c296b20ab9f061cd94b"
Mar 20 09:06:55.548543 master-0 kubenswrapper[18707]: I0320 09:06:55.538634 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" event={"ID":"dd73e9d3-234d-4470-b5a4-9abf382a12d7","Type":"ContainerStarted","Data":"6f8663594828bb76bc1343794734214f71dce271486b5a2b4189d57b43da0130"}
Mar 20 09:06:55.548543 master-0 kubenswrapper[18707]: I0320 09:06:55.538686 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" event={"ID":"dd73e9d3-234d-4470-b5a4-9abf382a12d7","Type":"ContainerStarted","Data":"ae05bc92d214153ce18f48c52230166a2e389f20f1e35ff98414dcb3986ae81e"}
Mar 20 09:06:55.548543 master-0 kubenswrapper[18707]: I0320 09:06:55.544749 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-api-0" event={"ID":"166568a4-b9dc-4ebc-899f-f5ef2ede8598","Type":"ContainerStarted","Data":"9446f8c6bf87d8083457e47dbb4f959cf9a6b9a980b695537cdfe22ffc487e04"}
Mar 20 09:06:55.598201 master-0 kubenswrapper[18707]: I0320 09:06:55.595344 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" event={"ID":"4428e0d9-0da9-4aa4-8422-8baa68054f53","Type":"ContainerStarted","Data":"b50f09fc604465b033012e195b4d71b6f8106ade9dd4d689d929e810eae17345"}
Mar 20 09:06:55.598201 master-0 kubenswrapper[18707]: I0320 09:06:55.595403 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm"
Mar 20 09:06:55.659429 master-0 kubenswrapper[18707]: I0320 09:06:55.659367 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bcb7479f5-z74kg"]
Mar 20 09:06:55.675328 master-0 kubenswrapper[18707]: I0320 09:06:55.675290 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bcb7479f5-z74kg"]
Mar 20 09:06:55.680945 master-0 kubenswrapper[18707]: I0320 09:06:55.680882 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" podStartSLOduration=2.72738639 podStartE2EDuration="3.680864507s" podCreationTimestamp="2026-03-20 09:06:52 +0000 UTC" firstStartedPulling="2026-03-20 09:06:53.522364445 +0000 UTC m=+1558.678544801" lastFinishedPulling="2026-03-20 09:06:54.475842562 +0000 UTC m=+1559.632022918" observedRunningTime="2026-03-20 09:06:55.626149483 +0000 UTC m=+1560.782329829" watchObservedRunningTime="2026-03-20 09:06:55.680864507 +0000 UTC m=+1560.837044863"
Mar 20 09:06:55.690412 master-0 kubenswrapper[18707]: I0320 09:06:55.690351 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" podStartSLOduration=3.690336477 podStartE2EDuration="3.690336477s" podCreationTimestamp="2026-03-20 09:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:55.677231303 +0000 UTC m=+1560.833411669" watchObservedRunningTime="2026-03-20 09:06:55.690336477 +0000 UTC m=+1560.846516823"
Mar 20 09:06:56.396447 master-0 kubenswrapper[18707]: I0320 09:06:56.396380 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/edpm-b-provisionserver-openstackprovisionserver-6bf75656cftnmfr"
Mar 20 09:06:56.645916 master-0 kubenswrapper[18707]: I0320 09:06:56.645224 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-backup-0" event={"ID":"e20cdef0-3262-4c1b-b699-e5a4ff8270dd","Type":"ContainerStarted","Data":"76619726d4ffcbe6fbf6e87609fdd164190939a0787706efd1f18d901c02960f"}
Mar 20 09:06:56.645916 master-0 kubenswrapper[18707]: I0320 09:06:56.645303 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-backup-0" event={"ID":"e20cdef0-3262-4c1b-b699-e5a4ff8270dd","Type":"ContainerStarted","Data":"f09fd9bfa4428d92d2bcebf97d69d708bd420c16ec239cddfa8f74a6bfb8e1bb"}
Mar 20 09:06:56.681456 master-0 kubenswrapper[18707]: I0320 09:06:56.679929 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-scheduler-0" event={"ID":"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf","Type":"ContainerStarted","Data":"f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9"}
Mar 20 09:06:56.684772 master-0 kubenswrapper[18707]: I0320 09:06:56.684131 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c920a-api-0"]
Mar 20 09:06:56.701989 master-0 kubenswrapper[18707]: I0320 09:06:56.701926 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-api-0" event={"ID":"166568a4-b9dc-4ebc-899f-f5ef2ede8598","Type":"ContainerStarted","Data":"7808fbc2549cf24f811289a515af47ade434736fbad8e4f2d38f17aa639aa415"}
Mar 20 09:06:56.702795 master-0 kubenswrapper[18707]: I0320 09:06:56.702767 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-c920a-api-0"
Mar 20 09:06:56.728037 master-0 kubenswrapper[18707]: I0320 09:06:56.727964 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-c920a-backup-0" podStartSLOduration=3.743494335 podStartE2EDuration="4.727943617s" podCreationTimestamp="2026-03-20 09:06:52 +0000 UTC" firstStartedPulling="2026-03-20 09:06:54.172914645 +0000 UTC m=+1559.329095001" lastFinishedPulling="2026-03-20 09:06:55.157363927 +0000 UTC m=+1560.313544283" observedRunningTime="2026-03-20 09:06:56.688255243 +0000 UTC m=+1561.844435629" watchObservedRunningTime="2026-03-20 09:06:56.727943617 +0000 UTC m=+1561.884123973"
Mar 20 09:06:56.756408 master-0 kubenswrapper[18707]: I0320 09:06:56.756314 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-c920a-api-0" podStartSLOduration=4.756286467 podStartE2EDuration="4.756286467s" podCreationTimestamp="2026-03-20 09:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:56.739441936 +0000 UTC m=+1561.895622312" watchObservedRunningTime="2026-03-20 09:06:56.756286467 +0000 UTC m=+1561.912466823"
Mar 20 09:06:56.863573 master-0 kubenswrapper[18707]: I0320 09:06:56.863504 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/edpm-b-provisionserver-checksum-discovery-mf2jz"]
Mar 20 09:06:56.864363 master-0 kubenswrapper[18707]: E0320 09:06:56.864332 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1019781e-d4ae-4e36-8e6d-761b752b1aeb" containerName="init"
Mar 20 09:06:56.864416 master-0 kubenswrapper[18707]: I0320 09:06:56.864379 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1019781e-d4ae-4e36-8e6d-761b752b1aeb" containerName="init"
Mar 20 09:06:56.864900 master-0 kubenswrapper[18707]: I0320 09:06:56.864827 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1019781e-d4ae-4e36-8e6d-761b752b1aeb" containerName="init"
Mar 20 09:06:56.867741 master-0 kubenswrapper[18707]: I0320 09:06:56.867677 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz"
Mar 20 09:06:56.886942 master-0 kubenswrapper[18707]: I0320 09:06:56.886871 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/edpm-b-provisionserver-checksum-discovery-mf2jz"]
Mar 20 09:06:56.962258 master-0 kubenswrapper[18707]: I0320 09:06:56.960074 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dhhn\" (UniqueName: \"kubernetes.io/projected/bae03209-559e-4828-bcb6-70eb73ff61dc-kube-api-access-8dhhn\") pod \"edpm-b-provisionserver-checksum-discovery-mf2jz\" (UID: \"bae03209-559e-4828-bcb6-70eb73ff61dc\") " pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz"
Mar 20 09:06:56.962258 master-0 kubenswrapper[18707]: I0320 09:06:56.960386 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/bae03209-559e-4828-bcb6-70eb73ff61dc-image-data\") pod \"edpm-b-provisionserver-checksum-discovery-mf2jz\" (UID: \"bae03209-559e-4828-bcb6-70eb73ff61dc\") " pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz"
Mar 20 09:06:57.066458 master-0 kubenswrapper[18707]: I0320 09:06:57.064756 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/bae03209-559e-4828-bcb6-70eb73ff61dc-image-data\") pod \"edpm-b-provisionserver-checksum-discovery-mf2jz\" (UID: \"bae03209-559e-4828-bcb6-70eb73ff61dc\") " pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz"
Mar 20 09:06:57.066458 master-0 kubenswrapper[18707]: I0320 09:06:57.064911 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dhhn\" (UniqueName: \"kubernetes.io/projected/bae03209-559e-4828-bcb6-70eb73ff61dc-kube-api-access-8dhhn\") pod \"edpm-b-provisionserver-checksum-discovery-mf2jz\" (UID: \"bae03209-559e-4828-bcb6-70eb73ff61dc\") " pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz"
Mar 20 09:06:57.066458 master-0 kubenswrapper[18707]: I0320 09:06:57.065659 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/bae03209-559e-4828-bcb6-70eb73ff61dc-image-data\") pod \"edpm-b-provisionserver-checksum-discovery-mf2jz\" (UID: \"bae03209-559e-4828-bcb6-70eb73ff61dc\") " pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz"
Mar 20 09:06:57.087207 master-0 kubenswrapper[18707]: I0320 09:06:57.087095 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dhhn\" (UniqueName: \"kubernetes.io/projected/bae03209-559e-4828-bcb6-70eb73ff61dc-kube-api-access-8dhhn\") pod \"edpm-b-provisionserver-checksum-discovery-mf2jz\" (UID: \"bae03209-559e-4828-bcb6-70eb73ff61dc\") " pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz"
Mar 20 09:06:57.120206 master-0 kubenswrapper[18707]: I0320 09:06:57.119344 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1019781e-d4ae-4e36-8e6d-761b752b1aeb" path="/var/lib/kubelet/pods/1019781e-d4ae-4e36-8e6d-761b752b1aeb/volumes"
Mar 20 09:06:57.252273 master-0 kubenswrapper[18707]: I0320 09:06:57.252214 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz"
Mar 20 09:06:57.595910 master-0 kubenswrapper[18707]: I0320 09:06:57.595835 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-c920a-volume-lvm-iscsi-0"
Mar 20 09:06:57.741359 master-0 kubenswrapper[18707]: I0320 09:06:57.741288 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-scheduler-0" event={"ID":"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf","Type":"ContainerStarted","Data":"2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b"}
Mar 20 09:06:57.742919 master-0 kubenswrapper[18707]: I0320 09:06:57.742886 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-c920a-api-0" podUID="166568a4-b9dc-4ebc-899f-f5ef2ede8598" containerName="cinder-c920a-api-log" containerID="cri-o://9446f8c6bf87d8083457e47dbb4f959cf9a6b9a980b695537cdfe22ffc487e04" gracePeriod=30
Mar 20 09:06:57.743179 master-0 kubenswrapper[18707]: I0320 09:06:57.743153 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-c920a-api-0" podUID="166568a4-b9dc-4ebc-899f-f5ef2ede8598" containerName="cinder-api" containerID="cri-o://7808fbc2549cf24f811289a515af47ade434736fbad8e4f2d38f17aa639aa415" gracePeriod=30
Mar 20 09:06:57.941305 master-0 kubenswrapper[18707]: I0320 09:06:57.941248 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/edpm-b-provisionserver-checksum-discovery-mf2jz"]
Mar 20 09:06:57.950162 master-0 kubenswrapper[18707]: I0320 09:06:57.950072 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-c920a-scheduler-0" podStartSLOduration=4.911297316 podStartE2EDuration="5.95003793s" podCreationTimestamp="2026-03-20 09:06:52 +0000 UTC" firstStartedPulling="2026-03-20 09:06:53.814383159 +0000 UTC m=+1558.970563515" lastFinishedPulling="2026-03-20 09:06:54.853123773 +0000 UTC m=+1560.009304129" observedRunningTime="2026-03-20 09:06:57.927718613 +0000 UTC m=+1563.083898969" watchObservedRunningTime="2026-03-20 09:06:57.95003793 +0000 UTC m=+1563.106218286"
Mar 20 09:06:57.973533 master-0 kubenswrapper[18707]: I0320 09:06:57.973465 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-c920a-backup-0"
Mar 20 09:06:58.778741 master-0 kubenswrapper[18707]: I0320 09:06:58.778371 18707 generic.go:334] "Generic (PLEG): container finished" podID="166568a4-b9dc-4ebc-899f-f5ef2ede8598" containerID="7808fbc2549cf24f811289a515af47ade434736fbad8e4f2d38f17aa639aa415" exitCode=0
Mar 20 09:06:58.778741 master-0 kubenswrapper[18707]: I0320 09:06:58.778409 18707 generic.go:334] "Generic (PLEG): container finished" podID="166568a4-b9dc-4ebc-899f-f5ef2ede8598" containerID="9446f8c6bf87d8083457e47dbb4f959cf9a6b9a980b695537cdfe22ffc487e04" exitCode=143
Mar 20 09:06:58.778741 master-0 kubenswrapper[18707]: I0320 09:06:58.778413 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-api-0" event={"ID":"166568a4-b9dc-4ebc-899f-f5ef2ede8598","Type":"ContainerDied","Data":"7808fbc2549cf24f811289a515af47ade434736fbad8e4f2d38f17aa639aa415"}
Mar 20 09:06:58.778741 master-0 kubenswrapper[18707]: I0320 09:06:58.778475 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-api-0" event={"ID":"166568a4-b9dc-4ebc-899f-f5ef2ede8598","Type":"ContainerDied","Data":"9446f8c6bf87d8083457e47dbb4f959cf9a6b9a980b695537cdfe22ffc487e04"}
Mar 20 09:06:58.791378 master-0 kubenswrapper[18707]: I0320 09:06:58.785734 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz" event={"ID":"bae03209-559e-4828-bcb6-70eb73ff61dc","Type":"ContainerStarted","Data":"7e7cea23c7318292592a7d106763c1c985071971334264d512fa9e9df258faa2"}
Mar 20 09:06:58.791378 master-0 kubenswrapper[18707]: I0320 09:06:58.785796 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz" event={"ID":"bae03209-559e-4828-bcb6-70eb73ff61dc","Type":"ContainerStarted","Data":"05f75b7b8030eb42acaa78a590dc6a2e04425c7e8739ca2e01fff989fd0fd366"}
Mar 20 09:06:59.181109 master-0 kubenswrapper[18707]: I0320 09:06:59.181053 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c920a-api-0"
Mar 20 09:06:59.353659 master-0 kubenswrapper[18707]: I0320 09:06:59.353495 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-config-data-custom\") pod \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") "
Mar 20 09:06:59.354044 master-0 kubenswrapper[18707]: I0320 09:06:59.354028 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/166568a4-b9dc-4ebc-899f-f5ef2ede8598-logs\") pod \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") "
Mar 20 09:06:59.354293 master-0 kubenswrapper[18707]: I0320 09:06:59.354181 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmncv\" (UniqueName: \"kubernetes.io/projected/166568a4-b9dc-4ebc-899f-f5ef2ede8598-kube-api-access-bmncv\") pod \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") "
Mar 20 09:06:59.354411 master-0 kubenswrapper[18707]: I0320 09:06:59.354397 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-combined-ca-bundle\") pod \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") "
Mar 20 09:06:59.354522 master-0 kubenswrapper[18707]: I0320 09:06:59.354509 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-config-data\") pod \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") "
Mar 20 09:06:59.354629 master-0 kubenswrapper[18707]: I0320 09:06:59.354616 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/166568a4-b9dc-4ebc-899f-f5ef2ede8598-etc-machine-id\") pod \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") "
Mar 20 09:06:59.354724 master-0 kubenswrapper[18707]: I0320 09:06:59.354670 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/166568a4-b9dc-4ebc-899f-f5ef2ede8598-logs" (OuterVolumeSpecName: "logs") pod "166568a4-b9dc-4ebc-899f-f5ef2ede8598" (UID: "166568a4-b9dc-4ebc-899f-f5ef2ede8598"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:06:59.354784 master-0 kubenswrapper[18707]: I0320 09:06:59.354702 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-scripts\") pod \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\" (UID: \"166568a4-b9dc-4ebc-899f-f5ef2ede8598\") "
Mar 20 09:06:59.356088 master-0 kubenswrapper[18707]: I0320 09:06:59.356046 18707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/166568a4-b9dc-4ebc-899f-f5ef2ede8598-logs\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:59.357763 master-0 kubenswrapper[18707]: I0320 09:06:59.357574 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/166568a4-b9dc-4ebc-899f-f5ef2ede8598-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "166568a4-b9dc-4ebc-899f-f5ef2ede8598" (UID: "166568a4-b9dc-4ebc-899f-f5ef2ede8598"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 09:06:59.358602 master-0 kubenswrapper[18707]: I0320 09:06:59.358526 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-scripts" (OuterVolumeSpecName: "scripts") pod "166568a4-b9dc-4ebc-899f-f5ef2ede8598" (UID: "166568a4-b9dc-4ebc-899f-f5ef2ede8598"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:59.362795 master-0 kubenswrapper[18707]: I0320 09:06:59.362660 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/166568a4-b9dc-4ebc-899f-f5ef2ede8598-kube-api-access-bmncv" (OuterVolumeSpecName: "kube-api-access-bmncv") pod "166568a4-b9dc-4ebc-899f-f5ef2ede8598" (UID: "166568a4-b9dc-4ebc-899f-f5ef2ede8598"). InnerVolumeSpecName "kube-api-access-bmncv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:06:59.365838 master-0 kubenswrapper[18707]: I0320 09:06:59.365759 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "166568a4-b9dc-4ebc-899f-f5ef2ede8598" (UID: "166568a4-b9dc-4ebc-899f-f5ef2ede8598"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:59.388320 master-0 kubenswrapper[18707]: I0320 09:06:59.388235 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "166568a4-b9dc-4ebc-899f-f5ef2ede8598" (UID: "166568a4-b9dc-4ebc-899f-f5ef2ede8598"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:59.454215 master-0 kubenswrapper[18707]: I0320 09:06:59.454078 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-config-data" (OuterVolumeSpecName: "config-data") pod "166568a4-b9dc-4ebc-899f-f5ef2ede8598" (UID: "166568a4-b9dc-4ebc-899f-f5ef2ede8598"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:06:59.461846 master-0 kubenswrapper[18707]: I0320 09:06:59.461776 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmncv\" (UniqueName: \"kubernetes.io/projected/166568a4-b9dc-4ebc-899f-f5ef2ede8598-kube-api-access-bmncv\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:59.461846 master-0 kubenswrapper[18707]: I0320 09:06:59.461841 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:59.461846 master-0 kubenswrapper[18707]: I0320 09:06:59.461863 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-config-data\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:59.462128 master-0 kubenswrapper[18707]: I0320 09:06:59.461883 18707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/166568a4-b9dc-4ebc-899f-f5ef2ede8598-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:59.462128 master-0 kubenswrapper[18707]: I0320 09:06:59.461901 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-scripts\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:59.462128 master-0 kubenswrapper[18707]: I0320 09:06:59.461918 18707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/166568a4-b9dc-4ebc-899f-f5ef2ede8598-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 20 09:06:59.801233 master-0 kubenswrapper[18707]: I0320 09:06:59.801151 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-api-0" event={"ID":"166568a4-b9dc-4ebc-899f-f5ef2ede8598","Type":"ContainerDied","Data":"ac0040764854ebb9e2035ac069bcc511a323c35a8d7e9703fc6fecf3fddaae23"}
Mar 20 09:06:59.802223 master-0 kubenswrapper[18707]: I0320 09:06:59.801243 18707 scope.go:117] "RemoveContainer" containerID="7808fbc2549cf24f811289a515af47ade434736fbad8e4f2d38f17aa639aa415"
Mar 20 09:06:59.802223 master-0 kubenswrapper[18707]: I0320 09:06:59.801389 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c920a-api-0"
Mar 20 09:06:59.938235 master-0 kubenswrapper[18707]: I0320 09:06:59.938162 18707 scope.go:117] "RemoveContainer" containerID="9446f8c6bf87d8083457e47dbb4f959cf9a6b9a980b695537cdfe22ffc487e04"
Mar 20 09:06:59.942398 master-0 kubenswrapper[18707]: I0320 09:06:59.941359 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j"
Mar 20 09:06:59.989504 master-0 kubenswrapper[18707]: I0320 09:06:59.989436 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j"
Mar 20 09:07:00.815507 master-0 kubenswrapper[18707]: I0320 09:07:00.815454 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/edpm-a-provisionserver-openstackprovisionserver-78fd94b97-tkt7j"
Mar 20 09:07:01.255670 master-0 kubenswrapper[18707]: I0320 09:07:01.255464 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c920a-api-0"]
Mar 20 09:07:01.403856 master-0 kubenswrapper[18707]: I0320 09:07:01.403726 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c920a-api-0"]
Mar 20 09:07:02.127274 master-0 kubenswrapper[18707]: I0320 09:07:02.127157 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c920a-api-0"]
Mar 20 09:07:02.128097 master-0 kubenswrapper[18707]: E0320
09:07:02.127822 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166568a4-b9dc-4ebc-899f-f5ef2ede8598" containerName="cinder-c920a-api-log" Mar 20 09:07:02.128097 master-0 kubenswrapper[18707]: I0320 09:07:02.127843 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="166568a4-b9dc-4ebc-899f-f5ef2ede8598" containerName="cinder-c920a-api-log" Mar 20 09:07:02.128097 master-0 kubenswrapper[18707]: E0320 09:07:02.127891 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166568a4-b9dc-4ebc-899f-f5ef2ede8598" containerName="cinder-api" Mar 20 09:07:02.128097 master-0 kubenswrapper[18707]: I0320 09:07:02.127899 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="166568a4-b9dc-4ebc-899f-f5ef2ede8598" containerName="cinder-api" Mar 20 09:07:02.128361 master-0 kubenswrapper[18707]: I0320 09:07:02.128269 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="166568a4-b9dc-4ebc-899f-f5ef2ede8598" containerName="cinder-api" Mar 20 09:07:02.128361 master-0 kubenswrapper[18707]: I0320 09:07:02.128298 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="166568a4-b9dc-4ebc-899f-f5ef2ede8598" containerName="cinder-c920a-api-log" Mar 20 09:07:02.129939 master-0 kubenswrapper[18707]: I0320 09:07:02.129896 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.134968 master-0 kubenswrapper[18707]: I0320 09:07:02.134909 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 09:07:02.135269 master-0 kubenswrapper[18707]: I0320 09:07:02.135122 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-c920a-api-config-data" Mar 20 09:07:02.135338 master-0 kubenswrapper[18707]: I0320 09:07:02.135282 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 09:07:02.234065 master-0 kubenswrapper[18707]: I0320 09:07:02.233999 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22dba0f1-9027-4e9e-857b-915b805e1265-logs\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.250849 master-0 kubenswrapper[18707]: I0320 09:07:02.250771 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-api-0"] Mar 20 09:07:02.251406 master-0 kubenswrapper[18707]: I0320 09:07:02.234419 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22dba0f1-9027-4e9e-857b-915b805e1265-etc-machine-id\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.251817 master-0 kubenswrapper[18707]: I0320 09:07:02.251775 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-combined-ca-bundle\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.252067 
master-0 kubenswrapper[18707]: I0320 09:07:02.252034 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-scripts\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.252120 master-0 kubenswrapper[18707]: I0320 09:07:02.252091 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-config-data\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.252444 master-0 kubenswrapper[18707]: I0320 09:07:02.252413 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-internal-tls-certs\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.252543 master-0 kubenswrapper[18707]: I0320 09:07:02.252512 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-config-data-custom\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.252592 master-0 kubenswrapper[18707]: I0320 09:07:02.252577 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-public-tls-certs\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.252631 
master-0 kubenswrapper[18707]: I0320 09:07:02.252608 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xhnj\" (UniqueName: \"kubernetes.io/projected/22dba0f1-9027-4e9e-857b-915b805e1265-kube-api-access-2xhnj\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.356213 master-0 kubenswrapper[18707]: I0320 09:07:02.355934 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-internal-tls-certs\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.356213 master-0 kubenswrapper[18707]: I0320 09:07:02.356026 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-config-data-custom\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.356213 master-0 kubenswrapper[18707]: I0320 09:07:02.356069 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-public-tls-certs\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.356213 master-0 kubenswrapper[18707]: I0320 09:07:02.356094 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xhnj\" (UniqueName: \"kubernetes.io/projected/22dba0f1-9027-4e9e-857b-915b805e1265-kube-api-access-2xhnj\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.356213 master-0 
kubenswrapper[18707]: I0320 09:07:02.356155 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22dba0f1-9027-4e9e-857b-915b805e1265-logs\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.356213 master-0 kubenswrapper[18707]: I0320 09:07:02.356209 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22dba0f1-9027-4e9e-857b-915b805e1265-etc-machine-id\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.356704 master-0 kubenswrapper[18707]: I0320 09:07:02.356296 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-combined-ca-bundle\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.356704 master-0 kubenswrapper[18707]: I0320 09:07:02.356369 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-scripts\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.356704 master-0 kubenswrapper[18707]: I0320 09:07:02.356395 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-config-data\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.364216 master-0 kubenswrapper[18707]: I0320 09:07:02.357131 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22dba0f1-9027-4e9e-857b-915b805e1265-etc-machine-id\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.364216 master-0 kubenswrapper[18707]: I0320 09:07:02.357576 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22dba0f1-9027-4e9e-857b-915b805e1265-logs\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.364216 master-0 kubenswrapper[18707]: I0320 09:07:02.363595 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-internal-tls-certs\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.368304 master-0 kubenswrapper[18707]: I0320 09:07:02.368228 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/edpm-a-provisionserver-checksum-discovery-49xmz"] Mar 20 09:07:02.373967 master-0 kubenswrapper[18707]: I0320 09:07:02.373037 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-config-data-custom\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.373967 master-0 kubenswrapper[18707]: I0320 09:07:02.373430 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-scripts\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.373967 master-0 kubenswrapper[18707]: I0320 09:07:02.373482 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-config-data\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.373967 master-0 kubenswrapper[18707]: I0320 09:07:02.373887 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-combined-ca-bundle\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.385278 master-0 kubenswrapper[18707]: I0320 09:07:02.381153 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xhnj\" (UniqueName: \"kubernetes.io/projected/22dba0f1-9027-4e9e-857b-915b805e1265-kube-api-access-2xhnj\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.385278 master-0 kubenswrapper[18707]: I0320 09:07:02.384126 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22dba0f1-9027-4e9e-857b-915b805e1265-public-tls-certs\") pod \"cinder-c920a-api-0\" (UID: \"22dba0f1-9027-4e9e-857b-915b805e1265\") " pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.387296 master-0 kubenswrapper[18707]: I0320 09:07:02.386220 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/edpm-a-provisionserver-checksum-discovery-49xmz"] Mar 20 09:07:02.387296 master-0 kubenswrapper[18707]: I0320 09:07:02.386359 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" Mar 20 09:07:02.459371 master-0 kubenswrapper[18707]: I0320 09:07:02.459312 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fprh7\" (UniqueName: \"kubernetes.io/projected/4d44d4b7-ce01-4aa4-9155-8338ad17b404-kube-api-access-fprh7\") pod \"edpm-a-provisionserver-checksum-discovery-49xmz\" (UID: \"4d44d4b7-ce01-4aa4-9155-8338ad17b404\") " pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" Mar 20 09:07:02.459584 master-0 kubenswrapper[18707]: I0320 09:07:02.459490 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/4d44d4b7-ce01-4aa4-9155-8338ad17b404-image-data\") pod \"edpm-a-provisionserver-checksum-discovery-49xmz\" (UID: \"4d44d4b7-ce01-4aa4-9155-8338ad17b404\") " pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" Mar 20 09:07:02.461647 master-0 kubenswrapper[18707]: I0320 09:07:02.460974 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-api-0" Mar 20 09:07:02.565315 master-0 kubenswrapper[18707]: I0320 09:07:02.562709 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fprh7\" (UniqueName: \"kubernetes.io/projected/4d44d4b7-ce01-4aa4-9155-8338ad17b404-kube-api-access-fprh7\") pod \"edpm-a-provisionserver-checksum-discovery-49xmz\" (UID: \"4d44d4b7-ce01-4aa4-9155-8338ad17b404\") " pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" Mar 20 09:07:02.565315 master-0 kubenswrapper[18707]: I0320 09:07:02.563002 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/4d44d4b7-ce01-4aa4-9155-8338ad17b404-image-data\") pod \"edpm-a-provisionserver-checksum-discovery-49xmz\" (UID: \"4d44d4b7-ce01-4aa4-9155-8338ad17b404\") " pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" Mar 20 09:07:02.565315 master-0 kubenswrapper[18707]: I0320 09:07:02.564022 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/4d44d4b7-ce01-4aa4-9155-8338ad17b404-image-data\") pod \"edpm-a-provisionserver-checksum-discovery-49xmz\" (UID: \"4d44d4b7-ce01-4aa4-9155-8338ad17b404\") " pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" Mar 20 09:07:02.588851 master-0 kubenswrapper[18707]: I0320 09:07:02.587049 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fprh7\" (UniqueName: \"kubernetes.io/projected/4d44d4b7-ce01-4aa4-9155-8338ad17b404-kube-api-access-fprh7\") pod \"edpm-a-provisionserver-checksum-discovery-49xmz\" (UID: \"4d44d4b7-ce01-4aa4-9155-8338ad17b404\") " pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" Mar 20 09:07:02.651831 master-0 kubenswrapper[18707]: I0320 09:07:02.651647 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:02.713449 master-0 kubenswrapper[18707]: I0320 09:07:02.713359 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" Mar 20 09:07:02.975449 master-0 kubenswrapper[18707]: I0320 09:07:02.974492 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" podUID="dd73e9d3-234d-4470-b5a4-9abf382a12d7" containerName="cinder-volume" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 09:07:03.074303 master-0 kubenswrapper[18707]: I0320 09:07:03.074236 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-api-0"] Mar 20 09:07:03.143365 master-0 kubenswrapper[18707]: I0320 09:07:03.143295 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="166568a4-b9dc-4ebc-899f-f5ef2ede8598" path="/var/lib/kubelet/pods/166568a4-b9dc-4ebc-899f-f5ef2ede8598/volumes" Mar 20 09:07:03.144124 master-0 kubenswrapper[18707]: I0320 09:07:03.144099 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:07:03.291294 master-0 kubenswrapper[18707]: I0320 09:07:03.289833 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74db7c7d5f-t4vkw"] Mar 20 09:07:03.321818 master-0 kubenswrapper[18707]: I0320 09:07:03.301873 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" podUID="6f5fb021-477c-4a7e-8f92-224e08645060" containerName="dnsmasq-dns" containerID="cri-o://3280555efd3ec0463eb77d7ddf1f443095161c778750f56af1f757094b933059" gracePeriod=10 Mar 20 09:07:03.409661 master-0 kubenswrapper[18707]: I0320 09:07:03.407197 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:03.434441 master-0 
kubenswrapper[18707]: I0320 09:07:03.431632 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/edpm-a-provisionserver-checksum-discovery-49xmz"] Mar 20 09:07:03.507476 master-0 kubenswrapper[18707]: I0320 09:07:03.506264 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bc775ffc9-jcrc5"] Mar 20 09:07:03.516047 master-0 kubenswrapper[18707]: I0320 09:07:03.516007 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.522661 master-0 kubenswrapper[18707]: I0320 09:07:03.522620 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:03.522780 master-0 kubenswrapper[18707]: I0320 09:07:03.522738 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 20 09:07:03.523147 master-0 kubenswrapper[18707]: I0320 09:07:03.523110 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 09:07:03.523763 master-0 kubenswrapper[18707]: I0320 09:07:03.523729 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bc775ffc9-jcrc5"] Mar 20 09:07:03.564516 master-0 kubenswrapper[18707]: I0320 09:07:03.564461 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c920a-scheduler-0"] Mar 20 09:07:03.625053 master-0 kubenswrapper[18707]: I0320 09:07:03.624999 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-ovndb-tls-certs\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.625939 master-0 kubenswrapper[18707]: I0320 09:07:03.625078 18707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-public-tls-certs\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.625939 master-0 kubenswrapper[18707]: I0320 09:07:03.625121 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhmf2\" (UniqueName: \"kubernetes.io/projected/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-kube-api-access-xhmf2\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.625939 master-0 kubenswrapper[18707]: I0320 09:07:03.625154 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-combined-ca-bundle\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.625939 master-0 kubenswrapper[18707]: I0320 09:07:03.625197 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-httpd-config\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.625939 master-0 kubenswrapper[18707]: I0320 09:07:03.625249 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-config\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.625939 master-0 
kubenswrapper[18707]: I0320 09:07:03.625274 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-internal-tls-certs\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.706370 master-0 kubenswrapper[18707]: I0320 09:07:03.706310 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c920a-backup-0"] Mar 20 09:07:03.747744 master-0 kubenswrapper[18707]: I0320 09:07:03.747672 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-ovndb-tls-certs\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.748057 master-0 kubenswrapper[18707]: I0320 09:07:03.747801 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-public-tls-certs\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.748057 master-0 kubenswrapper[18707]: I0320 09:07:03.747872 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhmf2\" (UniqueName: \"kubernetes.io/projected/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-kube-api-access-xhmf2\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.748057 master-0 kubenswrapper[18707]: I0320 09:07:03.747927 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-combined-ca-bundle\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.748057 master-0 kubenswrapper[18707]: I0320 09:07:03.747975 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-httpd-config\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.748057 master-0 kubenswrapper[18707]: I0320 09:07:03.748046 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-config\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.748669 master-0 kubenswrapper[18707]: I0320 09:07:03.748085 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-internal-tls-certs\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.763170 master-0 kubenswrapper[18707]: I0320 09:07:03.756240 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-combined-ca-bundle\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.763170 master-0 kubenswrapper[18707]: I0320 09:07:03.761169 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-public-tls-certs\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.770503 master-0 kubenswrapper[18707]: I0320 09:07:03.768659 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-ovndb-tls-certs\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.770689 master-0 kubenswrapper[18707]: I0320 09:07:03.768815 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-httpd-config\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.770921 master-0 kubenswrapper[18707]: I0320 09:07:03.770752 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-internal-tls-certs\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.774692 master-0 kubenswrapper[18707]: I0320 09:07:03.774633 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-config\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.784635 master-0 kubenswrapper[18707]: I0320 09:07:03.782975 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhmf2\" (UniqueName: 
\"kubernetes.io/projected/07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9-kube-api-access-xhmf2\") pod \"neutron-6bc775ffc9-jcrc5\" (UID: \"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9\") " pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:03.916785 master-0 kubenswrapper[18707]: I0320 09:07:03.914217 18707 generic.go:334] "Generic (PLEG): container finished" podID="6f5fb021-477c-4a7e-8f92-224e08645060" containerID="3280555efd3ec0463eb77d7ddf1f443095161c778750f56af1f757094b933059" exitCode=0 Mar 20 09:07:03.916785 master-0 kubenswrapper[18707]: I0320 09:07:03.914381 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" event={"ID":"6f5fb021-477c-4a7e-8f92-224e08645060","Type":"ContainerDied","Data":"3280555efd3ec0463eb77d7ddf1f443095161c778750f56af1f757094b933059"} Mar 20 09:07:03.916785 master-0 kubenswrapper[18707]: I0320 09:07:03.916105 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-api-0" event={"ID":"22dba0f1-9027-4e9e-857b-915b805e1265","Type":"ContainerStarted","Data":"8ea939e0599d66a83a0db0c59889078bddfa89490abfb4a4fbf9e67ead84af1d"} Mar 20 09:07:03.923496 master-0 kubenswrapper[18707]: I0320 09:07:03.921290 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" event={"ID":"4d44d4b7-ce01-4aa4-9155-8338ad17b404","Type":"ContainerStarted","Data":"ace777ac4645bf819154a3e7c1bf57c4aea08bbc52fca5b1847b2780c95798bc"} Mar 20 09:07:03.923496 master-0 kubenswrapper[18707]: I0320 09:07:03.921329 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-c920a-scheduler-0" podUID="0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" containerName="cinder-scheduler" containerID="cri-o://f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9" gracePeriod=30 Mar 20 09:07:03.923496 master-0 kubenswrapper[18707]: I0320 09:07:03.921461 18707 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/cinder-c920a-scheduler-0" podUID="0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" containerName="probe" containerID="cri-o://2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b" gracePeriod=30 Mar 20 09:07:03.923496 master-0 kubenswrapper[18707]: I0320 09:07:03.921374 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" event={"ID":"4d44d4b7-ce01-4aa4-9155-8338ad17b404","Type":"ContainerStarted","Data":"d12525b1b89fdcb755cdbc04b984565ac34947d5a7c249f9b10aa2c7e2204a6c"} Mar 20 09:07:03.923496 master-0 kubenswrapper[18707]: I0320 09:07:03.921998 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-c920a-backup-0" podUID="e20cdef0-3262-4c1b-b699-e5a4ff8270dd" containerName="cinder-backup" containerID="cri-o://f09fd9bfa4428d92d2bcebf97d69d708bd420c16ec239cddfa8f74a6bfb8e1bb" gracePeriod=30 Mar 20 09:07:03.923496 master-0 kubenswrapper[18707]: I0320 09:07:03.922074 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-c920a-backup-0" podUID="e20cdef0-3262-4c1b-b699-e5a4ff8270dd" containerName="probe" containerID="cri-o://76619726d4ffcbe6fbf6e87609fdd164190939a0787706efd1f18d901c02960f" gracePeriod=30 Mar 20 09:07:03.968579 master-0 kubenswrapper[18707]: I0320 09:07:03.968539 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:07:03.983152 master-0 kubenswrapper[18707]: I0320 09:07:03.982712 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:04.170211 master-0 kubenswrapper[18707]: I0320 09:07:04.164098 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-config\") pod \"6f5fb021-477c-4a7e-8f92-224e08645060\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " Mar 20 09:07:04.170211 master-0 kubenswrapper[18707]: I0320 09:07:04.164214 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-edpm-b\") pod \"6f5fb021-477c-4a7e-8f92-224e08645060\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " Mar 20 09:07:04.170211 master-0 kubenswrapper[18707]: I0320 09:07:04.164325 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-ovsdbserver-nb\") pod \"6f5fb021-477c-4a7e-8f92-224e08645060\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " Mar 20 09:07:04.170211 master-0 kubenswrapper[18707]: I0320 09:07:04.164436 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc64x\" (UniqueName: \"kubernetes.io/projected/6f5fb021-477c-4a7e-8f92-224e08645060-kube-api-access-lc64x\") pod \"6f5fb021-477c-4a7e-8f92-224e08645060\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " Mar 20 09:07:04.170211 master-0 kubenswrapper[18707]: I0320 09:07:04.164496 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-ovsdbserver-sb\") pod \"6f5fb021-477c-4a7e-8f92-224e08645060\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " Mar 20 09:07:04.170211 master-0 kubenswrapper[18707]: I0320 09:07:04.164608 18707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-edpm-a\") pod \"6f5fb021-477c-4a7e-8f92-224e08645060\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " Mar 20 09:07:04.170211 master-0 kubenswrapper[18707]: I0320 09:07:04.164654 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-dns-swift-storage-0\") pod \"6f5fb021-477c-4a7e-8f92-224e08645060\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " Mar 20 09:07:04.170211 master-0 kubenswrapper[18707]: I0320 09:07:04.164722 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-dns-svc\") pod \"6f5fb021-477c-4a7e-8f92-224e08645060\" (UID: \"6f5fb021-477c-4a7e-8f92-224e08645060\") " Mar 20 09:07:04.194520 master-0 kubenswrapper[18707]: I0320 09:07:04.189533 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f5fb021-477c-4a7e-8f92-224e08645060-kube-api-access-lc64x" (OuterVolumeSpecName: "kube-api-access-lc64x") pod "6f5fb021-477c-4a7e-8f92-224e08645060" (UID: "6f5fb021-477c-4a7e-8f92-224e08645060"). InnerVolumeSpecName "kube-api-access-lc64x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:04.276311 master-0 kubenswrapper[18707]: I0320 09:07:04.268145 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc64x\" (UniqueName: \"kubernetes.io/projected/6f5fb021-477c-4a7e-8f92-224e08645060-kube-api-access-lc64x\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:04.356209 master-0 kubenswrapper[18707]: I0320 09:07:04.355969 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "6f5fb021-477c-4a7e-8f92-224e08645060" (UID: "6f5fb021-477c-4a7e-8f92-224e08645060"). InnerVolumeSpecName "edpm-a". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:07:04.397711 master-0 kubenswrapper[18707]: I0320 09:07:04.392823 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:04.463406 master-0 kubenswrapper[18707]: I0320 09:07:04.456910 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6f5fb021-477c-4a7e-8f92-224e08645060" (UID: "6f5fb021-477c-4a7e-8f92-224e08645060"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:07:04.516570 master-0 kubenswrapper[18707]: I0320 09:07:04.506383 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:04.531047 master-0 kubenswrapper[18707]: I0320 09:07:04.530887 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6f5fb021-477c-4a7e-8f92-224e08645060" (UID: "6f5fb021-477c-4a7e-8f92-224e08645060"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:07:04.573356 master-0 kubenswrapper[18707]: I0320 09:07:04.572920 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-edpm-b" (OuterVolumeSpecName: "edpm-b") pod "6f5fb021-477c-4a7e-8f92-224e08645060" (UID: "6f5fb021-477c-4a7e-8f92-224e08645060"). InnerVolumeSpecName "edpm-b". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:07:04.575177 master-0 kubenswrapper[18707]: I0320 09:07:04.574659 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-config" (OuterVolumeSpecName: "config") pod "6f5fb021-477c-4a7e-8f92-224e08645060" (UID: "6f5fb021-477c-4a7e-8f92-224e08645060"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:07:04.575177 master-0 kubenswrapper[18707]: I0320 09:07:04.574990 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6f5fb021-477c-4a7e-8f92-224e08645060" (UID: "6f5fb021-477c-4a7e-8f92-224e08645060"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:07:04.596242 master-0 kubenswrapper[18707]: I0320 09:07:04.595665 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6f5fb021-477c-4a7e-8f92-224e08645060" (UID: "6f5fb021-477c-4a7e-8f92-224e08645060"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:07:04.609922 master-0 kubenswrapper[18707]: I0320 09:07:04.609857 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:04.609922 master-0 kubenswrapper[18707]: I0320 09:07:04.609915 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:04.610074 master-0 kubenswrapper[18707]: I0320 09:07:04.609930 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:04.610074 master-0 kubenswrapper[18707]: I0320 09:07:04.609944 18707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:04.610074 master-0 kubenswrapper[18707]: I0320 09:07:04.609956 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f5fb021-477c-4a7e-8f92-224e08645060-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:04.761125 master-0 kubenswrapper[18707]: I0320 09:07:04.761060 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bc775ffc9-jcrc5"] Mar 20 09:07:04.947328 master-0 kubenswrapper[18707]: I0320 09:07:04.946459 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" event={"ID":"6f5fb021-477c-4a7e-8f92-224e08645060","Type":"ContainerDied","Data":"976df8f52f81ed6312e330867f8f9fc51cabb338452c293ab99e773c53477a90"} Mar 20 09:07:04.947328 master-0 kubenswrapper[18707]: I0320 09:07:04.946518 18707 scope.go:117] "RemoveContainer" containerID="3280555efd3ec0463eb77d7ddf1f443095161c778750f56af1f757094b933059" Mar 20 09:07:04.947328 master-0 kubenswrapper[18707]: I0320 09:07:04.946661 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" Mar 20 09:07:04.970821 master-0 kubenswrapper[18707]: I0320 09:07:04.966501 18707 generic.go:334] "Generic (PLEG): container finished" podID="e20cdef0-3262-4c1b-b699-e5a4ff8270dd" containerID="76619726d4ffcbe6fbf6e87609fdd164190939a0787706efd1f18d901c02960f" exitCode=0 Mar 20 09:07:04.970821 master-0 kubenswrapper[18707]: I0320 09:07:04.966581 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-backup-0" event={"ID":"e20cdef0-3262-4c1b-b699-e5a4ff8270dd","Type":"ContainerDied","Data":"76619726d4ffcbe6fbf6e87609fdd164190939a0787706efd1f18d901c02960f"} Mar 20 09:07:04.980652 master-0 kubenswrapper[18707]: I0320 09:07:04.979322 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-api-0" event={"ID":"22dba0f1-9027-4e9e-857b-915b805e1265","Type":"ContainerStarted","Data":"58f7130a9926fbf267cc00c6297fec6f07f3c4ce438e0b832be97cba8c8437c4"} Mar 20 09:07:04.983266 master-0 kubenswrapper[18707]: I0320 09:07:04.981465 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bc775ffc9-jcrc5" event={"ID":"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9","Type":"ContainerStarted","Data":"03bf558a8026d37b035d2735581645714a9ca3af527d98c6790bd81786892450"} Mar 20 09:07:05.007790 master-0 kubenswrapper[18707]: I0320 09:07:05.007762 18707 scope.go:117] "RemoveContainer" containerID="99c340d8e66ad7aa183d45533d1b40835ba835715c029a0d1a60cee94fce5093" Mar 20 09:07:05.013455 master-0 kubenswrapper[18707]: I0320 09:07:05.013394 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74db7c7d5f-t4vkw"] Mar 20 09:07:05.025871 master-0 kubenswrapper[18707]: I0320 09:07:05.025783 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74db7c7d5f-t4vkw"] Mar 20 09:07:05.128932 master-0 kubenswrapper[18707]: I0320 09:07:05.128763 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6f5fb021-477c-4a7e-8f92-224e08645060" path="/var/lib/kubelet/pods/6f5fb021-477c-4a7e-8f92-224e08645060/volumes" Mar 20 09:07:05.829550 master-0 kubenswrapper[18707]: I0320 09:07:05.828996 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:05.904156 master-0 kubenswrapper[18707]: I0320 09:07:05.904071 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-scripts\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904378 master-0 kubenswrapper[18707]: I0320 09:07:05.904246 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-run\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904416 master-0 kubenswrapper[18707]: I0320 09:07:05.904398 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-sys\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904505 master-0 kubenswrapper[18707]: I0320 09:07:05.904458 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-sys" (OuterVolumeSpecName: "sys") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:05.904550 master-0 kubenswrapper[18707]: I0320 09:07:05.904464 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-run" (OuterVolumeSpecName: "run") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:05.904550 master-0 kubenswrapper[18707]: I0320 09:07:05.904478 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-dev\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904550 master-0 kubenswrapper[18707]: I0320 09:07:05.904544 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-dev" (OuterVolumeSpecName: "dev") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:05.904637 master-0 kubenswrapper[18707]: I0320 09:07:05.904602 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-config-data\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904673 master-0 kubenswrapper[18707]: I0320 09:07:05.904659 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-locks-cinder\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904706 master-0 kubenswrapper[18707]: I0320 09:07:05.904695 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-locks-brick\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904738 master-0 kubenswrapper[18707]: I0320 09:07:05.904710 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-iscsi\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904738 master-0 kubenswrapper[18707]: I0320 09:07:05.904734 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-machine-id\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904800 master-0 kubenswrapper[18707]: I0320 09:07:05.904755 18707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-config-data-custom\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904800 master-0 kubenswrapper[18707]: I0320 09:07:05.904778 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-nvme\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904866 master-0 kubenswrapper[18707]: I0320 09:07:05.904787 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:05.904866 master-0 kubenswrapper[18707]: I0320 09:07:05.904805 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-lib-modules\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904866 master-0 kubenswrapper[18707]: I0320 09:07:05.904841 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:05.904866 master-0 kubenswrapper[18707]: I0320 09:07:05.904856 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stl6s\" (UniqueName: \"kubernetes.io/projected/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-kube-api-access-stl6s\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904989 master-0 kubenswrapper[18707]: I0320 09:07:05.904874 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:05.904989 master-0 kubenswrapper[18707]: I0320 09:07:05.904895 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-combined-ca-bundle\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.904989 master-0 kubenswrapper[18707]: I0320 09:07:05.904932 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-lib-cinder\") pod \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\" (UID: \"e20cdef0-3262-4c1b-b699-e5a4ff8270dd\") " Mar 20 09:07:05.906523 master-0 kubenswrapper[18707]: I0320 09:07:05.906469 18707 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-run\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:05.906523 master-0 kubenswrapper[18707]: I0320 09:07:05.906497 18707 reconciler_common.go:293] 
"Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-sys\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:05.906523 master-0 kubenswrapper[18707]: I0320 09:07:05.906508 18707 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-dev\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:05.906523 master-0 kubenswrapper[18707]: I0320 09:07:05.906520 18707 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:05.906797 master-0 kubenswrapper[18707]: I0320 09:07:05.906533 18707 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:05.906797 master-0 kubenswrapper[18707]: I0320 09:07:05.906556 18707 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-lib-modules\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:05.907929 master-0 kubenswrapper[18707]: I0320 09:07:05.904900 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:05.908043 master-0 kubenswrapper[18707]: I0320 09:07:05.906680 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:05.908043 master-0 kubenswrapper[18707]: I0320 09:07:05.906704 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:05.908043 master-0 kubenswrapper[18707]: I0320 09:07:05.907323 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:05.908043 master-0 kubenswrapper[18707]: I0320 09:07:05.907883 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:05.909785 master-0 kubenswrapper[18707]: I0320 09:07:05.909737 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-kube-api-access-stl6s" (OuterVolumeSpecName: "kube-api-access-stl6s") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "kube-api-access-stl6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:05.910314 master-0 kubenswrapper[18707]: I0320 09:07:05.910274 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-scripts" (OuterVolumeSpecName: "scripts") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:05.994052 master-0 kubenswrapper[18707]: I0320 09:07:05.993924 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:06.002664 master-0 kubenswrapper[18707]: I0320 09:07:06.002603 18707 generic.go:334] "Generic (PLEG): container finished" podID="e20cdef0-3262-4c1b-b699-e5a4ff8270dd" containerID="f09fd9bfa4428d92d2bcebf97d69d708bd420c16ec239cddfa8f74a6bfb8e1bb" exitCode=0 Mar 20 09:07:06.002850 master-0 kubenswrapper[18707]: I0320 09:07:06.002709 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:06.002850 master-0 kubenswrapper[18707]: I0320 09:07:06.002832 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-backup-0" event={"ID":"e20cdef0-3262-4c1b-b699-e5a4ff8270dd","Type":"ContainerDied","Data":"f09fd9bfa4428d92d2bcebf97d69d708bd420c16ec239cddfa8f74a6bfb8e1bb"} Mar 20 09:07:06.002919 master-0 kubenswrapper[18707]: I0320 09:07:06.002864 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-backup-0" event={"ID":"e20cdef0-3262-4c1b-b699-e5a4ff8270dd","Type":"ContainerDied","Data":"e8840dce705213d366ec3d9594ab15da5694a6c1abd5ef298d1a06ea5704d17b"} Mar 20 09:07:06.002919 master-0 kubenswrapper[18707]: I0320 09:07:06.002884 18707 scope.go:117] "RemoveContainer" containerID="76619726d4ffcbe6fbf6e87609fdd164190939a0787706efd1f18d901c02960f" Mar 20 09:07:06.003111 master-0 kubenswrapper[18707]: I0320 09:07:06.003088 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:06.005628 master-0 kubenswrapper[18707]: I0320 09:07:06.005582 18707 generic.go:334] "Generic (PLEG): container finished" podID="0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" containerID="2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b" exitCode=0 Mar 20 09:07:06.005628 master-0 kubenswrapper[18707]: I0320 09:07:06.005603 18707 generic.go:334] "Generic (PLEG): container finished" podID="0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" containerID="f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9" exitCode=0 Mar 20 09:07:06.005760 master-0 kubenswrapper[18707]: I0320 09:07:06.005637 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-scheduler-0" event={"ID":"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf","Type":"ContainerDied","Data":"2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b"} Mar 20 09:07:06.005760 master-0 kubenswrapper[18707]: I0320 09:07:06.005657 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-scheduler-0" event={"ID":"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf","Type":"ContainerDied","Data":"f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9"} Mar 20 09:07:06.005760 master-0 kubenswrapper[18707]: I0320 09:07:06.005668 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-scheduler-0" event={"ID":"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf","Type":"ContainerDied","Data":"a8fe075204159a2e6dec36a1eb418825fdadf6828c962fb82e5614e792d9d2c7"} Mar 20 09:07:06.008134 master-0 kubenswrapper[18707]: I0320 09:07:06.008080 18707 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.008134 master-0 kubenswrapper[18707]: I0320 09:07:06.008106 18707 reconciler_common.go:293] "Volume detached for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.008134 master-0 kubenswrapper[18707]: I0320 09:07:06.008115 18707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.008134 master-0 kubenswrapper[18707]: I0320 09:07:06.008125 18707 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-etc-nvme\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.008134 master-0 kubenswrapper[18707]: I0320 09:07:06.008135 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stl6s\" (UniqueName: \"kubernetes.io/projected/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-kube-api-access-stl6s\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.008134 master-0 kubenswrapper[18707]: I0320 09:07:06.008145 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.008419 master-0 kubenswrapper[18707]: I0320 09:07:06.008156 18707 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.008419 master-0 kubenswrapper[18707]: I0320 09:07:06.008165 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.008878 master-0 kubenswrapper[18707]: I0320 09:07:06.008849 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-6bc775ffc9-jcrc5" event={"ID":"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9","Type":"ContainerStarted","Data":"b89c948f00fcd9a4c384ebb54b6ce00dca74c767a7cf059532b134697c34b55e"} Mar 20 09:07:06.008926 master-0 kubenswrapper[18707]: I0320 09:07:06.008877 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bc775ffc9-jcrc5" event={"ID":"07e7af78-12cb-4dc3-a1db-9af7f3fdbfb9","Type":"ContainerStarted","Data":"c5617e8572619cd13224328229f57891d2b6b644105e8a8a5da2b9869919f502"} Mar 20 09:07:06.009732 master-0 kubenswrapper[18707]: I0320 09:07:06.009567 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:06.093453 master-0 kubenswrapper[18707]: I0320 09:07:06.092911 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-config-data" (OuterVolumeSpecName: "config-data") pod "e20cdef0-3262-4c1b-b699-e5a4ff8270dd" (UID: "e20cdef0-3262-4c1b-b699-e5a4ff8270dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:06.116421 master-0 kubenswrapper[18707]: I0320 09:07:06.116258 18707 scope.go:117] "RemoveContainer" containerID="f09fd9bfa4428d92d2bcebf97d69d708bd420c16ec239cddfa8f74a6bfb8e1bb" Mar 20 09:07:06.132211 master-0 kubenswrapper[18707]: I0320 09:07:06.119080 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20cdef0-3262-4c1b-b699-e5a4ff8270dd-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.146585 master-0 kubenswrapper[18707]: I0320 09:07:06.146464 18707 scope.go:117] "RemoveContainer" containerID="76619726d4ffcbe6fbf6e87609fdd164190939a0787706efd1f18d901c02960f" Mar 20 09:07:06.146974 master-0 kubenswrapper[18707]: E0320 09:07:06.146949 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76619726d4ffcbe6fbf6e87609fdd164190939a0787706efd1f18d901c02960f\": container with ID starting with 76619726d4ffcbe6fbf6e87609fdd164190939a0787706efd1f18d901c02960f not found: ID does not exist" containerID="76619726d4ffcbe6fbf6e87609fdd164190939a0787706efd1f18d901c02960f" Mar 20 09:07:06.147031 master-0 kubenswrapper[18707]: I0320 09:07:06.146981 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76619726d4ffcbe6fbf6e87609fdd164190939a0787706efd1f18d901c02960f"} err="failed to get container status \"76619726d4ffcbe6fbf6e87609fdd164190939a0787706efd1f18d901c02960f\": rpc error: code = NotFound desc = could not find container \"76619726d4ffcbe6fbf6e87609fdd164190939a0787706efd1f18d901c02960f\": container with ID starting with 76619726d4ffcbe6fbf6e87609fdd164190939a0787706efd1f18d901c02960f not found: ID does not exist" Mar 20 09:07:06.147031 master-0 kubenswrapper[18707]: I0320 09:07:06.147004 18707 scope.go:117] "RemoveContainer" containerID="f09fd9bfa4428d92d2bcebf97d69d708bd420c16ec239cddfa8f74a6bfb8e1bb" Mar 
20 09:07:06.147314 master-0 kubenswrapper[18707]: E0320 09:07:06.147288 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f09fd9bfa4428d92d2bcebf97d69d708bd420c16ec239cddfa8f74a6bfb8e1bb\": container with ID starting with f09fd9bfa4428d92d2bcebf97d69d708bd420c16ec239cddfa8f74a6bfb8e1bb not found: ID does not exist" containerID="f09fd9bfa4428d92d2bcebf97d69d708bd420c16ec239cddfa8f74a6bfb8e1bb" Mar 20 09:07:06.147369 master-0 kubenswrapper[18707]: I0320 09:07:06.147316 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f09fd9bfa4428d92d2bcebf97d69d708bd420c16ec239cddfa8f74a6bfb8e1bb"} err="failed to get container status \"f09fd9bfa4428d92d2bcebf97d69d708bd420c16ec239cddfa8f74a6bfb8e1bb\": rpc error: code = NotFound desc = could not find container \"f09fd9bfa4428d92d2bcebf97d69d708bd420c16ec239cddfa8f74a6bfb8e1bb\": container with ID starting with f09fd9bfa4428d92d2bcebf97d69d708bd420c16ec239cddfa8f74a6bfb8e1bb not found: ID does not exist" Mar 20 09:07:06.147406 master-0 kubenswrapper[18707]: I0320 09:07:06.147393 18707 scope.go:117] "RemoveContainer" containerID="2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b" Mar 20 09:07:06.164831 master-0 kubenswrapper[18707]: I0320 09:07:06.164774 18707 scope.go:117] "RemoveContainer" containerID="f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9" Mar 20 09:07:06.187612 master-0 kubenswrapper[18707]: I0320 09:07:06.187536 18707 scope.go:117] "RemoveContainer" containerID="2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b" Mar 20 09:07:06.188152 master-0 kubenswrapper[18707]: E0320 09:07:06.188118 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b\": container with ID starting with 
2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b not found: ID does not exist" containerID="2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b" Mar 20 09:07:06.188239 master-0 kubenswrapper[18707]: I0320 09:07:06.188149 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b"} err="failed to get container status \"2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b\": rpc error: code = NotFound desc = could not find container \"2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b\": container with ID starting with 2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b not found: ID does not exist" Mar 20 09:07:06.188239 master-0 kubenswrapper[18707]: I0320 09:07:06.188171 18707 scope.go:117] "RemoveContainer" containerID="f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9" Mar 20 09:07:06.188687 master-0 kubenswrapper[18707]: E0320 09:07:06.188639 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9\": container with ID starting with f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9 not found: ID does not exist" containerID="f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9" Mar 20 09:07:06.188772 master-0 kubenswrapper[18707]: I0320 09:07:06.188702 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9"} err="failed to get container status \"f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9\": rpc error: code = NotFound desc = could not find container \"f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9\": container with ID starting with 
f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9 not found: ID does not exist" Mar 20 09:07:06.188772 master-0 kubenswrapper[18707]: I0320 09:07:06.188740 18707 scope.go:117] "RemoveContainer" containerID="2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b" Mar 20 09:07:06.189156 master-0 kubenswrapper[18707]: I0320 09:07:06.189081 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b"} err="failed to get container status \"2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b\": rpc error: code = NotFound desc = could not find container \"2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b\": container with ID starting with 2c74a0b8911480fc7e9d5c9cca486c4f1a3e772041a9d4129ffd291507f54f2b not found: ID does not exist" Mar 20 09:07:06.189156 master-0 kubenswrapper[18707]: I0320 09:07:06.189105 18707 scope.go:117] "RemoveContainer" containerID="f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9" Mar 20 09:07:06.189825 master-0 kubenswrapper[18707]: I0320 09:07:06.189791 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9"} err="failed to get container status \"f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9\": rpc error: code = NotFound desc = could not find container \"f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9\": container with ID starting with f7db04b649c7e24f94368bbf61325cf4caf7244aba0e44cb9f7651d07c846df9 not found: ID does not exist" Mar 20 09:07:06.220470 master-0 kubenswrapper[18707]: I0320 09:07:06.220439 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-combined-ca-bundle\") pod 
\"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " Mar 20 09:07:06.221134 master-0 kubenswrapper[18707]: I0320 09:07:06.221102 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-config-data-custom\") pod \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " Mar 20 09:07:06.221447 master-0 kubenswrapper[18707]: I0320 09:07:06.221429 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-config-data\") pod \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " Mar 20 09:07:06.221662 master-0 kubenswrapper[18707]: I0320 09:07:06.221601 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pz6d\" (UniqueName: \"kubernetes.io/projected/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-kube-api-access-7pz6d\") pod \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " Mar 20 09:07:06.222853 master-0 kubenswrapper[18707]: I0320 09:07:06.222356 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" (UID: "0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:06.222994 master-0 kubenswrapper[18707]: I0320 09:07:06.222974 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-etc-machine-id\") pod \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " Mar 20 09:07:06.223194 master-0 kubenswrapper[18707]: I0320 09:07:06.223148 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-scripts\") pod \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\" (UID: \"0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf\") " Mar 20 09:07:06.224064 master-0 kubenswrapper[18707]: I0320 09:07:06.224027 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" (UID: "0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:06.224532 master-0 kubenswrapper[18707]: I0320 09:07:06.224507 18707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.224532 master-0 kubenswrapper[18707]: I0320 09:07:06.224530 18707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.225916 master-0 kubenswrapper[18707]: I0320 09:07:06.225842 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-kube-api-access-7pz6d" (OuterVolumeSpecName: "kube-api-access-7pz6d") pod "0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" (UID: "0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf"). InnerVolumeSpecName "kube-api-access-7pz6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:06.226679 master-0 kubenswrapper[18707]: I0320 09:07:06.226626 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-scripts" (OuterVolumeSpecName: "scripts") pod "0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" (UID: "0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:06.270643 master-0 kubenswrapper[18707]: I0320 09:07:06.270560 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" (UID: "0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:06.326841 master-0 kubenswrapper[18707]: I0320 09:07:06.326790 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.327101 master-0 kubenswrapper[18707]: I0320 09:07:06.327084 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pz6d\" (UniqueName: \"kubernetes.io/projected/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-kube-api-access-7pz6d\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.327254 master-0 kubenswrapper[18707]: I0320 09:07:06.327203 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.337079 master-0 kubenswrapper[18707]: I0320 09:07:06.337023 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-config-data" (OuterVolumeSpecName: "config-data") pod "0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" (UID: "0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:06.430013 master-0 kubenswrapper[18707]: I0320 09:07:06.429962 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:06.896863 master-0 kubenswrapper[18707]: I0320 09:07:06.896676 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bc775ffc9-jcrc5" podStartSLOduration=3.896648368 podStartE2EDuration="3.896648368s" podCreationTimestamp="2026-03-20 09:07:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:07:06.876308547 +0000 UTC m=+1572.032488913" watchObservedRunningTime="2026-03-20 09:07:06.896648368 +0000 UTC m=+1572.052828764" Mar 20 09:07:07.065772 master-0 kubenswrapper[18707]: I0320 09:07:07.056506 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.078551 master-0 kubenswrapper[18707]: I0320 09:07:07.076352 18707 generic.go:334] "Generic (PLEG): container finished" podID="bae03209-559e-4828-bcb6-70eb73ff61dc" containerID="7e7cea23c7318292592a7d106763c1c985071971334264d512fa9e9df258faa2" exitCode=0 Mar 20 09:07:07.078551 master-0 kubenswrapper[18707]: I0320 09:07:07.076448 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz" event={"ID":"bae03209-559e-4828-bcb6-70eb73ff61dc","Type":"ContainerDied","Data":"7e7cea23c7318292592a7d106763c1c985071971334264d512fa9e9df258faa2"} Mar 20 09:07:07.093264 master-0 kubenswrapper[18707]: I0320 09:07:07.090625 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c920a-backup-0"] Mar 20 09:07:07.224977 master-0 kubenswrapper[18707]: I0320 09:07:07.224864 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c920a-backup-0"] Mar 20 09:07:07.224977 master-0 kubenswrapper[18707]: I0320 09:07:07.224922 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c920a-backup-0"] Mar 20 09:07:07.225496 master-0 kubenswrapper[18707]: E0320 09:07:07.225458 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f5fb021-477c-4a7e-8f92-224e08645060" containerName="dnsmasq-dns" Mar 20 09:07:07.225496 master-0 kubenswrapper[18707]: I0320 09:07:07.225477 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f5fb021-477c-4a7e-8f92-224e08645060" containerName="dnsmasq-dns" Mar 20 09:07:07.225610 master-0 kubenswrapper[18707]: E0320 09:07:07.225515 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" containerName="probe" Mar 20 09:07:07.225610 master-0 kubenswrapper[18707]: I0320 09:07:07.225524 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" 
containerName="probe" Mar 20 09:07:07.225610 master-0 kubenswrapper[18707]: E0320 09:07:07.225555 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f5fb021-477c-4a7e-8f92-224e08645060" containerName="init" Mar 20 09:07:07.225610 master-0 kubenswrapper[18707]: I0320 09:07:07.225561 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f5fb021-477c-4a7e-8f92-224e08645060" containerName="init" Mar 20 09:07:07.225610 master-0 kubenswrapper[18707]: E0320 09:07:07.225595 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" containerName="cinder-scheduler" Mar 20 09:07:07.225610 master-0 kubenswrapper[18707]: I0320 09:07:07.225601 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" containerName="cinder-scheduler" Mar 20 09:07:07.225844 master-0 kubenswrapper[18707]: E0320 09:07:07.225622 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e20cdef0-3262-4c1b-b699-e5a4ff8270dd" containerName="cinder-backup" Mar 20 09:07:07.225844 master-0 kubenswrapper[18707]: I0320 09:07:07.225628 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20cdef0-3262-4c1b-b699-e5a4ff8270dd" containerName="cinder-backup" Mar 20 09:07:07.225844 master-0 kubenswrapper[18707]: E0320 09:07:07.225640 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e20cdef0-3262-4c1b-b699-e5a4ff8270dd" containerName="probe" Mar 20 09:07:07.225844 master-0 kubenswrapper[18707]: I0320 09:07:07.225646 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20cdef0-3262-4c1b-b699-e5a4ff8270dd" containerName="probe" Mar 20 09:07:07.226010 master-0 kubenswrapper[18707]: I0320 09:07:07.225880 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" containerName="cinder-scheduler" Mar 20 09:07:07.226010 master-0 kubenswrapper[18707]: I0320 09:07:07.225916 18707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e20cdef0-3262-4c1b-b699-e5a4ff8270dd" containerName="cinder-backup" Mar 20 09:07:07.226010 master-0 kubenswrapper[18707]: I0320 09:07:07.225925 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e20cdef0-3262-4c1b-b699-e5a4ff8270dd" containerName="probe" Mar 20 09:07:07.226010 master-0 kubenswrapper[18707]: I0320 09:07:07.225933 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" containerName="probe" Mar 20 09:07:07.226010 master-0 kubenswrapper[18707]: I0320 09:07:07.225949 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f5fb021-477c-4a7e-8f92-224e08645060" containerName="dnsmasq-dns" Mar 20 09:07:07.241726 master-0 kubenswrapper[18707]: I0320 09:07:07.241674 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-backup-0"] Mar 20 09:07:07.241939 master-0 kubenswrapper[18707]: I0320 09:07:07.241805 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.255837 master-0 kubenswrapper[18707]: I0320 09:07:07.255512 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-c920a-backup-config-data" Mar 20 09:07:07.259569 master-0 kubenswrapper[18707]: I0320 09:07:07.258847 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-api-0" event={"ID":"22dba0f1-9027-4e9e-857b-915b805e1265","Type":"ContainerStarted","Data":"d341cca8315a148a594179b8b765adf357fc4d527ebdab8e2f1a44d0eff756fe"} Mar 20 09:07:07.259569 master-0 kubenswrapper[18707]: I0320 09:07:07.258895 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-c920a-api-0" Mar 20 09:07:07.295215 master-0 kubenswrapper[18707]: I0320 09:07:07.295143 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c920a-scheduler-0"] Mar 20 09:07:07.330563 master-0 kubenswrapper[18707]: I0320 09:07:07.330497 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a3e0612-d529-446a-b38d-dfa7a9e41d27-scripts\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.330563 master-0 kubenswrapper[18707]: I0320 09:07:07.330553 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-etc-iscsi\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.330830 master-0 kubenswrapper[18707]: I0320 09:07:07.330594 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-run\") pod 
\"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.330830 master-0 kubenswrapper[18707]: I0320 09:07:07.330691 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3e0612-d529-446a-b38d-dfa7a9e41d27-config-data\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.330830 master-0 kubenswrapper[18707]: I0320 09:07:07.330734 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-var-locks-brick\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.330830 master-0 kubenswrapper[18707]: I0320 09:07:07.330762 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-dev\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.330830 master-0 kubenswrapper[18707]: I0320 09:07:07.330804 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-var-lib-cinder\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.330830 master-0 kubenswrapper[18707]: I0320 09:07:07.330826 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a3e0612-d529-446a-b38d-dfa7a9e41d27-combined-ca-bundle\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.331010 master-0 kubenswrapper[18707]: I0320 09:07:07.330864 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-sys\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.331010 master-0 kubenswrapper[18707]: I0320 09:07:07.330890 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-etc-machine-id\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.331010 master-0 kubenswrapper[18707]: I0320 09:07:07.330952 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx6qr\" (UniqueName: \"kubernetes.io/projected/6a3e0612-d529-446a-b38d-dfa7a9e41d27-kube-api-access-mx6qr\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.331010 master-0 kubenswrapper[18707]: I0320 09:07:07.330980 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-etc-nvme\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.331010 master-0 kubenswrapper[18707]: I0320 09:07:07.330998 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a3e0612-d529-446a-b38d-dfa7a9e41d27-config-data-custom\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.331252 master-0 kubenswrapper[18707]: I0320 09:07:07.331048 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-var-locks-cinder\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.331252 master-0 kubenswrapper[18707]: I0320 09:07:07.331078 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-lib-modules\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.359374 master-0 kubenswrapper[18707]: I0320 09:07:07.359309 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c920a-scheduler-0"] Mar 20 09:07:07.377816 master-0 kubenswrapper[18707]: I0320 09:07:07.375159 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c920a-scheduler-0"] Mar 20 09:07:07.382199 master-0 kubenswrapper[18707]: I0320 09:07:07.378627 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.382381 master-0 kubenswrapper[18707]: I0320 09:07:07.382333 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-c920a-scheduler-config-data" Mar 20 09:07:07.420306 master-0 kubenswrapper[18707]: I0320 09:07:07.420117 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-scheduler-0"] Mar 20 09:07:07.425074 master-0 kubenswrapper[18707]: I0320 09:07:07.425003 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-c920a-api-0" podStartSLOduration=6.424988376 podStartE2EDuration="6.424988376s" podCreationTimestamp="2026-03-20 09:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:07:07.328957222 +0000 UTC m=+1572.485137578" watchObservedRunningTime="2026-03-20 09:07:07.424988376 +0000 UTC m=+1572.581168732" Mar 20 09:07:07.432153 master-0 kubenswrapper[18707]: I0320 09:07:07.432100 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-etc-machine-id\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432312 master-0 kubenswrapper[18707]: I0320 09:07:07.432175 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx6qr\" (UniqueName: \"kubernetes.io/projected/6a3e0612-d529-446a-b38d-dfa7a9e41d27-kube-api-access-mx6qr\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432312 master-0 kubenswrapper[18707]: I0320 09:07:07.432228 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-etc-nvme\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432312 master-0 kubenswrapper[18707]: I0320 09:07:07.432247 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a3e0612-d529-446a-b38d-dfa7a9e41d27-config-data-custom\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432312 master-0 kubenswrapper[18707]: I0320 09:07:07.432276 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-var-locks-cinder\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432312 master-0 kubenswrapper[18707]: I0320 09:07:07.432301 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-lib-modules\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432468 master-0 kubenswrapper[18707]: I0320 09:07:07.432319 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-config-data-custom\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.432468 master-0 kubenswrapper[18707]: I0320 09:07:07.432372 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6a3e0612-d529-446a-b38d-dfa7a9e41d27-scripts\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432468 master-0 kubenswrapper[18707]: I0320 09:07:07.432391 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-etc-iscsi\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432468 master-0 kubenswrapper[18707]: I0320 09:07:07.432411 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-combined-ca-bundle\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.432468 master-0 kubenswrapper[18707]: I0320 09:07:07.432434 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-run\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432468 master-0 kubenswrapper[18707]: I0320 09:07:07.432456 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-scripts\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.432662 master-0 kubenswrapper[18707]: I0320 09:07:07.432483 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6a3e0612-d529-446a-b38d-dfa7a9e41d27-config-data\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432662 master-0 kubenswrapper[18707]: I0320 09:07:07.432513 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-var-locks-brick\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432662 master-0 kubenswrapper[18707]: I0320 09:07:07.432528 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-dev\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432662 master-0 kubenswrapper[18707]: I0320 09:07:07.432546 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-config-data\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.432662 master-0 kubenswrapper[18707]: I0320 09:07:07.432573 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-var-lib-cinder\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432662 master-0 kubenswrapper[18707]: I0320 09:07:07.432591 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a3e0612-d529-446a-b38d-dfa7a9e41d27-combined-ca-bundle\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432662 master-0 kubenswrapper[18707]: I0320 09:07:07.432608 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-etc-machine-id\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.432662 master-0 kubenswrapper[18707]: I0320 09:07:07.432626 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9llfn\" (UniqueName: \"kubernetes.io/projected/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-kube-api-access-9llfn\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.432662 master-0 kubenswrapper[18707]: I0320 09:07:07.432648 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-sys\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432921 master-0 kubenswrapper[18707]: I0320 09:07:07.432733 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-sys\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.432921 master-0 kubenswrapper[18707]: I0320 09:07:07.432772 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-etc-machine-id\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.433131 master-0 kubenswrapper[18707]: I0320 09:07:07.433100 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-etc-nvme\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.434660 master-0 kubenswrapper[18707]: I0320 09:07:07.434617 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-etc-iscsi\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.434731 master-0 kubenswrapper[18707]: I0320 09:07:07.434699 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-run\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.435212 master-0 kubenswrapper[18707]: I0320 09:07:07.435137 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-lib-modules\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.435317 master-0 kubenswrapper[18707]: I0320 09:07:07.435292 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-var-locks-cinder\") pod \"cinder-c920a-backup-0\" (UID: 
\"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.435367 master-0 kubenswrapper[18707]: I0320 09:07:07.435342 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-var-locks-brick\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.435437 master-0 kubenswrapper[18707]: I0320 09:07:07.435417 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-var-lib-cinder\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.435479 master-0 kubenswrapper[18707]: I0320 09:07:07.435456 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6a3e0612-d529-446a-b38d-dfa7a9e41d27-dev\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.440444 master-0 kubenswrapper[18707]: I0320 09:07:07.440403 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3e0612-d529-446a-b38d-dfa7a9e41d27-config-data\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.451252 master-0 kubenswrapper[18707]: I0320 09:07:07.451157 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a3e0612-d529-446a-b38d-dfa7a9e41d27-scripts\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.451362 master-0 kubenswrapper[18707]: 
I0320 09:07:07.451172 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3e0612-d529-446a-b38d-dfa7a9e41d27-combined-ca-bundle\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.451676 master-0 kubenswrapper[18707]: I0320 09:07:07.451629 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a3e0612-d529-446a-b38d-dfa7a9e41d27-config-data-custom\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.460125 master-0 kubenswrapper[18707]: I0320 09:07:07.458593 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx6qr\" (UniqueName: \"kubernetes.io/projected/6a3e0612-d529-446a-b38d-dfa7a9e41d27-kube-api-access-mx6qr\") pod \"cinder-c920a-backup-0\" (UID: \"6a3e0612-d529-446a-b38d-dfa7a9e41d27\") " pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.547668 master-0 kubenswrapper[18707]: I0320 09:07:07.547607 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-combined-ca-bundle\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.547903 master-0 kubenswrapper[18707]: I0320 09:07:07.547689 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-scripts\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.547903 master-0 kubenswrapper[18707]: I0320 09:07:07.547757 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-config-data\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.547903 master-0 kubenswrapper[18707]: I0320 09:07:07.547803 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-etc-machine-id\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.547903 master-0 kubenswrapper[18707]: I0320 09:07:07.547832 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9llfn\" (UniqueName: \"kubernetes.io/projected/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-kube-api-access-9llfn\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.548043 master-0 kubenswrapper[18707]: I0320 09:07:07.547952 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-config-data-custom\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.551113 master-0 kubenswrapper[18707]: I0320 09:07:07.549267 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-etc-machine-id\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.551497 master-0 kubenswrapper[18707]: I0320 09:07:07.551462 18707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-config-data-custom\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.566334 master-0 kubenswrapper[18707]: I0320 09:07:07.565088 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9llfn\" (UniqueName: \"kubernetes.io/projected/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-kube-api-access-9llfn\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.570956 master-0 kubenswrapper[18707]: I0320 09:07:07.569368 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-combined-ca-bundle\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.572892 master-0 kubenswrapper[18707]: I0320 09:07:07.572844 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-scripts\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.575334 master-0 kubenswrapper[18707]: I0320 09:07:07.575280 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e85ce06-e9d6-430b-af36-fc1cab6fbc1c-config-data\") pod \"cinder-c920a-scheduler-0\" (UID: \"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c\") " pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.605296 master-0 kubenswrapper[18707]: I0320 09:07:07.605233 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:07.631639 master-0 kubenswrapper[18707]: I0320 09:07:07.631587 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:07.703850 master-0 kubenswrapper[18707]: I0320 09:07:07.703631 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:07.736504 master-0 kubenswrapper[18707]: I0320 09:07:07.735929 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c920a-volume-lvm-iscsi-0"] Mar 20 09:07:08.253885 master-0 kubenswrapper[18707]: W0320 09:07:08.252447 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a3e0612_d529_446a_b38d_dfa7a9e41d27.slice/crio-b1ddb94188c8e44b90a34d21b33c5595ee0bad1b12eb08f91d193f12e79c854e WatchSource:0}: Error finding container b1ddb94188c8e44b90a34d21b33c5595ee0bad1b12eb08f91d193f12e79c854e: Status 404 returned error can't find the container with id b1ddb94188c8e44b90a34d21b33c5595ee0bad1b12eb08f91d193f12e79c854e Mar 20 09:07:08.255955 master-0 kubenswrapper[18707]: I0320 09:07:08.255900 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-backup-0"] Mar 20 09:07:08.304279 master-0 kubenswrapper[18707]: I0320 09:07:08.304212 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-scheduler-0"] Mar 20 09:07:08.314708 master-0 kubenswrapper[18707]: I0320 09:07:08.314627 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" podUID="dd73e9d3-234d-4470-b5a4-9abf382a12d7" containerName="cinder-volume" containerID="cri-o://ae05bc92d214153ce18f48c52230166a2e389f20f1e35ff98414dcb3986ae81e" gracePeriod=30 Mar 20 09:07:08.314788 master-0 kubenswrapper[18707]: I0320 09:07:08.314774 18707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-backup-0" event={"ID":"6a3e0612-d529-446a-b38d-dfa7a9e41d27","Type":"ContainerStarted","Data":"b1ddb94188c8e44b90a34d21b33c5595ee0bad1b12eb08f91d193f12e79c854e"} Mar 20 09:07:08.315313 master-0 kubenswrapper[18707]: I0320 09:07:08.315232 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" podUID="dd73e9d3-234d-4470-b5a4-9abf382a12d7" containerName="probe" containerID="cri-o://6f8663594828bb76bc1343794734214f71dce271486b5a2b4189d57b43da0130" gracePeriod=30 Mar 20 09:07:08.318635 master-0 kubenswrapper[18707]: W0320 09:07:08.318569 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e85ce06_e9d6_430b_af36_fc1cab6fbc1c.slice/crio-a745ef90991eeb54ff4fec15ef5fa03ec27b11aef1d44bc12b1ade9f3299ef09 WatchSource:0}: Error finding container a745ef90991eeb54ff4fec15ef5fa03ec27b11aef1d44bc12b1ade9f3299ef09: Status 404 returned error can't find the container with id a745ef90991eeb54ff4fec15ef5fa03ec27b11aef1d44bc12b1ade9f3299ef09 Mar 20 09:07:08.835295 master-0 kubenswrapper[18707]: I0320 09:07:08.832745 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74db7c7d5f-t4vkw" podUID="6f5fb021-477c-4a7e-8f92-224e08645060" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.214:5353: i/o timeout" Mar 20 09:07:09.114285 master-0 kubenswrapper[18707]: I0320 09:07:09.113826 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf" path="/var/lib/kubelet/pods/0ae8e4bf-a2f7-4104-ba9c-0eafa246bfbf/volumes" Mar 20 09:07:09.116310 master-0 kubenswrapper[18707]: I0320 09:07:09.115563 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e20cdef0-3262-4c1b-b699-e5a4ff8270dd" 
path="/var/lib/kubelet/pods/e20cdef0-3262-4c1b-b699-e5a4ff8270dd/volumes" Mar 20 09:07:09.349553 master-0 kubenswrapper[18707]: I0320 09:07:09.349245 18707 generic.go:334] "Generic (PLEG): container finished" podID="dd73e9d3-234d-4470-b5a4-9abf382a12d7" containerID="6f8663594828bb76bc1343794734214f71dce271486b5a2b4189d57b43da0130" exitCode=0 Mar 20 09:07:09.349553 master-0 kubenswrapper[18707]: I0320 09:07:09.349288 18707 generic.go:334] "Generic (PLEG): container finished" podID="dd73e9d3-234d-4470-b5a4-9abf382a12d7" containerID="ae05bc92d214153ce18f48c52230166a2e389f20f1e35ff98414dcb3986ae81e" exitCode=0 Mar 20 09:07:09.349553 master-0 kubenswrapper[18707]: I0320 09:07:09.349333 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" event={"ID":"dd73e9d3-234d-4470-b5a4-9abf382a12d7","Type":"ContainerDied","Data":"6f8663594828bb76bc1343794734214f71dce271486b5a2b4189d57b43da0130"} Mar 20 09:07:09.349553 master-0 kubenswrapper[18707]: I0320 09:07:09.349362 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" event={"ID":"dd73e9d3-234d-4470-b5a4-9abf382a12d7","Type":"ContainerDied","Data":"ae05bc92d214153ce18f48c52230166a2e389f20f1e35ff98414dcb3986ae81e"} Mar 20 09:07:09.376580 master-0 kubenswrapper[18707]: I0320 09:07:09.376479 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-backup-0" event={"ID":"6a3e0612-d529-446a-b38d-dfa7a9e41d27","Type":"ContainerStarted","Data":"6e0808cc42fcf46838d1a11f0ae7c760ee1e12758851170cc391adcd406f84b3"} Mar 20 09:07:09.376580 master-0 kubenswrapper[18707]: I0320 09:07:09.376538 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-backup-0" event={"ID":"6a3e0612-d529-446a-b38d-dfa7a9e41d27","Type":"ContainerStarted","Data":"d01eb0c37726de0b27ebde360e13a871db222026d2428719a7fb1428f12223dc"} Mar 20 09:07:09.415438 master-0 kubenswrapper[18707]: I0320 09:07:09.415376 18707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-scheduler-0" event={"ID":"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c","Type":"ContainerStarted","Data":"202ecd0581803f2e0a01f6a536ebbd08bbd9fa93de2c70f8f216fdaade75555e"} Mar 20 09:07:09.415438 master-0 kubenswrapper[18707]: I0320 09:07:09.415449 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-scheduler-0" event={"ID":"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c","Type":"ContainerStarted","Data":"a745ef90991eeb54ff4fec15ef5fa03ec27b11aef1d44bc12b1ade9f3299ef09"} Mar 20 09:07:09.493760 master-0 kubenswrapper[18707]: I0320 09:07:09.470954 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-c920a-backup-0" podStartSLOduration=2.470931972 podStartE2EDuration="2.470931972s" podCreationTimestamp="2026-03-20 09:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:07:09.457553459 +0000 UTC m=+1574.613733815" watchObservedRunningTime="2026-03-20 09:07:09.470931972 +0000 UTC m=+1574.627112328" Mar 20 09:07:10.362485 master-0 kubenswrapper[18707]: I0320 09:07:10.362435 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:10.470369 master-0 kubenswrapper[18707]: I0320 09:07:10.469496 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-locks-cinder\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.470766 master-0 kubenswrapper[18707]: I0320 09:07:10.470729 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-locks-brick\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.470897 master-0 kubenswrapper[18707]: I0320 09:07:10.470884 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-scripts\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.471040 master-0 kubenswrapper[18707]: I0320 09:07:10.471025 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-combined-ca-bundle\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.471159 master-0 kubenswrapper[18707]: I0320 09:07:10.471146 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-config-data\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.471263 master-0 kubenswrapper[18707]: I0320 09:07:10.471251 18707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-lib-modules\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.471390 master-0 kubenswrapper[18707]: I0320 09:07:10.471378 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-sys\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.471465 master-0 kubenswrapper[18707]: I0320 09:07:10.471454 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-lib-cinder\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.471544 master-0 kubenswrapper[18707]: I0320 09:07:10.471533 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-dev\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.471616 master-0 kubenswrapper[18707]: I0320 09:07:10.469585 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "var-locks-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:10.471659 master-0 kubenswrapper[18707]: I0320 09:07:10.471069 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:10.471659 master-0 kubenswrapper[18707]: I0320 09:07:10.471648 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-sys" (OuterVolumeSpecName: "sys") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:10.471721 master-0 kubenswrapper[18707]: I0320 09:07:10.471672 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:10.471721 master-0 kubenswrapper[18707]: I0320 09:07:10.471695 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:10.471838 master-0 kubenswrapper[18707]: I0320 09:07:10.471824 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-iscsi\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.473425 master-0 kubenswrapper[18707]: I0320 09:07:10.471948 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-nvme\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.473586 master-0 kubenswrapper[18707]: I0320 09:07:10.473565 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-878x5\" (UniqueName: \"kubernetes.io/projected/dd73e9d3-234d-4470-b5a4-9abf382a12d7-kube-api-access-878x5\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.473689 master-0 kubenswrapper[18707]: I0320 09:07:10.473676 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-run\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.473794 master-0 kubenswrapper[18707]: I0320 09:07:10.473781 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-machine-id\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.473904 master-0 kubenswrapper[18707]: I0320 09:07:10.473890 18707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-config-data-custom\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.475923 master-0 kubenswrapper[18707]: I0320 09:07:10.472001 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-dev" (OuterVolumeSpecName: "dev") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:10.475923 master-0 kubenswrapper[18707]: I0320 09:07:10.472024 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:10.475923 master-0 kubenswrapper[18707]: I0320 09:07:10.472048 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:10.475923 master-0 kubenswrapper[18707]: I0320 09:07:10.475651 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-run" (OuterVolumeSpecName: "run") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:10.475923 master-0 kubenswrapper[18707]: I0320 09:07:10.475877 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:07:10.476683 master-0 kubenswrapper[18707]: I0320 09:07:10.476628 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-scripts" (OuterVolumeSpecName: "scripts") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:10.476753 master-0 kubenswrapper[18707]: I0320 09:07:10.476721 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:10.476867 master-0 kubenswrapper[18707]: I0320 09:07:10.476777 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" event={"ID":"dd73e9d3-234d-4470-b5a4-9abf382a12d7","Type":"ContainerDied","Data":"6ae22b9f66da8d403dd2fa33209f3211afecbc48a42e679878c7895133fd3b4f"} Mar 20 09:07:10.476910 master-0 kubenswrapper[18707]: I0320 09:07:10.476894 18707 scope.go:117] "RemoveContainer" containerID="6f8663594828bb76bc1343794734214f71dce271486b5a2b4189d57b43da0130" Mar 20 09:07:10.482138 master-0 kubenswrapper[18707]: I0320 09:07:10.482091 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:10.484935 master-0 kubenswrapper[18707]: I0320 09:07:10.484360 18707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:10.485911 master-0 kubenswrapper[18707]: I0320 09:07:10.485882 18707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:10.486149 master-0 kubenswrapper[18707]: I0320 09:07:10.486137 18707 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:10.486302 master-0 kubenswrapper[18707]: I0320 09:07:10.486284 18707 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:10.486380 master-0 kubenswrapper[18707]: I0320 09:07:10.486368 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:10.486466 master-0 kubenswrapper[18707]: I0320 09:07:10.486453 18707 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-lib-modules\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:10.486557 master-0 kubenswrapper[18707]: I0320 09:07:10.486542 18707 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-sys\") on node 
\"master-0\" DevicePath \"\"" Mar 20 09:07:10.486643 master-0 kubenswrapper[18707]: I0320 09:07:10.486629 18707 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:10.486720 master-0 kubenswrapper[18707]: I0320 09:07:10.486707 18707 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-dev\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:10.486806 master-0 kubenswrapper[18707]: I0320 09:07:10.486794 18707 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:10.486881 master-0 kubenswrapper[18707]: I0320 09:07:10.486867 18707 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-etc-nvme\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:10.486968 master-0 kubenswrapper[18707]: I0320 09:07:10.486956 18707 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dd73e9d3-234d-4470-b5a4-9abf382a12d7-run\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:10.488619 master-0 kubenswrapper[18707]: I0320 09:07:10.488112 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd73e9d3-234d-4470-b5a4-9abf382a12d7-kube-api-access-878x5" (OuterVolumeSpecName: "kube-api-access-878x5") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "kube-api-access-878x5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:10.591026 master-0 kubenswrapper[18707]: I0320 09:07:10.589677 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-878x5\" (UniqueName: \"kubernetes.io/projected/dd73e9d3-234d-4470-b5a4-9abf382a12d7-kube-api-access-878x5\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:10.617060 master-0 kubenswrapper[18707]: I0320 09:07:10.616368 18707 scope.go:117] "RemoveContainer" containerID="ae05bc92d214153ce18f48c52230166a2e389f20f1e35ff98414dcb3986ae81e" Mar 20 09:07:10.910211 master-0 kubenswrapper[18707]: I0320 09:07:10.909383 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:10.916703 master-0 kubenswrapper[18707]: I0320 09:07:10.916414 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-combined-ca-bundle\") pod \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\" (UID: \"dd73e9d3-234d-4470-b5a4-9abf382a12d7\") " Mar 20 09:07:10.921209 master-0 kubenswrapper[18707]: W0320 09:07:10.919924 18707 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/dd73e9d3-234d-4470-b5a4-9abf382a12d7/volumes/kubernetes.io~secret/combined-ca-bundle Mar 20 09:07:10.921209 master-0 kubenswrapper[18707]: I0320 09:07:10.919970 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: 
"dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:10.937092 master-0 kubenswrapper[18707]: I0320 09:07:10.933553 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:10.937709 master-0 kubenswrapper[18707]: I0320 09:07:10.937634 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-config-data" (OuterVolumeSpecName: "config-data") pod "dd73e9d3-234d-4470-b5a4-9abf382a12d7" (UID: "dd73e9d3-234d-4470-b5a4-9abf382a12d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:11.037495 master-0 kubenswrapper[18707]: I0320 09:07:11.037417 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd73e9d3-234d-4470-b5a4-9abf382a12d7-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:11.513843 master-0 kubenswrapper[18707]: I0320 09:07:11.513789 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz" event={"ID":"bae03209-559e-4828-bcb6-70eb73ff61dc","Type":"ContainerStarted","Data":"ec656880600d950a9ab66c0cf523cb4ea3121c5a64decd1830a980cbbedb372e"} Mar 20 09:07:11.527287 master-0 kubenswrapper[18707]: I0320 09:07:11.521077 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-scheduler-0" event={"ID":"4e85ce06-e9d6-430b-af36-fc1cab6fbc1c","Type":"ContainerStarted","Data":"24bd90f0d7fa5446af65ec297e19dc3bf82dcae132389b9e8c667c50252116b9"} Mar 20 09:07:11.990495 master-0 kubenswrapper[18707]: I0320 09:07:11.990444 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-c920a-volume-lvm-iscsi-0"] Mar 20 09:07:12.267469 master-0 kubenswrapper[18707]: I0320 09:07:12.267377 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c920a-volume-lvm-iscsi-0"] Mar 20 09:07:12.551803 master-0 kubenswrapper[18707]: I0320 09:07:12.551677 18707 generic.go:334] "Generic (PLEG): container finished" podID="4d44d4b7-ce01-4aa4-9155-8338ad17b404" containerID="ace777ac4645bf819154a3e7c1bf57c4aea08bbc52fca5b1847b2780c95798bc" exitCode=0 Mar 20 09:07:12.551803 master-0 kubenswrapper[18707]: I0320 09:07:12.551766 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" event={"ID":"4d44d4b7-ce01-4aa4-9155-8338ad17b404","Type":"ContainerDied","Data":"ace777ac4645bf819154a3e7c1bf57c4aea08bbc52fca5b1847b2780c95798bc"} Mar 20 09:07:12.557596 master-0 kubenswrapper[18707]: I0320 09:07:12.557532 18707 generic.go:334] "Generic (PLEG): container finished" podID="bae03209-559e-4828-bcb6-70eb73ff61dc" containerID="ec656880600d950a9ab66c0cf523cb4ea3121c5a64decd1830a980cbbedb372e" exitCode=0 Mar 20 09:07:12.558946 master-0 kubenswrapper[18707]: I0320 09:07:12.558911 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz" event={"ID":"bae03209-559e-4828-bcb6-70eb73ff61dc","Type":"ContainerDied","Data":"ec656880600d950a9ab66c0cf523cb4ea3121c5a64decd1830a980cbbedb372e"} Mar 20 09:07:12.605510 master-0 kubenswrapper[18707]: I0320 09:07:12.605468 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:12.642437 master-0 kubenswrapper[18707]: I0320 09:07:12.641993 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-c920a-scheduler-0" podStartSLOduration=5.641955617 podStartE2EDuration="5.641955617s" podCreationTimestamp="2026-03-20 09:07:07 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:07:12.542926047 +0000 UTC m=+1577.699106403" watchObservedRunningTime="2026-03-20 09:07:12.641955617 +0000 UTC m=+1577.798135973" Mar 20 09:07:12.647733 master-0 kubenswrapper[18707]: I0320 09:07:12.647670 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c920a-volume-lvm-iscsi-0"] Mar 20 09:07:12.648202 master-0 kubenswrapper[18707]: E0320 09:07:12.648159 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd73e9d3-234d-4470-b5a4-9abf382a12d7" containerName="probe" Mar 20 09:07:12.654383 master-0 kubenswrapper[18707]: I0320 09:07:12.648177 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd73e9d3-234d-4470-b5a4-9abf382a12d7" containerName="probe" Mar 20 09:07:12.654790 master-0 kubenswrapper[18707]: E0320 09:07:12.654690 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd73e9d3-234d-4470-b5a4-9abf382a12d7" containerName="cinder-volume" Mar 20 09:07:12.654790 master-0 kubenswrapper[18707]: I0320 09:07:12.654751 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd73e9d3-234d-4470-b5a4-9abf382a12d7" containerName="cinder-volume" Mar 20 09:07:12.655406 master-0 kubenswrapper[18707]: I0320 09:07:12.655374 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd73e9d3-234d-4470-b5a4-9abf382a12d7" containerName="cinder-volume" Mar 20 09:07:12.655406 master-0 kubenswrapper[18707]: I0320 09:07:12.655397 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd73e9d3-234d-4470-b5a4-9abf382a12d7" containerName="probe" Mar 20 09:07:12.658842 master-0 kubenswrapper[18707]: I0320 09:07:12.658777 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.662693 master-0 kubenswrapper[18707]: I0320 09:07:12.662445 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-c920a-volume-lvm-iscsi-config-data" Mar 20 09:07:12.705649 master-0 kubenswrapper[18707]: I0320 09:07:12.705517 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:12.710633 master-0 kubenswrapper[18707]: I0320 09:07:12.710529 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcbnj\" (UniqueName: \"kubernetes.io/projected/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-kube-api-access-gcbnj\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.710782 master-0 kubenswrapper[18707]: I0320 09:07:12.710662 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-etc-iscsi\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.710782 master-0 kubenswrapper[18707]: I0320 09:07:12.710702 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-config-data\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.710782 master-0 kubenswrapper[18707]: I0320 09:07:12.710726 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-combined-ca-bundle\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.710782 master-0 kubenswrapper[18707]: I0320 09:07:12.710750 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-etc-machine-id\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.710950 master-0 kubenswrapper[18707]: I0320 09:07:12.710785 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-run\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.710950 master-0 kubenswrapper[18707]: I0320 09:07:12.710821 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-etc-nvme\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.710950 master-0 kubenswrapper[18707]: I0320 09:07:12.710846 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-lib-modules\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.710950 master-0 kubenswrapper[18707]: I0320 09:07:12.710871 18707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-scripts\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.711118 master-0 kubenswrapper[18707]: I0320 09:07:12.710953 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-sys\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.711118 master-0 kubenswrapper[18707]: I0320 09:07:12.711024 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-var-locks-cinder\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.711118 master-0 kubenswrapper[18707]: I0320 09:07:12.711064 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-var-lib-cinder\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.711118 master-0 kubenswrapper[18707]: I0320 09:07:12.711087 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-config-data-custom\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " 
pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.711300 master-0 kubenswrapper[18707]: I0320 09:07:12.711197 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-dev\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.711300 master-0 kubenswrapper[18707]: I0320 09:07:12.711225 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-var-locks-brick\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.813515 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-dev\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.813571 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-var-locks-brick\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.813601 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcbnj\" (UniqueName: \"kubernetes.io/projected/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-kube-api-access-gcbnj\") pod 
\"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816214 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-dev\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816356 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-etc-iscsi\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816396 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-config-data\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816416 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-combined-ca-bundle\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816460 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-etc-machine-id\") 
pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816487 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-run\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816490 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-var-locks-brick\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816542 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-etc-nvme\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816600 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-etc-machine-id\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816623 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-run\") pod 
\"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816670 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-etc-nvme\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816711 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-lib-modules\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816745 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-scripts\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816742 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-etc-iscsi\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816816 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-lib-modules\") pod 
\"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816881 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-sys\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.816988 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-var-locks-cinder\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.817019 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-var-lib-cinder\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.817041 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-config-data-custom\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.817726 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-var-locks-cinder\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.817764 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-sys\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.825249 master-0 kubenswrapper[18707]: I0320 09:07:12.819205 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-var-lib-cinder\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.830224 master-0 kubenswrapper[18707]: I0320 09:07:12.827607 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-config-data-custom\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.830224 master-0 kubenswrapper[18707]: I0320 09:07:12.828272 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-config-data\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.835574 master-0 kubenswrapper[18707]: I0320 09:07:12.835472 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-scripts\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.836167 master-0 kubenswrapper[18707]: I0320 09:07:12.836087 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz" podStartSLOduration=13.590477347 podStartE2EDuration="16.836073394s" podCreationTimestamp="2026-03-20 09:06:56 +0000 UTC" firstStartedPulling="2026-03-20 09:07:07.09024081 +0000 UTC m=+1572.246421166" lastFinishedPulling="2026-03-20 09:07:10.335836857 +0000 UTC m=+1575.492017213" observedRunningTime="2026-03-20 09:07:12.789160433 +0000 UTC m=+1577.945340819" watchObservedRunningTime="2026-03-20 09:07:12.836073394 +0000 UTC m=+1577.992253750" Mar 20 09:07:12.856220 master-0 kubenswrapper[18707]: I0320 09:07:12.853523 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-combined-ca-bundle\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.856220 master-0 kubenswrapper[18707]: I0320 09:07:12.853596 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcbnj\" (UniqueName: \"kubernetes.io/projected/237b7604-dd48-43fe-b94f-9bb8ad0a68e4-kube-api-access-gcbnj\") pod \"cinder-c920a-volume-lvm-iscsi-0\" (UID: \"237b7604-dd48-43fe-b94f-9bb8ad0a68e4\") " pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:12.869208 master-0 kubenswrapper[18707]: I0320 09:07:12.865372 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-volume-lvm-iscsi-0"] Mar 20 09:07:13.014226 master-0 kubenswrapper[18707]: I0320 09:07:13.013627 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:13.159099 master-0 kubenswrapper[18707]: I0320 09:07:13.158949 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd73e9d3-234d-4470-b5a4-9abf382a12d7" path="/var/lib/kubelet/pods/dd73e9d3-234d-4470-b5a4-9abf382a12d7/volumes" Mar 20 09:07:13.587454 master-0 kubenswrapper[18707]: I0320 09:07:13.587389 18707 generic.go:334] "Generic (PLEG): container finished" podID="4d44d4b7-ce01-4aa4-9155-8338ad17b404" containerID="38f7b6ca8042ae518fe70e8a22137198f9493b653bb668a681f315aeade3b616" exitCode=0 Mar 20 09:07:13.588347 master-0 kubenswrapper[18707]: I0320 09:07:13.588268 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" event={"ID":"4d44d4b7-ce01-4aa4-9155-8338ad17b404","Type":"ContainerDied","Data":"38f7b6ca8042ae518fe70e8a22137198f9493b653bb668a681f315aeade3b616"} Mar 20 09:07:13.687428 master-0 kubenswrapper[18707]: I0320 09:07:13.687368 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c920a-volume-lvm-iscsi-0"] Mar 20 09:07:13.691787 master-0 kubenswrapper[18707]: W0320 09:07:13.691729 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod237b7604_dd48_43fe_b94f_9bb8ad0a68e4.slice/crio-1648a96022fcbf77839685295cd0f2ac44f2a0290e67d904876b3658c755ca77 WatchSource:0}: Error finding container 1648a96022fcbf77839685295cd0f2ac44f2a0290e67d904876b3658c755ca77: Status 404 returned error can't find the container with id 1648a96022fcbf77839685295cd0f2ac44f2a0290e67d904876b3658c755ca77 Mar 20 09:07:14.224662 master-0 kubenswrapper[18707]: I0320 09:07:14.224147 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:07:14.285074 master-0 kubenswrapper[18707]: I0320 09:07:14.284989 18707 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz" Mar 20 09:07:14.287097 master-0 kubenswrapper[18707]: I0320 09:07:14.286514 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:07:14.401927 master-0 kubenswrapper[18707]: I0320 09:07:14.399643 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/bae03209-559e-4828-bcb6-70eb73ff61dc-image-data\") pod \"bae03209-559e-4828-bcb6-70eb73ff61dc\" (UID: \"bae03209-559e-4828-bcb6-70eb73ff61dc\") " Mar 20 09:07:14.401927 master-0 kubenswrapper[18707]: I0320 09:07:14.399811 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dhhn\" (UniqueName: \"kubernetes.io/projected/bae03209-559e-4828-bcb6-70eb73ff61dc-kube-api-access-8dhhn\") pod \"bae03209-559e-4828-bcb6-70eb73ff61dc\" (UID: \"bae03209-559e-4828-bcb6-70eb73ff61dc\") " Mar 20 09:07:14.418674 master-0 kubenswrapper[18707]: I0320 09:07:14.418599 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae03209-559e-4828-bcb6-70eb73ff61dc-kube-api-access-8dhhn" (OuterVolumeSpecName: "kube-api-access-8dhhn") pod "bae03209-559e-4828-bcb6-70eb73ff61dc" (UID: "bae03209-559e-4828-bcb6-70eb73ff61dc"). InnerVolumeSpecName "kube-api-access-8dhhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:14.507275 master-0 kubenswrapper[18707]: I0320 09:07:14.507153 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dhhn\" (UniqueName: \"kubernetes.io/projected/bae03209-559e-4828-bcb6-70eb73ff61dc-kube-api-access-8dhhn\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:14.550429 master-0 kubenswrapper[18707]: I0320 09:07:14.549575 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bae03209-559e-4828-bcb6-70eb73ff61dc-image-data" (OuterVolumeSpecName: "image-data") pod "bae03209-559e-4828-bcb6-70eb73ff61dc" (UID: "bae03209-559e-4828-bcb6-70eb73ff61dc"). InnerVolumeSpecName "image-data". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:07:14.599340 master-0 kubenswrapper[18707]: I0320 09:07:14.599087 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7754cdbcb6-lblgz"] Mar 20 09:07:14.600110 master-0 kubenswrapper[18707]: E0320 09:07:14.599981 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae03209-559e-4828-bcb6-70eb73ff61dc" containerName="init" Mar 20 09:07:14.600110 master-0 kubenswrapper[18707]: I0320 09:07:14.600001 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae03209-559e-4828-bcb6-70eb73ff61dc" containerName="init" Mar 20 09:07:14.600110 master-0 kubenswrapper[18707]: E0320 09:07:14.600064 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae03209-559e-4828-bcb6-70eb73ff61dc" containerName="edpm-b-provisionserver-checksum-discovery" Mar 20 09:07:14.600110 master-0 kubenswrapper[18707]: I0320 09:07:14.600074 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae03209-559e-4828-bcb6-70eb73ff61dc" containerName="edpm-b-provisionserver-checksum-discovery" Mar 20 09:07:14.600393 master-0 kubenswrapper[18707]: I0320 09:07:14.600296 18707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bae03209-559e-4828-bcb6-70eb73ff61dc" containerName="edpm-b-provisionserver-checksum-discovery" Mar 20 09:07:14.608064 master-0 kubenswrapper[18707]: I0320 09:07:14.608018 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.609405 master-0 kubenswrapper[18707]: I0320 09:07:14.609341 18707 reconciler_common.go:293] "Volume detached for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/bae03209-559e-4828-bcb6-70eb73ff61dc-image-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:14.644813 master-0 kubenswrapper[18707]: I0320 09:07:14.643592 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7754cdbcb6-lblgz"] Mar 20 09:07:14.652022 master-0 kubenswrapper[18707]: I0320 09:07:14.651972 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz" Mar 20 09:07:14.652958 master-0 kubenswrapper[18707]: I0320 09:07:14.651966 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/edpm-b-provisionserver-checksum-discovery-mf2jz" event={"ID":"bae03209-559e-4828-bcb6-70eb73ff61dc","Type":"ContainerDied","Data":"05f75b7b8030eb42acaa78a590dc6a2e04425c7e8739ca2e01fff989fd0fd366"} Mar 20 09:07:14.653026 master-0 kubenswrapper[18707]: I0320 09:07:14.652974 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05f75b7b8030eb42acaa78a590dc6a2e04425c7e8739ca2e01fff989fd0fd366" Mar 20 09:07:14.658152 master-0 kubenswrapper[18707]: I0320 09:07:14.658105 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" event={"ID":"237b7604-dd48-43fe-b94f-9bb8ad0a68e4","Type":"ContainerStarted","Data":"7d9ed7268285574f545cf5084d75124f6e587bc3e717149d9bd6d07d7196c3d1"} Mar 20 09:07:14.658340 master-0 kubenswrapper[18707]: I0320 09:07:14.658321 18707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" event={"ID":"237b7604-dd48-43fe-b94f-9bb8ad0a68e4","Type":"ContainerStarted","Data":"9a437056856ee765b4a2c0bddef5ec922f5f4cc6f169a140f447527446cb0c9c"} Mar 20 09:07:14.658430 master-0 kubenswrapper[18707]: I0320 09:07:14.658416 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" event={"ID":"237b7604-dd48-43fe-b94f-9bb8ad0a68e4","Type":"ContainerStarted","Data":"1648a96022fcbf77839685295cd0f2ac44f2a0290e67d904876b3658c755ca77"} Mar 20 09:07:14.698006 master-0 kubenswrapper[18707]: I0320 09:07:14.697601 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" podStartSLOduration=2.697581388 podStartE2EDuration="2.697581388s" podCreationTimestamp="2026-03-20 09:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:07:14.692366929 +0000 UTC m=+1579.848547285" watchObservedRunningTime="2026-03-20 09:07:14.697581388 +0000 UTC m=+1579.853761744" Mar 20 09:07:14.710939 master-0 kubenswrapper[18707]: I0320 09:07:14.710876 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-internal-tls-certs\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.710939 master-0 kubenswrapper[18707]: I0320 09:07:14.710930 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/549b0bdb-08b2-4ed7-9335-81637a54c925-logs\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.711298 master-0 
kubenswrapper[18707]: I0320 09:07:14.710983 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-scripts\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.711298 master-0 kubenswrapper[18707]: I0320 09:07:14.711060 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-config-data\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.711298 master-0 kubenswrapper[18707]: I0320 09:07:14.711101 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x99ll\" (UniqueName: \"kubernetes.io/projected/549b0bdb-08b2-4ed7-9335-81637a54c925-kube-api-access-x99ll\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.711298 master-0 kubenswrapper[18707]: I0320 09:07:14.711137 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-combined-ca-bundle\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.711298 master-0 kubenswrapper[18707]: I0320 09:07:14.711153 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-public-tls-certs\") pod \"placement-7754cdbcb6-lblgz\" (UID: 
\"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.846492 master-0 kubenswrapper[18707]: I0320 09:07:14.845642 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-config-data\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.846492 master-0 kubenswrapper[18707]: I0320 09:07:14.845743 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x99ll\" (UniqueName: \"kubernetes.io/projected/549b0bdb-08b2-4ed7-9335-81637a54c925-kube-api-access-x99ll\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.849232 master-0 kubenswrapper[18707]: I0320 09:07:14.846841 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-combined-ca-bundle\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.849232 master-0 kubenswrapper[18707]: I0320 09:07:14.846905 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-public-tls-certs\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.849232 master-0 kubenswrapper[18707]: I0320 09:07:14.847238 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-internal-tls-certs\") pod 
\"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.849232 master-0 kubenswrapper[18707]: I0320 09:07:14.847334 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/549b0bdb-08b2-4ed7-9335-81637a54c925-logs\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.849232 master-0 kubenswrapper[18707]: I0320 09:07:14.847641 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/549b0bdb-08b2-4ed7-9335-81637a54c925-logs\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.849232 master-0 kubenswrapper[18707]: I0320 09:07:14.847838 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-scripts\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.851925 master-0 kubenswrapper[18707]: I0320 09:07:14.851853 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-public-tls-certs\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.859310 master-0 kubenswrapper[18707]: I0320 09:07:14.855437 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-scripts\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " 
pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.859310 master-0 kubenswrapper[18707]: I0320 09:07:14.856616 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-internal-tls-certs\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.859310 master-0 kubenswrapper[18707]: I0320 09:07:14.858179 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-config-data\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.863589 master-0 kubenswrapper[18707]: I0320 09:07:14.861761 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549b0bdb-08b2-4ed7-9335-81637a54c925-combined-ca-bundle\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:14.869517 master-0 kubenswrapper[18707]: I0320 09:07:14.869347 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x99ll\" (UniqueName: \"kubernetes.io/projected/549b0bdb-08b2-4ed7-9335-81637a54c925-kube-api-access-x99ll\") pod \"placement-7754cdbcb6-lblgz\" (UID: \"549b0bdb-08b2-4ed7-9335-81637a54c925\") " pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:15.015399 master-0 kubenswrapper[18707]: I0320 09:07:15.015076 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:15.189025 master-0 kubenswrapper[18707]: I0320 09:07:15.188339 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" Mar 20 09:07:15.277605 master-0 kubenswrapper[18707]: I0320 09:07:15.277381 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-c920a-api-0" Mar 20 09:07:15.385585 master-0 kubenswrapper[18707]: I0320 09:07:15.385529 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fprh7\" (UniqueName: \"kubernetes.io/projected/4d44d4b7-ce01-4aa4-9155-8338ad17b404-kube-api-access-fprh7\") pod \"4d44d4b7-ce01-4aa4-9155-8338ad17b404\" (UID: \"4d44d4b7-ce01-4aa4-9155-8338ad17b404\") " Mar 20 09:07:15.385844 master-0 kubenswrapper[18707]: I0320 09:07:15.385663 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/4d44d4b7-ce01-4aa4-9155-8338ad17b404-image-data\") pod \"4d44d4b7-ce01-4aa4-9155-8338ad17b404\" (UID: \"4d44d4b7-ce01-4aa4-9155-8338ad17b404\") " Mar 20 09:07:15.390745 master-0 kubenswrapper[18707]: I0320 09:07:15.390692 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d44d4b7-ce01-4aa4-9155-8338ad17b404-kube-api-access-fprh7" (OuterVolumeSpecName: "kube-api-access-fprh7") pod "4d44d4b7-ce01-4aa4-9155-8338ad17b404" (UID: "4d44d4b7-ce01-4aa4-9155-8338ad17b404"). InnerVolumeSpecName "kube-api-access-fprh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:15.493167 master-0 kubenswrapper[18707]: I0320 09:07:15.493104 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fprh7\" (UniqueName: \"kubernetes.io/projected/4d44d4b7-ce01-4aa4-9155-8338ad17b404-kube-api-access-fprh7\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:15.535347 master-0 kubenswrapper[18707]: I0320 09:07:15.528333 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d44d4b7-ce01-4aa4-9155-8338ad17b404-image-data" (OuterVolumeSpecName: "image-data") pod "4d44d4b7-ce01-4aa4-9155-8338ad17b404" (UID: "4d44d4b7-ce01-4aa4-9155-8338ad17b404"). InnerVolumeSpecName "image-data". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:07:15.602648 master-0 kubenswrapper[18707]: I0320 09:07:15.602560 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7754cdbcb6-lblgz"] Mar 20 09:07:15.603129 master-0 kubenswrapper[18707]: I0320 09:07:15.602669 18707 reconciler_common.go:293] "Volume detached for volume \"image-data\" (UniqueName: \"kubernetes.io/empty-dir/4d44d4b7-ce01-4aa4-9155-8338ad17b404-image-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:15.609615 master-0 kubenswrapper[18707]: W0320 09:07:15.609556 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod549b0bdb_08b2_4ed7_9335_81637a54c925.slice/crio-68972a81618369a94da6de7c24445003986e9f6a0cab2a45227a930864026b7b WatchSource:0}: Error finding container 68972a81618369a94da6de7c24445003986e9f6a0cab2a45227a930864026b7b: Status 404 returned error can't find the container with id 68972a81618369a94da6de7c24445003986e9f6a0cab2a45227a930864026b7b Mar 20 09:07:15.680158 master-0 kubenswrapper[18707]: I0320 09:07:15.680088 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" event={"ID":"4d44d4b7-ce01-4aa4-9155-8338ad17b404","Type":"ContainerDied","Data":"d12525b1b89fdcb755cdbc04b984565ac34947d5a7c249f9b10aa2c7e2204a6c"} Mar 20 09:07:15.680158 master-0 kubenswrapper[18707]: I0320 09:07:15.680141 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d12525b1b89fdcb755cdbc04b984565ac34947d5a7c249f9b10aa2c7e2204a6c" Mar 20 09:07:15.680419 master-0 kubenswrapper[18707]: I0320 09:07:15.680293 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/edpm-a-provisionserver-checksum-discovery-49xmz" Mar 20 09:07:15.683748 master-0 kubenswrapper[18707]: I0320 09:07:15.683297 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7754cdbcb6-lblgz" event={"ID":"549b0bdb-08b2-4ed7-9335-81637a54c925","Type":"ContainerStarted","Data":"68972a81618369a94da6de7c24445003986e9f6a0cab2a45227a930864026b7b"} Mar 20 09:07:16.699446 master-0 kubenswrapper[18707]: I0320 09:07:16.699374 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7754cdbcb6-lblgz" event={"ID":"549b0bdb-08b2-4ed7-9335-81637a54c925","Type":"ContainerStarted","Data":"e82b69e1098f1f1c4580f8e2098174d6338992cb32a147d79e090f85058d4bec"} Mar 20 09:07:16.699446 master-0 kubenswrapper[18707]: I0320 09:07:16.699442 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7754cdbcb6-lblgz" event={"ID":"549b0bdb-08b2-4ed7-9335-81637a54c925","Type":"ContainerStarted","Data":"489faac3b7840a226e0025afe9e26ca5bda923e3ef26a00f44bef7bcb37a4028"} Mar 20 09:07:16.700098 master-0 kubenswrapper[18707]: I0320 09:07:16.699496 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:16.700098 master-0 kubenswrapper[18707]: I0320 09:07:16.699824 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:17.840400 master-0 kubenswrapper[18707]: I0320 09:07:17.838667 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-c920a-backup-0" Mar 20 09:07:17.874897 master-0 kubenswrapper[18707]: I0320 09:07:17.874805 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7754cdbcb6-lblgz" podStartSLOduration=3.87478865 podStartE2EDuration="3.87478865s" podCreationTimestamp="2026-03-20 09:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:07:16.731807579 +0000 UTC m=+1581.887987935" watchObservedRunningTime="2026-03-20 09:07:17.87478865 +0000 UTC m=+1583.030969006" Mar 20 09:07:18.010388 master-0 kubenswrapper[18707]: I0320 09:07:18.010334 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-c920a-scheduler-0" Mar 20 09:07:18.014513 master-0 kubenswrapper[18707]: I0320 09:07:18.014464 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:22.742633 master-0 kubenswrapper[18707]: I0320 09:07:22.742542 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:07:23.235261 master-0 kubenswrapper[18707]: I0320 09:07:23.235078 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-c920a-volume-lvm-iscsi-0" Mar 20 09:07:23.813120 master-0 kubenswrapper[18707]: I0320 09:07:23.812866 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5987d8f9fd-q4m4b" Mar 20 09:07:29.011514 master-0 kubenswrapper[18707]: I0320 09:07:29.011382 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 09:07:29.012585 master-0 
kubenswrapper[18707]: E0320 09:07:29.012043 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d44d4b7-ce01-4aa4-9155-8338ad17b404" containerName="init" Mar 20 09:07:29.012585 master-0 kubenswrapper[18707]: I0320 09:07:29.012062 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d44d4b7-ce01-4aa4-9155-8338ad17b404" containerName="init" Mar 20 09:07:29.012585 master-0 kubenswrapper[18707]: E0320 09:07:29.012093 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d44d4b7-ce01-4aa4-9155-8338ad17b404" containerName="edpm-a-provisionserver-checksum-discovery" Mar 20 09:07:29.012585 master-0 kubenswrapper[18707]: I0320 09:07:29.012104 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d44d4b7-ce01-4aa4-9155-8338ad17b404" containerName="edpm-a-provisionserver-checksum-discovery" Mar 20 09:07:29.012585 master-0 kubenswrapper[18707]: I0320 09:07:29.012424 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d44d4b7-ce01-4aa4-9155-8338ad17b404" containerName="edpm-a-provisionserver-checksum-discovery" Mar 20 09:07:29.013463 master-0 kubenswrapper[18707]: I0320 09:07:29.013419 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 09:07:29.016043 master-0 kubenswrapper[18707]: I0320 09:07:29.015602 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 09:07:29.016043 master-0 kubenswrapper[18707]: I0320 09:07:29.015973 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 09:07:29.138935 master-0 kubenswrapper[18707]: I0320 09:07:29.137688 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 09:07:29.218810 master-0 kubenswrapper[18707]: I0320 09:07:29.218714 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/77128d27-dc38-4372-937d-20b542156b17-openstack-config\") pod \"openstackclient\" (UID: \"77128d27-dc38-4372-937d-20b542156b17\") " pod="openstack/openstackclient" Mar 20 09:07:29.218810 master-0 kubenswrapper[18707]: I0320 09:07:29.218788 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77128d27-dc38-4372-937d-20b542156b17-combined-ca-bundle\") pod \"openstackclient\" (UID: \"77128d27-dc38-4372-937d-20b542156b17\") " pod="openstack/openstackclient" Mar 20 09:07:29.219934 master-0 kubenswrapper[18707]: I0320 09:07:29.219884 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76xv4\" (UniqueName: \"kubernetes.io/projected/77128d27-dc38-4372-937d-20b542156b17-kube-api-access-76xv4\") pod \"openstackclient\" (UID: \"77128d27-dc38-4372-937d-20b542156b17\") " pod="openstack/openstackclient" Mar 20 09:07:29.220008 master-0 kubenswrapper[18707]: I0320 09:07:29.219982 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" 
(UniqueName: \"kubernetes.io/secret/77128d27-dc38-4372-937d-20b542156b17-openstack-config-secret\") pod \"openstackclient\" (UID: \"77128d27-dc38-4372-937d-20b542156b17\") " pod="openstack/openstackclient" Mar 20 09:07:29.322543 master-0 kubenswrapper[18707]: I0320 09:07:29.322470 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/77128d27-dc38-4372-937d-20b542156b17-openstack-config\") pod \"openstackclient\" (UID: \"77128d27-dc38-4372-937d-20b542156b17\") " pod="openstack/openstackclient" Mar 20 09:07:29.322543 master-0 kubenswrapper[18707]: I0320 09:07:29.322536 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77128d27-dc38-4372-937d-20b542156b17-combined-ca-bundle\") pod \"openstackclient\" (UID: \"77128d27-dc38-4372-937d-20b542156b17\") " pod="openstack/openstackclient" Mar 20 09:07:29.322819 master-0 kubenswrapper[18707]: I0320 09:07:29.322776 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76xv4\" (UniqueName: \"kubernetes.io/projected/77128d27-dc38-4372-937d-20b542156b17-kube-api-access-76xv4\") pod \"openstackclient\" (UID: \"77128d27-dc38-4372-937d-20b542156b17\") " pod="openstack/openstackclient" Mar 20 09:07:29.322854 master-0 kubenswrapper[18707]: I0320 09:07:29.322823 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/77128d27-dc38-4372-937d-20b542156b17-openstack-config-secret\") pod \"openstackclient\" (UID: \"77128d27-dc38-4372-937d-20b542156b17\") " pod="openstack/openstackclient" Mar 20 09:07:29.324297 master-0 kubenswrapper[18707]: I0320 09:07:29.324266 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/77128d27-dc38-4372-937d-20b542156b17-openstack-config\") pod \"openstackclient\" (UID: \"77128d27-dc38-4372-937d-20b542156b17\") " pod="openstack/openstackclient" Mar 20 09:07:29.337911 master-0 kubenswrapper[18707]: I0320 09:07:29.337856 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/77128d27-dc38-4372-937d-20b542156b17-openstack-config-secret\") pod \"openstackclient\" (UID: \"77128d27-dc38-4372-937d-20b542156b17\") " pod="openstack/openstackclient" Mar 20 09:07:29.338772 master-0 kubenswrapper[18707]: I0320 09:07:29.338753 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77128d27-dc38-4372-937d-20b542156b17-combined-ca-bundle\") pod \"openstackclient\" (UID: \"77128d27-dc38-4372-937d-20b542156b17\") " pod="openstack/openstackclient" Mar 20 09:07:29.344497 master-0 kubenswrapper[18707]: I0320 09:07:29.344451 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76xv4\" (UniqueName: \"kubernetes.io/projected/77128d27-dc38-4372-937d-20b542156b17-kube-api-access-76xv4\") pod \"openstackclient\" (UID: \"77128d27-dc38-4372-937d-20b542156b17\") " pod="openstack/openstackclient" Mar 20 09:07:29.638984 master-0 kubenswrapper[18707]: I0320 09:07:29.634997 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 09:07:30.554372 master-0 kubenswrapper[18707]: W0320 09:07:30.554311 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77128d27_dc38_4372_937d_20b542156b17.slice/crio-b50ac45e38d5dc2b9e7b07ebba8bcbef7ac464db8c87b6613d23509c177ff380 WatchSource:0}: Error finding container b50ac45e38d5dc2b9e7b07ebba8bcbef7ac464db8c87b6613d23509c177ff380: Status 404 returned error can't find the container with id b50ac45e38d5dc2b9e7b07ebba8bcbef7ac464db8c87b6613d23509c177ff380 Mar 20 09:07:30.573946 master-0 kubenswrapper[18707]: I0320 09:07:30.573862 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 09:07:30.973646 master-0 kubenswrapper[18707]: I0320 09:07:30.973545 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"77128d27-dc38-4372-937d-20b542156b17","Type":"ContainerStarted","Data":"b50ac45e38d5dc2b9e7b07ebba8bcbef7ac464db8c87b6613d23509c177ff380"} Mar 20 09:07:33.064211 master-0 kubenswrapper[18707]: I0320 09:07:33.064081 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-84fcf8cbb-q9wtd"] Mar 20 09:07:33.074386 master-0 kubenswrapper[18707]: I0320 09:07:33.072556 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.078200 master-0 kubenswrapper[18707]: I0320 09:07:33.078092 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-84fcf8cbb-q9wtd"] Mar 20 09:07:33.103762 master-0 kubenswrapper[18707]: I0320 09:07:33.103714 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 09:07:33.103982 master-0 kubenswrapper[18707]: I0320 09:07:33.103869 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 09:07:33.104045 master-0 kubenswrapper[18707]: I0320 09:07:33.104026 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 09:07:33.251375 master-0 kubenswrapper[18707]: I0320 09:07:33.251252 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-etc-swift\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.251892 master-0 kubenswrapper[18707]: I0320 09:07:33.251771 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-config-data\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.251892 master-0 kubenswrapper[18707]: I0320 09:07:33.251843 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-internal-tls-certs\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " 
pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.252353 master-0 kubenswrapper[18707]: I0320 09:07:33.251969 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-run-httpd\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.252353 master-0 kubenswrapper[18707]: I0320 09:07:33.252099 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-combined-ca-bundle\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.252353 master-0 kubenswrapper[18707]: I0320 09:07:33.252163 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-public-tls-certs\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.252353 master-0 kubenswrapper[18707]: I0320 09:07:33.252257 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4l8f\" (UniqueName: \"kubernetes.io/projected/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-kube-api-access-c4l8f\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.253420 master-0 kubenswrapper[18707]: I0320 09:07:33.253289 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-log-httpd\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.356116 master-0 kubenswrapper[18707]: I0320 09:07:33.355962 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-etc-swift\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.356116 master-0 kubenswrapper[18707]: I0320 09:07:33.356079 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-config-data\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.356116 master-0 kubenswrapper[18707]: I0320 09:07:33.356114 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-internal-tls-certs\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.356453 master-0 kubenswrapper[18707]: I0320 09:07:33.356164 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-run-httpd\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.356453 master-0 kubenswrapper[18707]: I0320 09:07:33.356365 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-combined-ca-bundle\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.357065 master-0 kubenswrapper[18707]: I0320 09:07:33.357010 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-run-httpd\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.357246 master-0 kubenswrapper[18707]: I0320 09:07:33.357166 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-public-tls-certs\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.357330 master-0 kubenswrapper[18707]: I0320 09:07:33.357315 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4l8f\" (UniqueName: \"kubernetes.io/projected/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-kube-api-access-c4l8f\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.357384 master-0 kubenswrapper[18707]: I0320 09:07:33.357343 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-log-httpd\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.370291 master-0 kubenswrapper[18707]: I0320 09:07:33.357939 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-log-httpd\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.370291 master-0 kubenswrapper[18707]: I0320 09:07:33.362524 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-etc-swift\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.370291 master-0 kubenswrapper[18707]: I0320 09:07:33.366697 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-combined-ca-bundle\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.370291 master-0 kubenswrapper[18707]: I0320 09:07:33.368696 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-config-data\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.370756 master-0 kubenswrapper[18707]: I0320 09:07:33.370173 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-internal-tls-certs\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.373226 master-0 kubenswrapper[18707]: I0320 09:07:33.372986 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-public-tls-certs\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.376692 master-0 kubenswrapper[18707]: I0320 09:07:33.376667 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4l8f\" (UniqueName: \"kubernetes.io/projected/484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d-kube-api-access-c4l8f\") pod \"swift-proxy-84fcf8cbb-q9wtd\" (UID: \"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d\") " pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:33.433473 master-0 kubenswrapper[18707]: I0320 09:07:33.433411 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:34.003132 master-0 kubenswrapper[18707]: I0320 09:07:34.002970 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6bc775ffc9-jcrc5" Mar 20 09:07:34.367339 master-0 kubenswrapper[18707]: I0320 09:07:34.361047 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-84fcf8cbb-q9wtd"] Mar 20 09:07:34.459216 master-0 kubenswrapper[18707]: I0320 09:07:34.455635 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-555d58bf7b-nplsv"] Mar 20 09:07:34.459216 master-0 kubenswrapper[18707]: I0320 09:07:34.455957 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-555d58bf7b-nplsv" podUID="48c5811a-c534-461a-9cc1-2f35f6b6a43b" containerName="neutron-api" containerID="cri-o://46f83be4dcad0c9ec2646ca57056a179e12d5be75f45fb0f9d9f3db320d4583c" gracePeriod=30 Mar 20 09:07:34.459216 master-0 kubenswrapper[18707]: I0320 09:07:34.456066 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-555d58bf7b-nplsv" podUID="48c5811a-c534-461a-9cc1-2f35f6b6a43b" containerName="neutron-httpd" 
containerID="cri-o://92aef4a1e7ac42f70a425304674348620192c74ae72ba60e80e61452607b81ec" gracePeriod=30 Mar 20 09:07:35.070690 master-0 kubenswrapper[18707]: I0320 09:07:35.070633 18707 generic.go:334] "Generic (PLEG): container finished" podID="48c5811a-c534-461a-9cc1-2f35f6b6a43b" containerID="92aef4a1e7ac42f70a425304674348620192c74ae72ba60e80e61452607b81ec" exitCode=0 Mar 20 09:07:35.071018 master-0 kubenswrapper[18707]: I0320 09:07:35.070988 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-555d58bf7b-nplsv" event={"ID":"48c5811a-c534-461a-9cc1-2f35f6b6a43b","Type":"ContainerDied","Data":"92aef4a1e7ac42f70a425304674348620192c74ae72ba60e80e61452607b81ec"} Mar 20 09:07:35.075746 master-0 kubenswrapper[18707]: I0320 09:07:35.074437 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" event={"ID":"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d","Type":"ContainerStarted","Data":"bde31c1e29ec19b649f2ec455cdf07d789492f19bb7eb3c4be7d6e3ee9bd5038"} Mar 20 09:07:35.075746 master-0 kubenswrapper[18707]: I0320 09:07:35.074498 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" event={"ID":"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d","Type":"ContainerStarted","Data":"cbeb076f95e61987d2dc7e7aa6094306091dd4a40825c6b69f459950ac49143d"} Mar 20 09:07:36.090376 master-0 kubenswrapper[18707]: I0320 09:07:36.089532 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" event={"ID":"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d","Type":"ContainerStarted","Data":"a05566eb86cd8882c426a4e6344297eda9f7dd8e7ddc878df374ed2dd821adc9"} Mar 20 09:07:36.091081 master-0 kubenswrapper[18707]: I0320 09:07:36.091049 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:36.091156 master-0 kubenswrapper[18707]: I0320 09:07:36.091101 18707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:36.636981 master-0 kubenswrapper[18707]: I0320 09:07:36.636868 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podStartSLOduration=3.636847925 podStartE2EDuration="3.636847925s" podCreationTimestamp="2026-03-20 09:07:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:07:36.63665451 +0000 UTC m=+1601.792834866" watchObservedRunningTime="2026-03-20 09:07:36.636847925 +0000 UTC m=+1601.793028281" Mar 20 09:07:39.445267 master-0 kubenswrapper[18707]: I0320 09:07:39.445174 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 09:07:42.443664 master-0 kubenswrapper[18707]: I0320 09:07:42.443570 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 09:07:42.444749 master-0 kubenswrapper[18707]: I0320 09:07:42.444632 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 09:07:43.447307 master-0 kubenswrapper[18707]: I0320 09:07:43.445899 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 09:07:43.450991 master-0 kubenswrapper[18707]: I0320 
09:07:43.450270 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 09:07:45.274261 master-0 kubenswrapper[18707]: I0320 09:07:45.274134 18707 generic.go:334] "Generic (PLEG): container finished" podID="48c5811a-c534-461a-9cc1-2f35f6b6a43b" containerID="46f83be4dcad0c9ec2646ca57056a179e12d5be75f45fb0f9d9f3db320d4583c" exitCode=0 Mar 20 09:07:45.274261 master-0 kubenswrapper[18707]: I0320 09:07:45.274244 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-555d58bf7b-nplsv" event={"ID":"48c5811a-c534-461a-9cc1-2f35f6b6a43b","Type":"ContainerDied","Data":"46f83be4dcad0c9ec2646ca57056a179e12d5be75f45fb0f9d9f3db320d4583c"} Mar 20 09:07:45.439559 master-0 kubenswrapper[18707]: I0320 09:07:45.439483 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 09:07:45.441673 master-0 kubenswrapper[18707]: I0320 09:07:45.441613 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 09:07:45.441771 master-0 kubenswrapper[18707]: I0320 09:07:45.441698 18707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:45.445153 master-0 kubenswrapper[18707]: I0320 09:07:45.445082 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-server" probeResult="failure" output="HTTP probe failed 
with statuscode: 503" Mar 20 09:07:45.448149 master-0 kubenswrapper[18707]: I0320 09:07:45.448080 18707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="proxy-httpd" containerStatusID={"Type":"cri-o","ID":"bde31c1e29ec19b649f2ec455cdf07d789492f19bb7eb3c4be7d6e3ee9bd5038"} pod="openstack/swift-proxy-84fcf8cbb-q9wtd" containerMessage="Container proxy-httpd failed liveness probe, will be restarted" Mar 20 09:07:45.448338 master-0 kubenswrapper[18707]: I0320 09:07:45.448298 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-httpd" containerID="cri-o://bde31c1e29ec19b649f2ec455cdf07d789492f19bb7eb3c4be7d6e3ee9bd5038" gracePeriod=30 Mar 20 09:07:45.449182 master-0 kubenswrapper[18707]: I0320 09:07:45.449137 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 09:07:47.041640 master-0 kubenswrapper[18707]: I0320 09:07:47.041582 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-27086-default-external-api-0"] Mar 20 09:07:47.042275 master-0 kubenswrapper[18707]: I0320 09:07:47.041863 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-27086-default-external-api-0" podUID="dbd0581a-055a-4728-8979-e0cfd24dd897" containerName="glance-log" containerID="cri-o://07e92ed41c0a2abbb89f39a0f628c94ad2999142ab7519ad10f78eb258075752" gracePeriod=30 Mar 20 09:07:47.042275 master-0 kubenswrapper[18707]: I0320 09:07:47.042107 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-27086-default-external-api-0" podUID="dbd0581a-055a-4728-8979-e0cfd24dd897" containerName="glance-httpd" 
containerID="cri-o://52a8b1a6a8e5d9b43c350b60899b2648676b53d28fa03fd9d17da7a7480b7c39" gracePeriod=30 Mar 20 09:07:47.308917 master-0 kubenswrapper[18707]: I0320 09:07:47.308823 18707 generic.go:334] "Generic (PLEG): container finished" podID="dbd0581a-055a-4728-8979-e0cfd24dd897" containerID="07e92ed41c0a2abbb89f39a0f628c94ad2999142ab7519ad10f78eb258075752" exitCode=143 Mar 20 09:07:47.309018 master-0 kubenswrapper[18707]: I0320 09:07:47.308962 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-external-api-0" event={"ID":"dbd0581a-055a-4728-8979-e0cfd24dd897","Type":"ContainerDied","Data":"07e92ed41c0a2abbb89f39a0f628c94ad2999142ab7519ad10f78eb258075752"} Mar 20 09:07:47.319243 master-0 kubenswrapper[18707]: I0320 09:07:47.318614 18707 generic.go:334] "Generic (PLEG): container finished" podID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerID="bde31c1e29ec19b649f2ec455cdf07d789492f19bb7eb3c4be7d6e3ee9bd5038" exitCode=0 Mar 20 09:07:47.319243 master-0 kubenswrapper[18707]: I0320 09:07:47.318689 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" event={"ID":"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d","Type":"ContainerDied","Data":"bde31c1e29ec19b649f2ec455cdf07d789492f19bb7eb3c4be7d6e3ee9bd5038"} Mar 20 09:07:47.328554 master-0 kubenswrapper[18707]: I0320 09:07:47.328514 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-555d58bf7b-nplsv" event={"ID":"48c5811a-c534-461a-9cc1-2f35f6b6a43b","Type":"ContainerDied","Data":"a9d9d13423ecb8693cb1c4426f89b105f400046abcfd59780ef658cdcc1afc48"} Mar 20 09:07:47.328653 master-0 kubenswrapper[18707]: I0320 09:07:47.328560 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9d9d13423ecb8693cb1c4426f89b105f400046abcfd59780ef658cdcc1afc48" Mar 20 09:07:47.359202 master-0 kubenswrapper[18707]: I0320 09:07:47.358669 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:07:47.468468 master-0 kubenswrapper[18707]: I0320 09:07:47.468385 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-config\") pod \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " Mar 20 09:07:47.468468 master-0 kubenswrapper[18707]: I0320 09:07:47.468464 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-httpd-config\") pod \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " Mar 20 09:07:47.468782 master-0 kubenswrapper[18707]: I0320 09:07:47.468519 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-ovndb-tls-certs\") pod \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " Mar 20 09:07:47.468782 master-0 kubenswrapper[18707]: I0320 09:07:47.468655 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-combined-ca-bundle\") pod \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " Mar 20 09:07:47.468782 master-0 kubenswrapper[18707]: I0320 09:07:47.468741 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkskk\" (UniqueName: \"kubernetes.io/projected/48c5811a-c534-461a-9cc1-2f35f6b6a43b-kube-api-access-pkskk\") pod \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\" (UID: \"48c5811a-c534-461a-9cc1-2f35f6b6a43b\") " Mar 20 09:07:47.497081 master-0 kubenswrapper[18707]: I0320 09:07:47.495784 18707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c5811a-c534-461a-9cc1-2f35f6b6a43b-kube-api-access-pkskk" (OuterVolumeSpecName: "kube-api-access-pkskk") pod "48c5811a-c534-461a-9cc1-2f35f6b6a43b" (UID: "48c5811a-c534-461a-9cc1-2f35f6b6a43b"). InnerVolumeSpecName "kube-api-access-pkskk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:47.500618 master-0 kubenswrapper[18707]: I0320 09:07:47.498313 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "48c5811a-c534-461a-9cc1-2f35f6b6a43b" (UID: "48c5811a-c534-461a-9cc1-2f35f6b6a43b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:47.553865 master-0 kubenswrapper[18707]: I0320 09:07:47.553669 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48c5811a-c534-461a-9cc1-2f35f6b6a43b" (UID: "48c5811a-c534-461a-9cc1-2f35f6b6a43b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:47.555151 master-0 kubenswrapper[18707]: I0320 09:07:47.554413 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-config" (OuterVolumeSpecName: "config") pod "48c5811a-c534-461a-9cc1-2f35f6b6a43b" (UID: "48c5811a-c534-461a-9cc1-2f35f6b6a43b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:47.571069 master-0 kubenswrapper[18707]: I0320 09:07:47.571010 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkskk\" (UniqueName: \"kubernetes.io/projected/48c5811a-c534-461a-9cc1-2f35f6b6a43b-kube-api-access-pkskk\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:47.571069 master-0 kubenswrapper[18707]: I0320 09:07:47.571056 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:47.571069 master-0 kubenswrapper[18707]: I0320 09:07:47.571067 18707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-httpd-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:47.571069 master-0 kubenswrapper[18707]: I0320 09:07:47.571076 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:47.583760 master-0 kubenswrapper[18707]: I0320 09:07:47.583688 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "48c5811a-c534-461a-9cc1-2f35f6b6a43b" (UID: "48c5811a-c534-461a-9cc1-2f35f6b6a43b"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:47.674866 master-0 kubenswrapper[18707]: I0320 09:07:47.674783 18707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c5811a-c534-461a-9cc1-2f35f6b6a43b-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:48.260324 master-0 kubenswrapper[18707]: I0320 09:07:48.260272 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:48.323548 master-0 kubenswrapper[18707]: I0320 09:07:48.323495 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7754cdbcb6-lblgz" Mar 20 09:07:48.342006 master-0 kubenswrapper[18707]: I0320 09:07:48.341952 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"77128d27-dc38-4372-937d-20b542156b17","Type":"ContainerStarted","Data":"3154b984d237bec45e668ce6f637ea39a99ff967a91f31c04571608d2fbfd862"} Mar 20 09:07:48.345606 master-0 kubenswrapper[18707]: I0320 09:07:48.345521 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" event={"ID":"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d","Type":"ContainerStarted","Data":"7be9d745f65d4162b28719b73565c589b33bcca33938177b5dc995a1f8276446"} Mar 20 09:07:48.345606 master-0 kubenswrapper[18707]: I0320 09:07:48.345550 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-555d58bf7b-nplsv" Mar 20 09:07:48.346337 master-0 kubenswrapper[18707]: I0320 09:07:48.346289 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.128.0.238:8080/healthcheck\": dial tcp 10.128.0.238:8080: connect: connection refused" Mar 20 09:07:48.346432 master-0 kubenswrapper[18707]: I0320 09:07:48.346397 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-server" probeResult="failure" output="Get \"https://10.128.0.238:8080/healthcheck\": dial tcp 10.128.0.238:8080: connect: connection refused" Mar 20 09:07:48.434653 master-0 kubenswrapper[18707]: I0320 09:07:48.434496 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:07:48.434992 master-0 kubenswrapper[18707]: I0320 09:07:48.434942 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-server" probeResult="failure" output="Get \"https://10.128.0.238:8080/healthcheck\": dial tcp 10.128.0.238:8080: connect: connection refused" Mar 20 09:07:48.434992 master-0 kubenswrapper[18707]: I0320 09:07:48.434957 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-server" probeResult="failure" output="Get \"https://10.128.0.238:8080/healthcheck\": dial tcp 10.128.0.238:8080: connect: connection refused" Mar 20 09:07:48.435112 master-0 kubenswrapper[18707]: I0320 09:07:48.434999 18707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" 
Mar 20 09:07:49.358199 master-0 kubenswrapper[18707]: I0320 09:07:49.358132 18707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="proxy-server" containerStatusID={"Type":"cri-o","ID":"a05566eb86cd8882c426a4e6344297eda9f7dd8e7ddc878df374ed2dd821adc9"} pod="openstack/swift-proxy-84fcf8cbb-q9wtd" containerMessage="Container proxy-server failed liveness probe, will be restarted"
Mar 20 09:07:49.358829 master-0 kubenswrapper[18707]: I0320 09:07:49.358800 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-server" containerID="cri-o://a05566eb86cd8882c426a4e6344297eda9f7dd8e7ddc878df374ed2dd821adc9" gracePeriod=30
Mar 20 09:07:49.369696 master-0 kubenswrapper[18707]: I0320 09:07:49.369646 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-84fcf8cbb-q9wtd"
Mar 20 09:07:49.550212 master-0 kubenswrapper[18707]: I0320 09:07:49.548438 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.991129287 podStartE2EDuration="21.548409327s" podCreationTimestamp="2026-03-20 09:07:28 +0000 UTC" firstStartedPulling="2026-03-20 09:07:30.559226962 +0000 UTC m=+1595.715407318" lastFinishedPulling="2026-03-20 09:07:47.116507002 +0000 UTC m=+1612.272687358" observedRunningTime="2026-03-20 09:07:49.543906958 +0000 UTC m=+1614.700087334" watchObservedRunningTime="2026-03-20 09:07:49.548409327 +0000 UTC m=+1614.704589683"
Mar 20 09:07:50.269042 master-0 kubenswrapper[18707]: I0320 09:07:50.268938 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-27086-default-internal-api-0"]
Mar 20 09:07:50.269402 master-0 kubenswrapper[18707]: I0320 09:07:50.269359 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-27086-default-internal-api-0" podUID="dcee9936-db51-42d9-ada7-279cec63d27e" containerName="glance-httpd" containerID="cri-o://0b3e345b6cd271ceb5c5d53750e1102868ebb3da5bfa88101d7a166c578666dd" gracePeriod=30
Mar 20 09:07:50.269562 master-0 kubenswrapper[18707]: I0320 09:07:50.269531 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-27086-default-internal-api-0" podUID="dcee9936-db51-42d9-ada7-279cec63d27e" containerName="glance-log" containerID="cri-o://188e328723a831b8c019606c55e41bb77c059e030357cb18f0d149f0569d2036" gracePeriod=30
Mar 20 09:07:51.382134 master-0 kubenswrapper[18707]: I0320 09:07:51.382069 18707 generic.go:334] "Generic (PLEG): container finished" podID="dcee9936-db51-42d9-ada7-279cec63d27e" containerID="188e328723a831b8c019606c55e41bb77c059e030357cb18f0d149f0569d2036" exitCode=143
Mar 20 09:07:51.382766 master-0 kubenswrapper[18707]: I0320 09:07:51.382149 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-internal-api-0" event={"ID":"dcee9936-db51-42d9-ada7-279cec63d27e","Type":"ContainerDied","Data":"188e328723a831b8c019606c55e41bb77c059e030357cb18f0d149f0569d2036"}
Mar 20 09:07:51.384558 master-0 kubenswrapper[18707]: I0320 09:07:51.384521 18707 generic.go:334] "Generic (PLEG): container finished" podID="dbd0581a-055a-4728-8979-e0cfd24dd897" containerID="52a8b1a6a8e5d9b43c350b60899b2648676b53d28fa03fd9d17da7a7480b7c39" exitCode=0
Mar 20 09:07:51.384668 master-0 kubenswrapper[18707]: I0320 09:07:51.384599 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-external-api-0" event={"ID":"dbd0581a-055a-4728-8979-e0cfd24dd897","Type":"ContainerDied","Data":"52a8b1a6a8e5d9b43c350b60899b2648676b53d28fa03fd9d17da7a7480b7c39"}
Mar 20 09:07:51.387346 master-0 kubenswrapper[18707]: I0320 09:07:51.387308 18707 generic.go:334] "Generic (PLEG): container finished" podID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerID="a05566eb86cd8882c426a4e6344297eda9f7dd8e7ddc878df374ed2dd821adc9" exitCode=0
Mar 20 09:07:51.387433 master-0 kubenswrapper[18707]: I0320 09:07:51.387354 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" event={"ID":"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d","Type":"ContainerDied","Data":"a05566eb86cd8882c426a4e6344297eda9f7dd8e7ddc878df374ed2dd821adc9"}
Mar 20 09:07:51.387433 master-0 kubenswrapper[18707]: I0320 09:07:51.387379 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" event={"ID":"484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d","Type":"ContainerStarted","Data":"5aae019e5d2dc52b250b5858d8fd2e9adc55c0f5895d5005973a008e805e6831"}
Mar 20 09:07:51.457730 master-0 kubenswrapper[18707]: I0320 09:07:51.457632 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-555d58bf7b-nplsv"]
Mar 20 09:07:51.516600 master-0 kubenswrapper[18707]: I0320 09:07:51.516542 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-555d58bf7b-nplsv"]
Mar 20 09:07:52.401947 master-0 kubenswrapper[18707]: I0320 09:07:52.401876 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-84fcf8cbb-q9wtd"
Mar 20 09:07:52.407401 master-0 kubenswrapper[18707]: I0320 09:07:52.407335 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 20 09:07:53.061717 master-0 kubenswrapper[18707]: I0320 09:07:53.061661 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8b47bf878-2ftwn"]
Mar 20 09:07:53.062079 master-0 kubenswrapper[18707]: I0320 09:07:53.062033 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-8b47bf878-2ftwn" podUID="2dafeade-6a38-4134-ac1c-7331ea038aec" containerName="placement-log" containerID="cri-o://89b2d0395c21db068f91d1a8531bc530fddf5e8897d2499384549e1ac365ed68" gracePeriod=30
Mar 20 09:07:53.062317 master-0 kubenswrapper[18707]: I0320 09:07:53.062278 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-8b47bf878-2ftwn" podUID="2dafeade-6a38-4134-ac1c-7331ea038aec" containerName="placement-api" containerID="cri-o://1dda1c268f511fb2abda022887e5169f2df07d603ed922eb6f445725a0b163af" gracePeriod=30
Mar 20 09:07:53.169283 master-0 kubenswrapper[18707]: I0320 09:07:53.168467 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c5811a-c534-461a-9cc1-2f35f6b6a43b" path="/var/lib/kubelet/pods/48c5811a-c534-461a-9cc1-2f35f6b6a43b/volumes"
Mar 20 09:07:53.232480 master-0 kubenswrapper[18707]: I0320 09:07:53.232414 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hdkgz"]
Mar 20 09:07:53.233291 master-0 kubenswrapper[18707]: E0320 09:07:53.233208 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c5811a-c534-461a-9cc1-2f35f6b6a43b" containerName="neutron-api"
Mar 20 09:07:53.233291 master-0 kubenswrapper[18707]: I0320 09:07:53.233234 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c5811a-c534-461a-9cc1-2f35f6b6a43b" containerName="neutron-api"
Mar 20 09:07:53.233507 master-0 kubenswrapper[18707]: E0320 09:07:53.233314 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c5811a-c534-461a-9cc1-2f35f6b6a43b" containerName="neutron-httpd"
Mar 20 09:07:53.233507 master-0 kubenswrapper[18707]: I0320 09:07:53.233345 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c5811a-c534-461a-9cc1-2f35f6b6a43b" containerName="neutron-httpd"
Mar 20 09:07:53.233758 master-0 kubenswrapper[18707]: I0320 09:07:53.233738 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c5811a-c534-461a-9cc1-2f35f6b6a43b" containerName="neutron-httpd"
Mar 20 09:07:53.233848 master-0 kubenswrapper[18707]: I0320 09:07:53.233818 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c5811a-c534-461a-9cc1-2f35f6b6a43b" containerName="neutron-api"
Mar 20 09:07:53.274749 master-0 kubenswrapper[18707]: I0320 09:07:53.272285 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hdkgz"]
Mar 20 09:07:53.274749 master-0 kubenswrapper[18707]: I0320 09:07:53.272428 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hdkgz"
Mar 20 09:07:53.295478 master-0 kubenswrapper[18707]: I0320 09:07:53.294926 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-th2rs"]
Mar 20 09:07:53.319417 master-0 kubenswrapper[18707]: I0320 09:07:53.319343 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-th2rs"
Mar 20 09:07:53.324512 master-0 kubenswrapper[18707]: I0320 09:07:53.324459 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-th2rs"]
Mar 20 09:07:53.370986 master-0 kubenswrapper[18707]: I0320 09:07:53.365626 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a88fa32-c356-4dfe-abb9-d342391d52de-operator-scripts\") pod \"nova-cell0-db-create-th2rs\" (UID: \"7a88fa32-c356-4dfe-abb9-d342391d52de\") " pod="openstack/nova-cell0-db-create-th2rs"
Mar 20 09:07:53.370986 master-0 kubenswrapper[18707]: I0320 09:07:53.365700 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp6fb\" (UniqueName: \"kubernetes.io/projected/2a210b09-ea3d-4841-9cb3-f86a7f93985e-kube-api-access-kp6fb\") pod \"nova-api-db-create-hdkgz\" (UID: \"2a210b09-ea3d-4841-9cb3-f86a7f93985e\") " pod="openstack/nova-api-db-create-hdkgz"
Mar 20 09:07:53.370986 master-0 kubenswrapper[18707]: I0320 09:07:53.365765 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a210b09-ea3d-4841-9cb3-f86a7f93985e-operator-scripts\") pod \"nova-api-db-create-hdkgz\" (UID: \"2a210b09-ea3d-4841-9cb3-f86a7f93985e\") " pod="openstack/nova-api-db-create-hdkgz"
Mar 20 09:07:53.370986 master-0 kubenswrapper[18707]: I0320 09:07:53.365845 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m42wd\" (UniqueName: \"kubernetes.io/projected/7a88fa32-c356-4dfe-abb9-d342391d52de-kube-api-access-m42wd\") pod \"nova-cell0-db-create-th2rs\" (UID: \"7a88fa32-c356-4dfe-abb9-d342391d52de\") " pod="openstack/nova-cell0-db-create-th2rs"
Mar 20 09:07:53.433348 master-0 kubenswrapper[18707]: I0320 09:07:53.433208 18707 generic.go:334] "Generic (PLEG): container finished" podID="2dafeade-6a38-4134-ac1c-7331ea038aec" containerID="89b2d0395c21db068f91d1a8531bc530fddf5e8897d2499384549e1ac365ed68" exitCode=143
Mar 20 09:07:53.435001 master-0 kubenswrapper[18707]: I0320 09:07:53.434980 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8b47bf878-2ftwn" event={"ID":"2dafeade-6a38-4134-ac1c-7331ea038aec","Type":"ContainerDied","Data":"89b2d0395c21db068f91d1a8531bc530fddf5e8897d2499384549e1ac365ed68"}
Mar 20 09:07:53.489359 master-0 kubenswrapper[18707]: I0320 09:07:53.469331 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-af59-account-create-update-tk2tl"]
Mar 20 09:07:53.489359 master-0 kubenswrapper[18707]: I0320 09:07:53.488744 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a210b09-ea3d-4841-9cb3-f86a7f93985e-operator-scripts\") pod \"nova-api-db-create-hdkgz\" (UID: \"2a210b09-ea3d-4841-9cb3-f86a7f93985e\") " pod="openstack/nova-api-db-create-hdkgz"
Mar 20 09:07:53.489359 master-0 kubenswrapper[18707]: I0320 09:07:53.489000 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m42wd\" (UniqueName: \"kubernetes.io/projected/7a88fa32-c356-4dfe-abb9-d342391d52de-kube-api-access-m42wd\") pod \"nova-cell0-db-create-th2rs\" (UID: \"7a88fa32-c356-4dfe-abb9-d342391d52de\") " pod="openstack/nova-cell0-db-create-th2rs"
Mar 20 09:07:53.489359 master-0 kubenswrapper[18707]: I0320 09:07:53.489307 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a88fa32-c356-4dfe-abb9-d342391d52de-operator-scripts\") pod \"nova-cell0-db-create-th2rs\" (UID: \"7a88fa32-c356-4dfe-abb9-d342391d52de\") " pod="openstack/nova-cell0-db-create-th2rs"
Mar 20 09:07:53.489359 master-0 kubenswrapper[18707]: I0320 09:07:53.489371 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp6fb\" (UniqueName: \"kubernetes.io/projected/2a210b09-ea3d-4841-9cb3-f86a7f93985e-kube-api-access-kp6fb\") pod \"nova-api-db-create-hdkgz\" (UID: \"2a210b09-ea3d-4841-9cb3-f86a7f93985e\") " pod="openstack/nova-api-db-create-hdkgz"
Mar 20 09:07:53.496237 master-0 kubenswrapper[18707]: I0320 09:07:53.491878 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a88fa32-c356-4dfe-abb9-d342391d52de-operator-scripts\") pod \"nova-cell0-db-create-th2rs\" (UID: \"7a88fa32-c356-4dfe-abb9-d342391d52de\") " pod="openstack/nova-cell0-db-create-th2rs"
Mar 20 09:07:53.496237 master-0 kubenswrapper[18707]: I0320 09:07:53.491999 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a210b09-ea3d-4841-9cb3-f86a7f93985e-operator-scripts\") pod \"nova-api-db-create-hdkgz\" (UID: \"2a210b09-ea3d-4841-9cb3-f86a7f93985e\") " pod="openstack/nova-api-db-create-hdkgz"
Mar 20 09:07:53.496237 master-0 kubenswrapper[18707]: I0320 09:07:53.493303 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-af59-account-create-update-tk2tl"
Mar 20 09:07:53.496979 master-0 kubenswrapper[18707]: I0320 09:07:53.496956 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 20 09:07:53.497089 master-0 kubenswrapper[18707]: I0320 09:07:53.496979 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 20 09:07:53.501416 master-0 kubenswrapper[18707]: I0320 09:07:53.501379 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-af59-account-create-update-tk2tl"]
Mar 20 09:07:53.507089 master-0 kubenswrapper[18707]: I0320 09:07:53.507035 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 20 09:07:53.553484 master-0 kubenswrapper[18707]: I0320 09:07:53.553435 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m42wd\" (UniqueName: \"kubernetes.io/projected/7a88fa32-c356-4dfe-abb9-d342391d52de-kube-api-access-m42wd\") pod \"nova-cell0-db-create-th2rs\" (UID: \"7a88fa32-c356-4dfe-abb9-d342391d52de\") " pod="openstack/nova-cell0-db-create-th2rs"
Mar 20 09:07:53.568210 master-0 kubenswrapper[18707]: I0320 09:07:53.567885 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp6fb\" (UniqueName: \"kubernetes.io/projected/2a210b09-ea3d-4841-9cb3-f86a7f93985e-kube-api-access-kp6fb\") pod \"nova-api-db-create-hdkgz\" (UID: \"2a210b09-ea3d-4841-9cb3-f86a7f93985e\") " pod="openstack/nova-api-db-create-hdkgz"
Mar 20 09:07:53.595725 master-0 kubenswrapper[18707]: I0320 09:07:53.592893 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cldzz\" (UniqueName: \"kubernetes.io/projected/e2bb4fbf-4157-4051-bfbb-0b0a011fbc49-kube-api-access-cldzz\") pod \"nova-api-af59-account-create-update-tk2tl\" (UID: \"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49\") " pod="openstack/nova-api-af59-account-create-update-tk2tl"
Mar 20 09:07:53.595725 master-0 kubenswrapper[18707]: I0320 09:07:53.593478 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2bb4fbf-4157-4051-bfbb-0b0a011fbc49-operator-scripts\") pod \"nova-api-af59-account-create-update-tk2tl\" (UID: \"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49\") " pod="openstack/nova-api-af59-account-create-update-tk2tl"
Mar 20 09:07:53.651350 master-0 kubenswrapper[18707]: I0320 09:07:53.642756 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-fh7jr"]
Mar 20 09:07:53.651350 master-0 kubenswrapper[18707]: I0320 09:07:53.644607 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fh7jr"
Mar 20 09:07:53.659291 master-0 kubenswrapper[18707]: I0320 09:07:53.651979 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fh7jr"]
Mar 20 09:07:53.679787 master-0 kubenswrapper[18707]: I0320 09:07:53.679718 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hdkgz"
Mar 20 09:07:53.703598 master-0 kubenswrapper[18707]: I0320 09:07:53.698070 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-th2rs"
Mar 20 09:07:53.724212 master-0 kubenswrapper[18707]: I0320 09:07:53.706543 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2bb4fbf-4157-4051-bfbb-0b0a011fbc49-operator-scripts\") pod \"nova-api-af59-account-create-update-tk2tl\" (UID: \"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49\") " pod="openstack/nova-api-af59-account-create-update-tk2tl"
Mar 20 09:07:53.724212 master-0 kubenswrapper[18707]: I0320 09:07:53.706850 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cldzz\" (UniqueName: \"kubernetes.io/projected/e2bb4fbf-4157-4051-bfbb-0b0a011fbc49-kube-api-access-cldzz\") pod \"nova-api-af59-account-create-update-tk2tl\" (UID: \"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49\") " pod="openstack/nova-api-af59-account-create-update-tk2tl"
Mar 20 09:07:53.724212 master-0 kubenswrapper[18707]: I0320 09:07:53.707303 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2bb4fbf-4157-4051-bfbb-0b0a011fbc49-operator-scripts\") pod \"nova-api-af59-account-create-update-tk2tl\" (UID: \"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49\") " pod="openstack/nova-api-af59-account-create-update-tk2tl"
Mar 20 09:07:53.724212 master-0 kubenswrapper[18707]: I0320 09:07:53.715583 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-27086-default-external-api-0"
Mar 20 09:07:53.770204 master-0 kubenswrapper[18707]: I0320 09:07:53.769470 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cldzz\" (UniqueName: \"kubernetes.io/projected/e2bb4fbf-4157-4051-bfbb-0b0a011fbc49-kube-api-access-cldzz\") pod \"nova-api-af59-account-create-update-tk2tl\" (UID: \"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49\") " pod="openstack/nova-api-af59-account-create-update-tk2tl"
Mar 20 09:07:53.834310 master-0 kubenswrapper[18707]: I0320 09:07:53.832898 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-11a2-account-create-update-gkhvg"]
Mar 20 09:07:53.834310 master-0 kubenswrapper[18707]: E0320 09:07:53.833576 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd0581a-055a-4728-8979-e0cfd24dd897" containerName="glance-httpd"
Mar 20 09:07:53.834310 master-0 kubenswrapper[18707]: I0320 09:07:53.833592 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd0581a-055a-4728-8979-e0cfd24dd897" containerName="glance-httpd"
Mar 20 09:07:53.834310 master-0 kubenswrapper[18707]: E0320 09:07:53.833633 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd0581a-055a-4728-8979-e0cfd24dd897" containerName="glance-log"
Mar 20 09:07:53.834310 master-0 kubenswrapper[18707]: I0320 09:07:53.833639 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd0581a-055a-4728-8979-e0cfd24dd897" containerName="glance-log"
Mar 20 09:07:53.834310 master-0 kubenswrapper[18707]: I0320 09:07:53.833869 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd0581a-055a-4728-8979-e0cfd24dd897" containerName="glance-httpd"
Mar 20 09:07:53.834310 master-0 kubenswrapper[18707]: I0320 09:07:53.833910 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd0581a-055a-4728-8979-e0cfd24dd897" containerName="glance-log"
Mar 20 09:07:53.834809 master-0 kubenswrapper[18707]: I0320 09:07:53.834784 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-11a2-account-create-update-gkhvg"
Mar 20 09:07:53.835468 master-0 kubenswrapper[18707]: I0320 09:07:53.835408 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbd0581a-055a-4728-8979-e0cfd24dd897-httpd-run\") pod \"dbd0581a-055a-4728-8979-e0cfd24dd897\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") "
Mar 20 09:07:53.835591 master-0 kubenswrapper[18707]: I0320 09:07:53.835553 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-public-tls-certs\") pod \"dbd0581a-055a-4728-8979-e0cfd24dd897\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") "
Mar 20 09:07:53.835709 master-0 kubenswrapper[18707]: I0320 09:07:53.835675 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbd0581a-055a-4728-8979-e0cfd24dd897-logs\") pod \"dbd0581a-055a-4728-8979-e0cfd24dd897\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") "
Mar 20 09:07:53.835765 master-0 kubenswrapper[18707]: I0320 09:07:53.835742 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zlh8\" (UniqueName: \"kubernetes.io/projected/dbd0581a-055a-4728-8979-e0cfd24dd897-kube-api-access-5zlh8\") pod \"dbd0581a-055a-4728-8979-e0cfd24dd897\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") "
Mar 20 09:07:53.835846 master-0 kubenswrapper[18707]: I0320 09:07:53.835823 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-scripts\") pod \"dbd0581a-055a-4728-8979-e0cfd24dd897\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") "
Mar 20 09:07:53.839265 master-0 kubenswrapper[18707]: I0320 09:07:53.838705 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 20 09:07:53.839265 master-0 kubenswrapper[18707]: I0320 09:07:53.838961 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbd0581a-055a-4728-8979-e0cfd24dd897-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dbd0581a-055a-4728-8979-e0cfd24dd897" (UID: "dbd0581a-055a-4728-8979-e0cfd24dd897"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:07:53.839265 master-0 kubenswrapper[18707]: I0320 09:07:53.839111 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod \"dbd0581a-055a-4728-8979-e0cfd24dd897\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") "
Mar 20 09:07:53.839265 master-0 kubenswrapper[18707]: I0320 09:07:53.839222 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-config-data\") pod \"dbd0581a-055a-4728-8979-e0cfd24dd897\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") "
Mar 20 09:07:53.839867 master-0 kubenswrapper[18707]: I0320 09:07:53.839809 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbd0581a-055a-4728-8979-e0cfd24dd897-logs" (OuterVolumeSpecName: "logs") pod "dbd0581a-055a-4728-8979-e0cfd24dd897" (UID: "dbd0581a-055a-4728-8979-e0cfd24dd897"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:07:53.847758 master-0 kubenswrapper[18707]: I0320 09:07:53.839998 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-combined-ca-bundle\") pod \"dbd0581a-055a-4728-8979-e0cfd24dd897\" (UID: \"dbd0581a-055a-4728-8979-e0cfd24dd897\") "
Mar 20 09:07:53.847758 master-0 kubenswrapper[18707]: I0320 09:07:53.840632 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-11a2-account-create-update-gkhvg"]
Mar 20 09:07:53.847758 master-0 kubenswrapper[18707]: I0320 09:07:53.840783 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c91fb20b-d4f7-4c69-b78c-458e9122718d-operator-scripts\") pod \"nova-cell1-db-create-fh7jr\" (UID: \"c91fb20b-d4f7-4c69-b78c-458e9122718d\") " pod="openstack/nova-cell1-db-create-fh7jr"
Mar 20 09:07:53.847758 master-0 kubenswrapper[18707]: I0320 09:07:53.842232 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd0581a-055a-4728-8979-e0cfd24dd897-kube-api-access-5zlh8" (OuterVolumeSpecName: "kube-api-access-5zlh8") pod "dbd0581a-055a-4728-8979-e0cfd24dd897" (UID: "dbd0581a-055a-4728-8979-e0cfd24dd897"). InnerVolumeSpecName "kube-api-access-5zlh8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:07:53.847758 master-0 kubenswrapper[18707]: I0320 09:07:53.847140 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-scripts" (OuterVolumeSpecName: "scripts") pod "dbd0581a-055a-4728-8979-e0cfd24dd897" (UID: "dbd0581a-055a-4728-8979-e0cfd24dd897"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:07:53.848933 master-0 kubenswrapper[18707]: I0320 09:07:53.848621 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2trmh\" (UniqueName: \"kubernetes.io/projected/c91fb20b-d4f7-4c69-b78c-458e9122718d-kube-api-access-2trmh\") pod \"nova-cell1-db-create-fh7jr\" (UID: \"c91fb20b-d4f7-4c69-b78c-458e9122718d\") " pod="openstack/nova-cell1-db-create-fh7jr"
Mar 20 09:07:53.854558 master-0 kubenswrapper[18707]: I0320 09:07:53.851478 18707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dbd0581a-055a-4728-8979-e0cfd24dd897-httpd-run\") on node \"master-0\" DevicePath \"\""
Mar 20 09:07:53.854558 master-0 kubenswrapper[18707]: I0320 09:07:53.851534 18707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbd0581a-055a-4728-8979-e0cfd24dd897-logs\") on node \"master-0\" DevicePath \"\""
Mar 20 09:07:53.854558 master-0 kubenswrapper[18707]: I0320 09:07:53.851552 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zlh8\" (UniqueName: \"kubernetes.io/projected/dbd0581a-055a-4728-8979-e0cfd24dd897-kube-api-access-5zlh8\") on node \"master-0\" DevicePath \"\""
Mar 20 09:07:53.854558 master-0 kubenswrapper[18707]: I0320 09:07:53.851565 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-scripts\") on node \"master-0\" DevicePath \"\""
Mar 20 09:07:53.879521 master-0 kubenswrapper[18707]: I0320 09:07:53.876662 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e" (OuterVolumeSpecName: "glance") pod "dbd0581a-055a-4728-8979-e0cfd24dd897" (UID: "dbd0581a-055a-4728-8979-e0cfd24dd897"). InnerVolumeSpecName "pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 09:07:53.944230 master-0 kubenswrapper[18707]: I0320 09:07:53.944164 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbd0581a-055a-4728-8979-e0cfd24dd897" (UID: "dbd0581a-055a-4728-8979-e0cfd24dd897"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:07:53.956683 master-0 kubenswrapper[18707]: I0320 09:07:53.955868 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c91fb20b-d4f7-4c69-b78c-458e9122718d-operator-scripts\") pod \"nova-cell1-db-create-fh7jr\" (UID: \"c91fb20b-d4f7-4c69-b78c-458e9122718d\") " pod="openstack/nova-cell1-db-create-fh7jr"
Mar 20 09:07:53.956683 master-0 kubenswrapper[18707]: I0320 09:07:53.955922 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2trmh\" (UniqueName: \"kubernetes.io/projected/c91fb20b-d4f7-4c69-b78c-458e9122718d-kube-api-access-2trmh\") pod \"nova-cell1-db-create-fh7jr\" (UID: \"c91fb20b-d4f7-4c69-b78c-458e9122718d\") " pod="openstack/nova-cell1-db-create-fh7jr"
Mar 20 09:07:53.956683 master-0 kubenswrapper[18707]: I0320 09:07:53.956004 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsjh9\" (UniqueName: \"kubernetes.io/projected/322c240c-fbd4-40ea-80bf-cc8bf2611394-kube-api-access-hsjh9\") pod \"nova-cell0-11a2-account-create-update-gkhvg\" (UID: \"322c240c-fbd4-40ea-80bf-cc8bf2611394\") " pod="openstack/nova-cell0-11a2-account-create-update-gkhvg"
Mar 20 09:07:53.956683 master-0 kubenswrapper[18707]: I0320 09:07:53.956128 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/322c240c-fbd4-40ea-80bf-cc8bf2611394-operator-scripts\") pod \"nova-cell0-11a2-account-create-update-gkhvg\" (UID: \"322c240c-fbd4-40ea-80bf-cc8bf2611394\") " pod="openstack/nova-cell0-11a2-account-create-update-gkhvg"
Mar 20 09:07:53.961609 master-0 kubenswrapper[18707]: I0320 09:07:53.961552 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c91fb20b-d4f7-4c69-b78c-458e9122718d-operator-scripts\") pod \"nova-cell1-db-create-fh7jr\" (UID: \"c91fb20b-d4f7-4c69-b78c-458e9122718d\") " pod="openstack/nova-cell1-db-create-fh7jr"
Mar 20 09:07:53.961927 master-0 kubenswrapper[18707]: I0320 09:07:53.961727 18707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") on node \"master-0\" "
Mar 20 09:07:53.961927 master-0 kubenswrapper[18707]: I0320 09:07:53.961749 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:07:53.972615 master-0 kubenswrapper[18707]: I0320 09:07:53.972587 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3995-account-create-update-7pfxc"]
Mar 20 09:07:53.974548 master-0 kubenswrapper[18707]: I0320 09:07:53.974532 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3995-account-create-update-7pfxc"
Mar 20 09:07:53.987139 master-0 kubenswrapper[18707]: I0320 09:07:53.986969 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 20 09:07:54.001315 master-0 kubenswrapper[18707]: I0320 09:07:54.001259 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-af59-account-create-update-tk2tl"
Mar 20 09:07:54.001521 master-0 kubenswrapper[18707]: I0320 09:07:54.001376 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-config-data" (OuterVolumeSpecName: "config-data") pod "dbd0581a-055a-4728-8979-e0cfd24dd897" (UID: "dbd0581a-055a-4728-8979-e0cfd24dd897"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:07:54.011760 master-0 kubenswrapper[18707]: I0320 09:07:54.011714 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dbd0581a-055a-4728-8979-e0cfd24dd897" (UID: "dbd0581a-055a-4728-8979-e0cfd24dd897"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:07:54.045111 master-0 kubenswrapper[18707]: I0320 09:07:54.044927 18707 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 20 09:07:54.046770 master-0 kubenswrapper[18707]: I0320 09:07:54.046731 18707 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647" (UniqueName: "kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e") on node "master-0" Mar 20 09:07:54.064059 master-0 kubenswrapper[18707]: I0320 09:07:54.063915 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00e361f2-f12a-4354-b907-8575930aee6b-operator-scripts\") pod \"nova-cell1-3995-account-create-update-7pfxc\" (UID: \"00e361f2-f12a-4354-b907-8575930aee6b\") " pod="openstack/nova-cell1-3995-account-create-update-7pfxc" Mar 20 09:07:54.064059 master-0 kubenswrapper[18707]: I0320 09:07:54.063999 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsjh9\" (UniqueName: \"kubernetes.io/projected/322c240c-fbd4-40ea-80bf-cc8bf2611394-kube-api-access-hsjh9\") pod \"nova-cell0-11a2-account-create-update-gkhvg\" (UID: \"322c240c-fbd4-40ea-80bf-cc8bf2611394\") " pod="openstack/nova-cell0-11a2-account-create-update-gkhvg" Mar 20 09:07:54.064386 master-0 kubenswrapper[18707]: I0320 09:07:54.064112 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6xxt\" (UniqueName: \"kubernetes.io/projected/00e361f2-f12a-4354-b907-8575930aee6b-kube-api-access-k6xxt\") pod \"nova-cell1-3995-account-create-update-7pfxc\" (UID: \"00e361f2-f12a-4354-b907-8575930aee6b\") " pod="openstack/nova-cell1-3995-account-create-update-7pfxc" Mar 20 09:07:54.064386 master-0 kubenswrapper[18707]: I0320 09:07:54.064141 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/322c240c-fbd4-40ea-80bf-cc8bf2611394-operator-scripts\") pod \"nova-cell0-11a2-account-create-update-gkhvg\" (UID: 
\"322c240c-fbd4-40ea-80bf-cc8bf2611394\") " pod="openstack/nova-cell0-11a2-account-create-update-gkhvg" Mar 20 09:07:54.064386 master-0 kubenswrapper[18707]: I0320 09:07:54.064312 18707 reconciler_common.go:293] "Volume detached for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:54.064386 master-0 kubenswrapper[18707]: I0320 09:07:54.064327 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:54.064386 master-0 kubenswrapper[18707]: I0320 09:07:54.064337 18707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbd0581a-055a-4728-8979-e0cfd24dd897-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:54.065278 master-0 kubenswrapper[18707]: I0320 09:07:54.065221 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/322c240c-fbd4-40ea-80bf-cc8bf2611394-operator-scripts\") pod \"nova-cell0-11a2-account-create-update-gkhvg\" (UID: \"322c240c-fbd4-40ea-80bf-cc8bf2611394\") " pod="openstack/nova-cell0-11a2-account-create-update-gkhvg" Mar 20 09:07:54.152838 master-0 kubenswrapper[18707]: I0320 09:07:54.152746 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3995-account-create-update-7pfxc"] Mar 20 09:07:54.157527 master-0 kubenswrapper[18707]: I0320 09:07:54.157217 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsjh9\" (UniqueName: \"kubernetes.io/projected/322c240c-fbd4-40ea-80bf-cc8bf2611394-kube-api-access-hsjh9\") pod \"nova-cell0-11a2-account-create-update-gkhvg\" (UID: \"322c240c-fbd4-40ea-80bf-cc8bf2611394\") " 
pod="openstack/nova-cell0-11a2-account-create-update-gkhvg" Mar 20 09:07:54.170133 master-0 kubenswrapper[18707]: I0320 09:07:54.168988 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00e361f2-f12a-4354-b907-8575930aee6b-operator-scripts\") pod \"nova-cell1-3995-account-create-update-7pfxc\" (UID: \"00e361f2-f12a-4354-b907-8575930aee6b\") " pod="openstack/nova-cell1-3995-account-create-update-7pfxc" Mar 20 09:07:54.170133 master-0 kubenswrapper[18707]: I0320 09:07:54.169336 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6xxt\" (UniqueName: \"kubernetes.io/projected/00e361f2-f12a-4354-b907-8575930aee6b-kube-api-access-k6xxt\") pod \"nova-cell1-3995-account-create-update-7pfxc\" (UID: \"00e361f2-f12a-4354-b907-8575930aee6b\") " pod="openstack/nova-cell1-3995-account-create-update-7pfxc" Mar 20 09:07:54.170683 master-0 kubenswrapper[18707]: I0320 09:07:54.170332 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00e361f2-f12a-4354-b907-8575930aee6b-operator-scripts\") pod \"nova-cell1-3995-account-create-update-7pfxc\" (UID: \"00e361f2-f12a-4354-b907-8575930aee6b\") " pod="openstack/nova-cell1-3995-account-create-update-7pfxc" Mar 20 09:07:54.185291 master-0 kubenswrapper[18707]: I0320 09:07:54.185005 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2trmh\" (UniqueName: \"kubernetes.io/projected/c91fb20b-d4f7-4c69-b78c-458e9122718d-kube-api-access-2trmh\") pod \"nova-cell1-db-create-fh7jr\" (UID: \"c91fb20b-d4f7-4c69-b78c-458e9122718d\") " pod="openstack/nova-cell1-db-create-fh7jr" Mar 20 09:07:54.203426 master-0 kubenswrapper[18707]: I0320 09:07:54.203201 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-11a2-account-create-update-gkhvg" Mar 20 09:07:54.371889 master-0 kubenswrapper[18707]: I0320 09:07:54.371837 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fh7jr" Mar 20 09:07:54.441148 master-0 kubenswrapper[18707]: I0320 09:07:54.440904 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 09:07:54.450220 master-0 kubenswrapper[18707]: I0320 09:07:54.450082 18707 generic.go:334] "Generic (PLEG): container finished" podID="dcee9936-db51-42d9-ada7-279cec63d27e" containerID="0b3e345b6cd271ceb5c5d53750e1102868ebb3da5bfa88101d7a166c578666dd" exitCode=0 Mar 20 09:07:54.450220 master-0 kubenswrapper[18707]: I0320 09:07:54.450142 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-internal-api-0" event={"ID":"dcee9936-db51-42d9-ada7-279cec63d27e","Type":"ContainerDied","Data":"0b3e345b6cd271ceb5c5d53750e1102868ebb3da5bfa88101d7a166c578666dd"} Mar 20 09:07:54.452043 master-0 kubenswrapper[18707]: I0320 09:07:54.451983 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-external-api-0" event={"ID":"dbd0581a-055a-4728-8979-e0cfd24dd897","Type":"ContainerDied","Data":"3ed72e9fb703d6eef89fbb39d6abf266470f90b00500eaccc685f161e32630ce"} Mar 20 09:07:54.452111 master-0 kubenswrapper[18707]: I0320 09:07:54.452059 18707 scope.go:117] "RemoveContainer" containerID="52a8b1a6a8e5d9b43c350b60899b2648676b53d28fa03fd9d17da7a7480b7c39" Mar 20 09:07:54.452111 master-0 kubenswrapper[18707]: I0320 09:07:54.452013 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:54.565329 master-0 kubenswrapper[18707]: I0320 09:07:54.565080 18707 scope.go:117] "RemoveContainer" containerID="07e92ed41c0a2abbb89f39a0f628c94ad2999142ab7519ad10f78eb258075752" Mar 20 09:07:54.897270 master-0 kubenswrapper[18707]: I0320 09:07:54.895582 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6xxt\" (UniqueName: \"kubernetes.io/projected/00e361f2-f12a-4354-b907-8575930aee6b-kube-api-access-k6xxt\") pod \"nova-cell1-3995-account-create-update-7pfxc\" (UID: \"00e361f2-f12a-4354-b907-8575930aee6b\") " pod="openstack/nova-cell1-3995-account-create-update-7pfxc" Mar 20 09:07:54.923589 master-0 kubenswrapper[18707]: I0320 09:07:54.923502 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3995-account-create-update-7pfxc" Mar 20 09:07:55.394555 master-0 kubenswrapper[18707]: I0320 09:07:55.390616 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-th2rs"] Mar 20 09:07:55.407046 master-0 kubenswrapper[18707]: I0320 09:07:55.406730 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hdkgz"] Mar 20 09:07:55.426299 master-0 kubenswrapper[18707]: W0320 09:07:55.425704 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod322c240c_fbd4_40ea_80bf_cc8bf2611394.slice/crio-1e8cc915e86a914ee7ca4c136c68a969ad74279043f7d3874a4265b6019a0eec WatchSource:0}: Error finding container 1e8cc915e86a914ee7ca4c136c68a969ad74279043f7d3874a4265b6019a0eec: Status 404 returned error can't find the container with id 1e8cc915e86a914ee7ca4c136c68a969ad74279043f7d3874a4265b6019a0eec Mar 20 09:07:55.431874 master-0 kubenswrapper[18707]: W0320 09:07:55.431827 18707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2bb4fbf_4157_4051_bfbb_0b0a011fbc49.slice/crio-a94983d1fe65129b2ce3fb84977e176b6fcfa2a03670ff9e78971291fde50c8a WatchSource:0}: Error finding container a94983d1fe65129b2ce3fb84977e176b6fcfa2a03670ff9e78971291fde50c8a: Status 404 returned error can't find the container with id a94983d1fe65129b2ce3fb84977e176b6fcfa2a03670ff9e78971291fde50c8a Mar 20 09:07:55.434345 master-0 kubenswrapper[18707]: I0320 09:07:55.434299 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-af59-account-create-update-tk2tl"] Mar 20 09:07:55.445570 master-0 kubenswrapper[18707]: I0320 09:07:55.445490 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fh7jr"] Mar 20 09:07:55.465962 master-0 kubenswrapper[18707]: I0320 09:07:55.465839 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-11a2-account-create-update-gkhvg"] Mar 20 09:07:55.466729 master-0 kubenswrapper[18707]: I0320 09:07:55.466684 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-af59-account-create-update-tk2tl" event={"ID":"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49","Type":"ContainerStarted","Data":"a94983d1fe65129b2ce3fb84977e176b6fcfa2a03670ff9e78971291fde50c8a"} Mar 20 09:07:55.472110 master-0 kubenswrapper[18707]: I0320 09:07:55.469069 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fh7jr" event={"ID":"c91fb20b-d4f7-4c69-b78c-458e9122718d","Type":"ContainerStarted","Data":"3ce37ea0fda3c42157346a211ce2a8ad2552ff1f96dd458846a3f3b8dccb196e"} Mar 20 09:07:55.472110 master-0 kubenswrapper[18707]: I0320 09:07:55.471610 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-11a2-account-create-update-gkhvg" event={"ID":"322c240c-fbd4-40ea-80bf-cc8bf2611394","Type":"ContainerStarted","Data":"1e8cc915e86a914ee7ca4c136c68a969ad74279043f7d3874a4265b6019a0eec"} Mar 20 
09:07:55.478783 master-0 kubenswrapper[18707]: I0320 09:07:55.478745 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 09:07:55.479500 master-0 kubenswrapper[18707]: I0320 09:07:55.479406 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 09:07:55.491833 master-0 kubenswrapper[18707]: I0320 09:07:55.491700 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hdkgz" event={"ID":"2a210b09-ea3d-4841-9cb3-f86a7f93985e","Type":"ContainerStarted","Data":"20a1b5baa3699da09a461312bff05ce3f520e831c8e72117e0b19a5f01f83067"} Mar 20 09:07:55.855483 master-0 kubenswrapper[18707]: I0320 09:07:55.855404 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-27086-default-external-api-0"] Mar 20 09:07:56.082290 master-0 kubenswrapper[18707]: I0320 09:07:56.082215 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3995-account-create-update-7pfxc"] Mar 20 09:07:56.102372 master-0 kubenswrapper[18707]: I0320 09:07:56.102306 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 09:07:56.341805 master-0 kubenswrapper[18707]: I0320 09:07:56.335212 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:07:56.454382 master-0 kubenswrapper[18707]: I0320 09:07:56.454318 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-config-data\") pod \"dcee9936-db51-42d9-ada7-279cec63d27e\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " Mar 20 09:07:56.455047 master-0 kubenswrapper[18707]: I0320 09:07:56.454429 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dcee9936-db51-42d9-ada7-279cec63d27e-httpd-run\") pod \"dcee9936-db51-42d9-ada7-279cec63d27e\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " Mar 20 09:07:56.455047 master-0 kubenswrapper[18707]: I0320 09:07:56.454561 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcee9936-db51-42d9-ada7-279cec63d27e-logs\") pod \"dcee9936-db51-42d9-ada7-279cec63d27e\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " Mar 20 09:07:56.455047 master-0 kubenswrapper[18707]: I0320 09:07:56.454959 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcee9936-db51-42d9-ada7-279cec63d27e-logs" (OuterVolumeSpecName: "logs") pod "dcee9936-db51-42d9-ada7-279cec63d27e" (UID: "dcee9936-db51-42d9-ada7-279cec63d27e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:07:56.455280 master-0 kubenswrapper[18707]: I0320 09:07:56.455174 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcee9936-db51-42d9-ada7-279cec63d27e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dcee9936-db51-42d9-ada7-279cec63d27e" (UID: "dcee9936-db51-42d9-ada7-279cec63d27e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:07:56.457751 master-0 kubenswrapper[18707]: I0320 09:07:56.457596 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") pod \"dcee9936-db51-42d9-ada7-279cec63d27e\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " Mar 20 09:07:56.457751 master-0 kubenswrapper[18707]: I0320 09:07:56.457742 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59x5s\" (UniqueName: \"kubernetes.io/projected/dcee9936-db51-42d9-ada7-279cec63d27e-kube-api-access-59x5s\") pod \"dcee9936-db51-42d9-ada7-279cec63d27e\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " Mar 20 09:07:56.457950 master-0 kubenswrapper[18707]: I0320 09:07:56.457919 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-internal-tls-certs\") pod \"dcee9936-db51-42d9-ada7-279cec63d27e\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " Mar 20 09:07:56.458387 master-0 kubenswrapper[18707]: I0320 09:07:56.458328 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-combined-ca-bundle\") pod \"dcee9936-db51-42d9-ada7-279cec63d27e\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " Mar 20 09:07:56.458466 master-0 kubenswrapper[18707]: I0320 09:07:56.458420 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-scripts\") pod \"dcee9936-db51-42d9-ada7-279cec63d27e\" (UID: \"dcee9936-db51-42d9-ada7-279cec63d27e\") " Mar 20 09:07:56.459976 master-0 kubenswrapper[18707]: I0320 09:07:56.459927 18707 reconciler_common.go:293] 
"Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dcee9936-db51-42d9-ada7-279cec63d27e-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:56.459976 master-0 kubenswrapper[18707]: I0320 09:07:56.459965 18707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcee9936-db51-42d9-ada7-279cec63d27e-logs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:56.461569 master-0 kubenswrapper[18707]: I0320 09:07:56.461531 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-scripts" (OuterVolumeSpecName: "scripts") pod "dcee9936-db51-42d9-ada7-279cec63d27e" (UID: "dcee9936-db51-42d9-ada7-279cec63d27e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:56.461902 master-0 kubenswrapper[18707]: I0320 09:07:56.461874 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcee9936-db51-42d9-ada7-279cec63d27e-kube-api-access-59x5s" (OuterVolumeSpecName: "kube-api-access-59x5s") pod "dcee9936-db51-42d9-ada7-279cec63d27e" (UID: "dcee9936-db51-42d9-ada7-279cec63d27e"). InnerVolumeSpecName "kube-api-access-59x5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:56.484865 master-0 kubenswrapper[18707]: I0320 09:07:56.484795 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcee9936-db51-42d9-ada7-279cec63d27e" (UID: "dcee9936-db51-42d9-ada7-279cec63d27e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:56.508155 master-0 kubenswrapper[18707]: I0320 09:07:56.508024 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hdkgz" event={"ID":"2a210b09-ea3d-4841-9cb3-f86a7f93985e","Type":"ContainerStarted","Data":"efadccc7335ef5746b901dfd94c7b880f33f291b80645babf0af8bc0f20da7a6"} Mar 20 09:07:56.509273 master-0 kubenswrapper[18707]: I0320 09:07:56.509240 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-config-data" (OuterVolumeSpecName: "config-data") pod "dcee9936-db51-42d9-ada7-279cec63d27e" (UID: "dcee9936-db51-42d9-ada7-279cec63d27e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:56.510934 master-0 kubenswrapper[18707]: I0320 09:07:56.510890 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3995-account-create-update-7pfxc" event={"ID":"00e361f2-f12a-4354-b907-8575930aee6b","Type":"ContainerStarted","Data":"efd0cd3870d7bb5c4d1284f37fcb40d97b3cff58096055662b090cda6ca21dc4"} Mar 20 09:07:56.513407 master-0 kubenswrapper[18707]: I0320 09:07:56.513270 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-internal-api-0" event={"ID":"dcee9936-db51-42d9-ada7-279cec63d27e","Type":"ContainerDied","Data":"7ec36585b68818943f960e71df365caced8449fdda07a25674d1a668effffffd"} Mar 20 09:07:56.513407 master-0 kubenswrapper[18707]: I0320 09:07:56.513316 18707 scope.go:117] "RemoveContainer" containerID="0b3e345b6cd271ceb5c5d53750e1102868ebb3da5bfa88101d7a166c578666dd" Mar 20 09:07:56.513571 master-0 kubenswrapper[18707]: I0320 09:07:56.513466 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:07:56.516245 master-0 kubenswrapper[18707]: I0320 09:07:56.516212 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-th2rs" event={"ID":"7a88fa32-c356-4dfe-abb9-d342391d52de","Type":"ContainerStarted","Data":"3781b977803f4f55d01defab452e08559aa9c24cf54e36e6cda6472c873010f9"} Mar 20 09:07:56.516245 master-0 kubenswrapper[18707]: I0320 09:07:56.516243 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-th2rs" event={"ID":"7a88fa32-c356-4dfe-abb9-d342391d52de","Type":"ContainerStarted","Data":"1e05f0b575a19ec2cd0498827fee3c523b7e8475173e1b2ea287b8886183da6d"} Mar 20 09:07:56.518839 master-0 kubenswrapper[18707]: I0320 09:07:56.518552 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-af59-account-create-update-tk2tl" event={"ID":"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49","Type":"ContainerStarted","Data":"0b0a32a280f6f9c928c440482a6960695f1264a73f0430e6bc1d433ea50c9e0d"} Mar 20 09:07:56.520996 master-0 kubenswrapper[18707]: I0320 09:07:56.520962 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fh7jr" event={"ID":"c91fb20b-d4f7-4c69-b78c-458e9122718d","Type":"ContainerStarted","Data":"3292113ebdf4562c8aa9de4a76cec8de9c8fa34dcde6904df11f1fe55a43489d"} Mar 20 09:07:56.523030 master-0 kubenswrapper[18707]: I0320 09:07:56.522996 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-11a2-account-create-update-gkhvg" event={"ID":"322c240c-fbd4-40ea-80bf-cc8bf2611394","Type":"ContainerStarted","Data":"b1b3eb57ad082df997cd23e2063a58bf55379fb798326e0a02544c62fe451c2f"} Mar 20 09:07:56.534863 master-0 kubenswrapper[18707]: I0320 09:07:56.534803 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-internal-tls-certs" (OuterVolumeSpecName: 
"internal-tls-certs") pod "dcee9936-db51-42d9-ada7-279cec63d27e" (UID: "dcee9936-db51-42d9-ada7-279cec63d27e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:56.561822 master-0 kubenswrapper[18707]: I0320 09:07:56.561760 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:56.561822 master-0 kubenswrapper[18707]: I0320 09:07:56.561813 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:56.561990 master-0 kubenswrapper[18707]: I0320 09:07:56.561830 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59x5s\" (UniqueName: \"kubernetes.io/projected/dcee9936-db51-42d9-ada7-279cec63d27e-kube-api-access-59x5s\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:56.561990 master-0 kubenswrapper[18707]: I0320 09:07:56.561844 18707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:56.561990 master-0 kubenswrapper[18707]: I0320 09:07:56.561857 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcee9936-db51-42d9-ada7-279cec63d27e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:56.626119 master-0 kubenswrapper[18707]: I0320 09:07:56.626070 18707 scope.go:117] "RemoveContainer" containerID="188e328723a831b8c019606c55e41bb77c059e030357cb18f0d149f0569d2036" Mar 20 09:07:56.636019 master-0 kubenswrapper[18707]: I0320 09:07:56.635968 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526" (OuterVolumeSpecName: "glance") pod "dcee9936-db51-42d9-ada7-279cec63d27e" (UID: "dcee9936-db51-42d9-ada7-279cec63d27e"). InnerVolumeSpecName "pvc-ed7fc861-4795-473d-84e6-66068cd18122". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 09:07:56.663821 master-0 kubenswrapper[18707]: I0320 09:07:56.663764 18707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-ed7fc861-4795-473d-84e6-66068cd18122\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") on node \"master-0\" " Mar 20 09:07:56.706055 master-0 kubenswrapper[18707]: I0320 09:07:56.705998 18707 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 20 09:07:56.706312 master-0 kubenswrapper[18707]: I0320 09:07:56.706284 18707 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-ed7fc861-4795-473d-84e6-66068cd18122" (UniqueName: "kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526") on node "master-0" Mar 20 09:07:56.766210 master-0 kubenswrapper[18707]: I0320 09:07:56.766049 18707 reconciler_common.go:293] "Volume detached for volume \"pvc-ed7fc861-4795-473d-84e6-66068cd18122\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:56.888560 master-0 kubenswrapper[18707]: I0320 09:07:56.888489 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-27086-default-external-api-0"] Mar 20 09:07:57.122136 master-0 kubenswrapper[18707]: I0320 09:07:57.122052 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd0581a-055a-4728-8979-e0cfd24dd897" path="/var/lib/kubelet/pods/dbd0581a-055a-4728-8979-e0cfd24dd897/volumes" Mar 20 09:07:57.439903 master-0 kubenswrapper[18707]: I0320 09:07:57.439685 18707 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 09:07:57.440087 master-0 kubenswrapper[18707]: I0320 09:07:57.439928 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 09:07:57.522288 master-0 kubenswrapper[18707]: I0320 09:07:57.522203 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-27086-default-external-api-0"] Mar 20 09:07:57.522989 master-0 kubenswrapper[18707]: E0320 09:07:57.522954 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcee9936-db51-42d9-ada7-279cec63d27e" containerName="glance-httpd" Mar 20 09:07:57.522989 master-0 kubenswrapper[18707]: I0320 09:07:57.522982 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcee9936-db51-42d9-ada7-279cec63d27e" containerName="glance-httpd" Mar 20 09:07:57.523066 master-0 kubenswrapper[18707]: E0320 09:07:57.523026 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcee9936-db51-42d9-ada7-279cec63d27e" containerName="glance-log" Mar 20 09:07:57.523066 master-0 kubenswrapper[18707]: I0320 09:07:57.523039 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcee9936-db51-42d9-ada7-279cec63d27e" containerName="glance-log" Mar 20 09:07:57.523448 master-0 kubenswrapper[18707]: I0320 09:07:57.523424 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcee9936-db51-42d9-ada7-279cec63d27e" containerName="glance-log" Mar 20 09:07:57.523522 master-0 kubenswrapper[18707]: I0320 09:07:57.523503 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcee9936-db51-42d9-ada7-279cec63d27e" containerName="glance-httpd" Mar 20 09:07:57.525380 
master-0 kubenswrapper[18707]: I0320 09:07:57.525127 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:57.527723 master-0 kubenswrapper[18707]: I0320 09:07:57.527675 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 09:07:57.527973 master-0 kubenswrapper[18707]: I0320 09:07:57.527948 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-27086-default-external-config-data" Mar 20 09:07:57.528133 master-0 kubenswrapper[18707]: I0320 09:07:57.528103 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 09:07:57.559714 master-0 kubenswrapper[18707]: I0320 09:07:57.559647 18707 generic.go:334] "Generic (PLEG): container finished" podID="2dafeade-6a38-4134-ac1c-7331ea038aec" containerID="1dda1c268f511fb2abda022887e5169f2df07d603ed922eb6f445725a0b163af" exitCode=0 Mar 20 09:07:57.559936 master-0 kubenswrapper[18707]: I0320 09:07:57.559766 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8b47bf878-2ftwn" event={"ID":"2dafeade-6a38-4134-ac1c-7331ea038aec","Type":"ContainerDied","Data":"1dda1c268f511fb2abda022887e5169f2df07d603ed922eb6f445725a0b163af"} Mar 20 09:07:57.562359 master-0 kubenswrapper[18707]: I0320 09:07:57.562312 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3995-account-create-update-7pfxc" event={"ID":"00e361f2-f12a-4354-b907-8575930aee6b","Type":"ContainerStarted","Data":"75b6c5368bf2ea838b4e8fc4f30d310cf7658e15c2f592fe7eb4ed10519d5bde"} Mar 20 09:07:57.664149 master-0 kubenswrapper[18707]: I0320 09:07:57.664063 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:07:58.103347 master-0 kubenswrapper[18707]: I0320 09:07:58.103288 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-internal-tls-certs\") pod \"2dafeade-6a38-4134-ac1c-7331ea038aec\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " Mar 20 09:07:58.103737 master-0 kubenswrapper[18707]: I0320 09:07:58.103705 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-combined-ca-bundle\") pod \"2dafeade-6a38-4134-ac1c-7331ea038aec\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " Mar 20 09:07:58.103947 master-0 kubenswrapper[18707]: I0320 09:07:58.103926 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-scripts\") pod \"2dafeade-6a38-4134-ac1c-7331ea038aec\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " Mar 20 09:07:58.104107 master-0 kubenswrapper[18707]: I0320 09:07:58.104090 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-config-data\") pod \"2dafeade-6a38-4134-ac1c-7331ea038aec\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " Mar 20 09:07:58.104258 master-0 kubenswrapper[18707]: I0320 09:07:58.104239 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m99fb\" (UniqueName: \"kubernetes.io/projected/2dafeade-6a38-4134-ac1c-7331ea038aec-kube-api-access-m99fb\") pod \"2dafeade-6a38-4134-ac1c-7331ea038aec\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " Mar 20 09:07:58.104383 master-0 kubenswrapper[18707]: I0320 09:07:58.104367 18707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dafeade-6a38-4134-ac1c-7331ea038aec-logs\") pod \"2dafeade-6a38-4134-ac1c-7331ea038aec\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " Mar 20 09:07:58.104632 master-0 kubenswrapper[18707]: I0320 09:07:58.104614 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-public-tls-certs\") pod \"2dafeade-6a38-4134-ac1c-7331ea038aec\" (UID: \"2dafeade-6a38-4134-ac1c-7331ea038aec\") " Mar 20 09:07:58.105148 master-0 kubenswrapper[18707]: I0320 09:07:58.105122 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9292cf08-615f-430e-a059-a2d17257282f-combined-ca-bundle\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.105366 master-0 kubenswrapper[18707]: I0320 09:07:58.105346 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9292cf08-615f-430e-a059-a2d17257282f-public-tls-certs\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.105520 master-0 kubenswrapper[18707]: I0320 09:07:58.105498 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2jmp\" (UniqueName: \"kubernetes.io/projected/9292cf08-615f-430e-a059-a2d17257282f-kube-api-access-j2jmp\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.105710 master-0 
kubenswrapper[18707]: I0320 09:07:58.105689 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9292cf08-615f-430e-a059-a2d17257282f-httpd-run\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.105822 master-0 kubenswrapper[18707]: I0320 09:07:58.105803 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9292cf08-615f-430e-a059-a2d17257282f-config-data\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.105993 master-0 kubenswrapper[18707]: I0320 09:07:58.105972 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9292cf08-615f-430e-a059-a2d17257282f-scripts\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.106147 master-0 kubenswrapper[18707]: I0320 09:07:58.106128 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9292cf08-615f-430e-a059-a2d17257282f-logs\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.106336 master-0 kubenswrapper[18707]: I0320 09:07:58.106316 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod 
\"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.109730 master-0 kubenswrapper[18707]: I0320 09:07:58.109672 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-scripts" (OuterVolumeSpecName: "scripts") pod "2dafeade-6a38-4134-ac1c-7331ea038aec" (UID: "2dafeade-6a38-4134-ac1c-7331ea038aec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:58.114801 master-0 kubenswrapper[18707]: I0320 09:07:58.114363 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dafeade-6a38-4134-ac1c-7331ea038aec-logs" (OuterVolumeSpecName: "logs") pod "2dafeade-6a38-4134-ac1c-7331ea038aec" (UID: "2dafeade-6a38-4134-ac1c-7331ea038aec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:07:58.126151 master-0 kubenswrapper[18707]: I0320 09:07:58.121354 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dafeade-6a38-4134-ac1c-7331ea038aec-kube-api-access-m99fb" (OuterVolumeSpecName: "kube-api-access-m99fb") pod "2dafeade-6a38-4134-ac1c-7331ea038aec" (UID: "2dafeade-6a38-4134-ac1c-7331ea038aec"). InnerVolumeSpecName "kube-api-access-m99fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:58.129407 master-0 kubenswrapper[18707]: I0320 09:07:58.129325 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-27086-default-external-api-0"] Mar 20 09:07:58.198942 master-0 kubenswrapper[18707]: I0320 09:07:58.198883 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-config-data" (OuterVolumeSpecName: "config-data") pod "2dafeade-6a38-4134-ac1c-7331ea038aec" (UID: "2dafeade-6a38-4134-ac1c-7331ea038aec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:58.210032 master-0 kubenswrapper[18707]: I0320 09:07:58.209174 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9292cf08-615f-430e-a059-a2d17257282f-combined-ca-bundle\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.213013 master-0 kubenswrapper[18707]: I0320 09:07:58.210300 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9292cf08-615f-430e-a059-a2d17257282f-public-tls-certs\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.213013 master-0 kubenswrapper[18707]: I0320 09:07:58.210364 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2jmp\" (UniqueName: \"kubernetes.io/projected/9292cf08-615f-430e-a059-a2d17257282f-kube-api-access-j2jmp\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.213013 master-0 
kubenswrapper[18707]: I0320 09:07:58.210441 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9292cf08-615f-430e-a059-a2d17257282f-httpd-run\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.213013 master-0 kubenswrapper[18707]: I0320 09:07:58.210467 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9292cf08-615f-430e-a059-a2d17257282f-config-data\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.213013 master-0 kubenswrapper[18707]: I0320 09:07:58.211122 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9292cf08-615f-430e-a059-a2d17257282f-httpd-run\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.213013 master-0 kubenswrapper[18707]: I0320 09:07:58.211226 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9292cf08-615f-430e-a059-a2d17257282f-scripts\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.213013 master-0 kubenswrapper[18707]: I0320 09:07:58.211343 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9292cf08-615f-430e-a059-a2d17257282f-logs\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 
09:07:58.213013 master-0 kubenswrapper[18707]: I0320 09:07:58.211472 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:58.213013 master-0 kubenswrapper[18707]: I0320 09:07:58.211490 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:58.213013 master-0 kubenswrapper[18707]: I0320 09:07:58.211504 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m99fb\" (UniqueName: \"kubernetes.io/projected/2dafeade-6a38-4134-ac1c-7331ea038aec-kube-api-access-m99fb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:58.213013 master-0 kubenswrapper[18707]: I0320 09:07:58.211516 18707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dafeade-6a38-4134-ac1c-7331ea038aec-logs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:58.213013 master-0 kubenswrapper[18707]: I0320 09:07:58.211817 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9292cf08-615f-430e-a059-a2d17257282f-logs\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.214404 master-0 kubenswrapper[18707]: I0320 09:07:58.214194 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dafeade-6a38-4134-ac1c-7331ea038aec" (UID: "2dafeade-6a38-4134-ac1c-7331ea038aec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:58.215574 master-0 kubenswrapper[18707]: I0320 09:07:58.215513 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9292cf08-615f-430e-a059-a2d17257282f-public-tls-certs\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.215660 master-0 kubenswrapper[18707]: I0320 09:07:58.215551 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9292cf08-615f-430e-a059-a2d17257282f-config-data\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.217050 master-0 kubenswrapper[18707]: I0320 09:07:58.217015 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9292cf08-615f-430e-a059-a2d17257282f-scripts\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.217292 master-0 kubenswrapper[18707]: I0320 09:07:58.217265 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9292cf08-615f-430e-a059-a2d17257282f-combined-ca-bundle\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.262746 master-0 kubenswrapper[18707]: I0320 09:07:58.262527 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2dafeade-6a38-4134-ac1c-7331ea038aec" (UID: 
"2dafeade-6a38-4134-ac1c-7331ea038aec"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:58.280454 master-0 kubenswrapper[18707]: I0320 09:07:58.280382 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2dafeade-6a38-4134-ac1c-7331ea038aec" (UID: "2dafeade-6a38-4134-ac1c-7331ea038aec"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:58.318660 master-0 kubenswrapper[18707]: I0320 09:07:58.318575 18707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:58.318660 master-0 kubenswrapper[18707]: I0320 09:07:58.318639 18707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:58.318660 master-0 kubenswrapper[18707]: I0320 09:07:58.318654 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dafeade-6a38-4134-ac1c-7331ea038aec-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:07:58.344812 master-0 kubenswrapper[18707]: I0320 09:07:58.344699 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-th2rs" podStartSLOduration=5.344655898 podStartE2EDuration="5.344655898s" podCreationTimestamp="2026-03-20 09:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:07:58.341362644 +0000 UTC m=+1623.497543000" watchObservedRunningTime="2026-03-20 
09:07:58.344655898 +0000 UTC m=+1623.500836254" Mar 20 09:07:58.421405 master-0 kubenswrapper[18707]: I0320 09:07:58.421340 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.429605 master-0 kubenswrapper[18707]: I0320 09:07:58.429547 18707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 09:07:58.429831 master-0 kubenswrapper[18707]: I0320 09:07:58.429606 18707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/83e46ef83a145054ad4570ca2d941bb2fcebc41478e778d97e5839befd9dd6f3/globalmount\"" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.438733 master-0 kubenswrapper[18707]: I0320 09:07:58.438689 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 09:07:58.439919 master-0 kubenswrapper[18707]: I0320 09:07:58.439900 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" podUID="484602b5-3f62-4bb5-bfc4-5a0ab4f56c8d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 09:07:58.592329 master-0 kubenswrapper[18707]: I0320 09:07:58.590803 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2jmp\" (UniqueName: \"kubernetes.io/projected/9292cf08-615f-430e-a059-a2d17257282f-kube-api-access-j2jmp\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:58.594609 master-0 kubenswrapper[18707]: I0320 09:07:58.594555 18707 generic.go:334] "Generic (PLEG): container finished" podID="e2bb4fbf-4157-4051-bfbb-0b0a011fbc49" containerID="0b0a32a280f6f9c928c440482a6960695f1264a73f0430e6bc1d433ea50c9e0d" exitCode=0 Mar 20 09:07:58.594703 master-0 kubenswrapper[18707]: I0320 09:07:58.594632 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-af59-account-create-update-tk2tl" event={"ID":"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49","Type":"ContainerDied","Data":"0b0a32a280f6f9c928c440482a6960695f1264a73f0430e6bc1d433ea50c9e0d"} Mar 20 09:07:58.598715 master-0 kubenswrapper[18707]: I0320 09:07:58.598664 18707 generic.go:334] "Generic (PLEG): container finished" podID="c91fb20b-d4f7-4c69-b78c-458e9122718d" containerID="3292113ebdf4562c8aa9de4a76cec8de9c8fa34dcde6904df11f1fe55a43489d" exitCode=0 Mar 20 09:07:58.598917 master-0 kubenswrapper[18707]: I0320 09:07:58.598726 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fh7jr" event={"ID":"c91fb20b-d4f7-4c69-b78c-458e9122718d","Type":"ContainerDied","Data":"3292113ebdf4562c8aa9de4a76cec8de9c8fa34dcde6904df11f1fe55a43489d"} Mar 20 09:07:58.601811 master-0 kubenswrapper[18707]: I0320 09:07:58.601775 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8b47bf878-2ftwn" event={"ID":"2dafeade-6a38-4134-ac1c-7331ea038aec","Type":"ContainerDied","Data":"3a775266802c1a940d94e0ee69c727654606b52f10dd599b4ae5f931674df38d"} Mar 20 09:07:58.601936 master-0 kubenswrapper[18707]: I0320 09:07:58.601826 18707 scope.go:117] "RemoveContainer" 
containerID="1dda1c268f511fb2abda022887e5169f2df07d603ed922eb6f445725a0b163af" Mar 20 09:07:58.601971 master-0 kubenswrapper[18707]: I0320 09:07:58.601947 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8b47bf878-2ftwn" Mar 20 09:07:58.608672 master-0 kubenswrapper[18707]: I0320 09:07:58.608162 18707 generic.go:334] "Generic (PLEG): container finished" podID="322c240c-fbd4-40ea-80bf-cc8bf2611394" containerID="b1b3eb57ad082df997cd23e2063a58bf55379fb798326e0a02544c62fe451c2f" exitCode=0 Mar 20 09:07:58.608672 master-0 kubenswrapper[18707]: I0320 09:07:58.608296 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-11a2-account-create-update-gkhvg" event={"ID":"322c240c-fbd4-40ea-80bf-cc8bf2611394","Type":"ContainerDied","Data":"b1b3eb57ad082df997cd23e2063a58bf55379fb798326e0a02544c62fe451c2f"} Mar 20 09:07:58.610928 master-0 kubenswrapper[18707]: I0320 09:07:58.610869 18707 generic.go:334] "Generic (PLEG): container finished" podID="2a210b09-ea3d-4841-9cb3-f86a7f93985e" containerID="efadccc7335ef5746b901dfd94c7b880f33f291b80645babf0af8bc0f20da7a6" exitCode=0 Mar 20 09:07:58.611028 master-0 kubenswrapper[18707]: I0320 09:07:58.610942 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hdkgz" event={"ID":"2a210b09-ea3d-4841-9cb3-f86a7f93985e","Type":"ContainerDied","Data":"efadccc7335ef5746b901dfd94c7b880f33f291b80645babf0af8bc0f20da7a6"} Mar 20 09:07:58.612634 master-0 kubenswrapper[18707]: I0320 09:07:58.612512 18707 generic.go:334] "Generic (PLEG): container finished" podID="00e361f2-f12a-4354-b907-8575930aee6b" containerID="75b6c5368bf2ea838b4e8fc4f30d310cf7658e15c2f592fe7eb4ed10519d5bde" exitCode=0 Mar 20 09:07:58.612634 master-0 kubenswrapper[18707]: I0320 09:07:58.612556 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3995-account-create-update-7pfxc" 
event={"ID":"00e361f2-f12a-4354-b907-8575930aee6b","Type":"ContainerDied","Data":"75b6c5368bf2ea838b4e8fc4f30d310cf7658e15c2f592fe7eb4ed10519d5bde"} Mar 20 09:07:58.848956 master-0 kubenswrapper[18707]: I0320 09:07:58.848891 18707 scope.go:117] "RemoveContainer" containerID="89b2d0395c21db068f91d1a8531bc530fddf5e8897d2499384549e1ac365ed68" Mar 20 09:07:59.370208 master-0 kubenswrapper[18707]: I0320 09:07:59.366656 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-27086-default-internal-api-0"] Mar 20 09:07:59.605008 master-0 kubenswrapper[18707]: I0320 09:07:59.604930 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c52c5b10-3247-40dd-a4c1-8cd8b0dd6647\" (UniqueName: \"kubernetes.io/csi/topolvm.io^716d1466-9e83-45d5-98f8-21e63a0b353e\") pod \"glance-27086-default-external-api-0\" (UID: \"9292cf08-615f-430e-a059-a2d17257282f\") " pod="openstack/glance-27086-default-external-api-0" Mar 20 09:07:59.625511 master-0 kubenswrapper[18707]: I0320 09:07:59.625393 18707 generic.go:334] "Generic (PLEG): container finished" podID="7a88fa32-c356-4dfe-abb9-d342391d52de" containerID="3781b977803f4f55d01defab452e08559aa9c24cf54e36e6cda6472c873010f9" exitCode=0 Mar 20 09:07:59.625511 master-0 kubenswrapper[18707]: I0320 09:07:59.625507 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-th2rs" event={"ID":"7a88fa32-c356-4dfe-abb9-d342391d52de","Type":"ContainerDied","Data":"3781b977803f4f55d01defab452e08559aa9c24cf54e36e6cda6472c873010f9"} Mar 20 09:08:00.127977 master-0 kubenswrapper[18707]: I0320 09:08:00.127915 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-27086-default-internal-api-0"] Mar 20 09:08:00.167616 master-0 kubenswrapper[18707]: I0320 09:08:00.167520 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-fh7jr" Mar 20 09:08:00.283437 master-0 kubenswrapper[18707]: I0320 09:08:00.278355 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c91fb20b-d4f7-4c69-b78c-458e9122718d-operator-scripts\") pod \"c91fb20b-d4f7-4c69-b78c-458e9122718d\" (UID: \"c91fb20b-d4f7-4c69-b78c-458e9122718d\") " Mar 20 09:08:00.283437 master-0 kubenswrapper[18707]: I0320 09:08:00.278531 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2trmh\" (UniqueName: \"kubernetes.io/projected/c91fb20b-d4f7-4c69-b78c-458e9122718d-kube-api-access-2trmh\") pod \"c91fb20b-d4f7-4c69-b78c-458e9122718d\" (UID: \"c91fb20b-d4f7-4c69-b78c-458e9122718d\") " Mar 20 09:08:00.283437 master-0 kubenswrapper[18707]: I0320 09:08:00.279225 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c91fb20b-d4f7-4c69-b78c-458e9122718d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c91fb20b-d4f7-4c69-b78c-458e9122718d" (UID: "c91fb20b-d4f7-4c69-b78c-458e9122718d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:08:00.283437 master-0 kubenswrapper[18707]: I0320 09:08:00.280538 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c91fb20b-d4f7-4c69-b78c-458e9122718d-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:00.337598 master-0 kubenswrapper[18707]: I0320 09:08:00.334407 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c91fb20b-d4f7-4c69-b78c-458e9122718d-kube-api-access-2trmh" (OuterVolumeSpecName: "kube-api-access-2trmh") pod "c91fb20b-d4f7-4c69-b78c-458e9122718d" (UID: "c91fb20b-d4f7-4c69-b78c-458e9122718d"). InnerVolumeSpecName "kube-api-access-2trmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:08:00.383232 master-0 kubenswrapper[18707]: I0320 09:08:00.383146 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2trmh\" (UniqueName: \"kubernetes.io/projected/c91fb20b-d4f7-4c69-b78c-458e9122718d-kube-api-access-2trmh\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:00.533812 master-0 kubenswrapper[18707]: I0320 09:08:00.533642 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3995-account-create-update-7pfxc" Mar 20 09:08:00.543035 master-0 kubenswrapper[18707]: I0320 09:08:00.542992 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-af59-account-create-update-tk2tl" Mar 20 09:08:00.550997 master-0 kubenswrapper[18707]: I0320 09:08:00.550938 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-11a2-account-create-update-gkhvg" Mar 20 09:08:00.561619 master-0 kubenswrapper[18707]: I0320 09:08:00.561383 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hdkgz" Mar 20 09:08:00.588144 master-0 kubenswrapper[18707]: I0320 09:08:00.588063 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00e361f2-f12a-4354-b907-8575930aee6b-operator-scripts\") pod \"00e361f2-f12a-4354-b907-8575930aee6b\" (UID: \"00e361f2-f12a-4354-b907-8575930aee6b\") " Mar 20 09:08:00.588406 master-0 kubenswrapper[18707]: I0320 09:08:00.588286 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cldzz\" (UniqueName: \"kubernetes.io/projected/e2bb4fbf-4157-4051-bfbb-0b0a011fbc49-kube-api-access-cldzz\") pod \"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49\" (UID: \"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49\") " Mar 20 09:08:00.588648 master-0 kubenswrapper[18707]: I0320 09:08:00.588602 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6xxt\" (UniqueName: \"kubernetes.io/projected/00e361f2-f12a-4354-b907-8575930aee6b-kube-api-access-k6xxt\") pod \"00e361f2-f12a-4354-b907-8575930aee6b\" (UID: \"00e361f2-f12a-4354-b907-8575930aee6b\") " Mar 20 09:08:00.588721 master-0 kubenswrapper[18707]: I0320 09:08:00.588686 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00e361f2-f12a-4354-b907-8575930aee6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "00e361f2-f12a-4354-b907-8575930aee6b" (UID: "00e361f2-f12a-4354-b907-8575930aee6b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:08:00.588859 master-0 kubenswrapper[18707]: I0320 09:08:00.588806 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2bb4fbf-4157-4051-bfbb-0b0a011fbc49-operator-scripts\") pod \"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49\" (UID: \"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49\") " Mar 20 09:08:00.588910 master-0 kubenswrapper[18707]: I0320 09:08:00.588885 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsjh9\" (UniqueName: \"kubernetes.io/projected/322c240c-fbd4-40ea-80bf-cc8bf2611394-kube-api-access-hsjh9\") pod \"322c240c-fbd4-40ea-80bf-cc8bf2611394\" (UID: \"322c240c-fbd4-40ea-80bf-cc8bf2611394\") " Mar 20 09:08:00.589280 master-0 kubenswrapper[18707]: I0320 09:08:00.589232 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/322c240c-fbd4-40ea-80bf-cc8bf2611394-operator-scripts\") pod \"322c240c-fbd4-40ea-80bf-cc8bf2611394\" (UID: \"322c240c-fbd4-40ea-80bf-cc8bf2611394\") " Mar 20 09:08:00.589620 master-0 kubenswrapper[18707]: I0320 09:08:00.589588 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/322c240c-fbd4-40ea-80bf-cc8bf2611394-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "322c240c-fbd4-40ea-80bf-cc8bf2611394" (UID: "322c240c-fbd4-40ea-80bf-cc8bf2611394"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:08:00.590904 master-0 kubenswrapper[18707]: I0320 09:08:00.590549 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/322c240c-fbd4-40ea-80bf-cc8bf2611394-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:00.590904 master-0 kubenswrapper[18707]: I0320 09:08:00.590606 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00e361f2-f12a-4354-b907-8575930aee6b-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:00.591606 master-0 kubenswrapper[18707]: I0320 09:08:00.591574 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2bb4fbf-4157-4051-bfbb-0b0a011fbc49-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e2bb4fbf-4157-4051-bfbb-0b0a011fbc49" (UID: "e2bb4fbf-4157-4051-bfbb-0b0a011fbc49"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:08:00.592449 master-0 kubenswrapper[18707]: I0320 09:08:00.592423 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bb4fbf-4157-4051-bfbb-0b0a011fbc49-kube-api-access-cldzz" (OuterVolumeSpecName: "kube-api-access-cldzz") pod "e2bb4fbf-4157-4051-bfbb-0b0a011fbc49" (UID: "e2bb4fbf-4157-4051-bfbb-0b0a011fbc49"). InnerVolumeSpecName "kube-api-access-cldzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:08:00.592595 master-0 kubenswrapper[18707]: I0320 09:08:00.592535 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/322c240c-fbd4-40ea-80bf-cc8bf2611394-kube-api-access-hsjh9" (OuterVolumeSpecName: "kube-api-access-hsjh9") pod "322c240c-fbd4-40ea-80bf-cc8bf2611394" (UID: "322c240c-fbd4-40ea-80bf-cc8bf2611394"). InnerVolumeSpecName "kube-api-access-hsjh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:08:00.595114 master-0 kubenswrapper[18707]: I0320 09:08:00.595052 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00e361f2-f12a-4354-b907-8575930aee6b-kube-api-access-k6xxt" (OuterVolumeSpecName: "kube-api-access-k6xxt") pod "00e361f2-f12a-4354-b907-8575930aee6b" (UID: "00e361f2-f12a-4354-b907-8575930aee6b"). InnerVolumeSpecName "kube-api-access-k6xxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:08:00.640383 master-0 kubenswrapper[18707]: I0320 09:08:00.640323 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-af59-account-create-update-tk2tl" event={"ID":"e2bb4fbf-4157-4051-bfbb-0b0a011fbc49","Type":"ContainerDied","Data":"a94983d1fe65129b2ce3fb84977e176b6fcfa2a03670ff9e78971291fde50c8a"} Mar 20 09:08:00.640383 master-0 kubenswrapper[18707]: I0320 09:08:00.640379 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a94983d1fe65129b2ce3fb84977e176b6fcfa2a03670ff9e78971291fde50c8a" Mar 20 09:08:00.640920 master-0 kubenswrapper[18707]: I0320 09:08:00.640403 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-af59-account-create-update-tk2tl" Mar 20 09:08:00.642169 master-0 kubenswrapper[18707]: I0320 09:08:00.642127 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fh7jr" event={"ID":"c91fb20b-d4f7-4c69-b78c-458e9122718d","Type":"ContainerDied","Data":"3ce37ea0fda3c42157346a211ce2a8ad2552ff1f96dd458846a3f3b8dccb196e"} Mar 20 09:08:00.642269 master-0 kubenswrapper[18707]: I0320 09:08:00.642172 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ce37ea0fda3c42157346a211ce2a8ad2552ff1f96dd458846a3f3b8dccb196e" Mar 20 09:08:00.642269 master-0 kubenswrapper[18707]: I0320 09:08:00.642210 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fh7jr" Mar 20 09:08:00.643646 master-0 kubenswrapper[18707]: I0320 09:08:00.643614 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-11a2-account-create-update-gkhvg" event={"ID":"322c240c-fbd4-40ea-80bf-cc8bf2611394","Type":"ContainerDied","Data":"1e8cc915e86a914ee7ca4c136c68a969ad74279043f7d3874a4265b6019a0eec"} Mar 20 09:08:00.643719 master-0 kubenswrapper[18707]: I0320 09:08:00.643645 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e8cc915e86a914ee7ca4c136c68a969ad74279043f7d3874a4265b6019a0eec" Mar 20 09:08:00.643719 master-0 kubenswrapper[18707]: I0320 09:08:00.643655 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-11a2-account-create-update-gkhvg" Mar 20 09:08:00.645155 master-0 kubenswrapper[18707]: I0320 09:08:00.645098 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hdkgz" event={"ID":"2a210b09-ea3d-4841-9cb3-f86a7f93985e","Type":"ContainerDied","Data":"20a1b5baa3699da09a461312bff05ce3f520e831c8e72117e0b19a5f01f83067"} Mar 20 09:08:00.645260 master-0 kubenswrapper[18707]: I0320 09:08:00.645156 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20a1b5baa3699da09a461312bff05ce3f520e831c8e72117e0b19a5f01f83067" Mar 20 09:08:00.645260 master-0 kubenswrapper[18707]: I0320 09:08:00.645242 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hdkgz" Mar 20 09:08:00.647416 master-0 kubenswrapper[18707]: I0320 09:08:00.647380 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3995-account-create-update-7pfxc" Mar 20 09:08:00.648639 master-0 kubenswrapper[18707]: I0320 09:08:00.648593 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3995-account-create-update-7pfxc" event={"ID":"00e361f2-f12a-4354-b907-8575930aee6b","Type":"ContainerDied","Data":"efd0cd3870d7bb5c4d1284f37fcb40d97b3cff58096055662b090cda6ca21dc4"} Mar 20 09:08:00.648713 master-0 kubenswrapper[18707]: I0320 09:08:00.648652 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efd0cd3870d7bb5c4d1284f37fcb40d97b3cff58096055662b090cda6ca21dc4" Mar 20 09:08:00.691721 master-0 kubenswrapper[18707]: I0320 09:08:00.691617 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp6fb\" (UniqueName: \"kubernetes.io/projected/2a210b09-ea3d-4841-9cb3-f86a7f93985e-kube-api-access-kp6fb\") pod \"2a210b09-ea3d-4841-9cb3-f86a7f93985e\" (UID: 
\"2a210b09-ea3d-4841-9cb3-f86a7f93985e\") " Mar 20 09:08:00.692849 master-0 kubenswrapper[18707]: I0320 09:08:00.691953 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a210b09-ea3d-4841-9cb3-f86a7f93985e-operator-scripts\") pod \"2a210b09-ea3d-4841-9cb3-f86a7f93985e\" (UID: \"2a210b09-ea3d-4841-9cb3-f86a7f93985e\") " Mar 20 09:08:00.693004 master-0 kubenswrapper[18707]: I0320 09:08:00.692970 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cldzz\" (UniqueName: \"kubernetes.io/projected/e2bb4fbf-4157-4051-bfbb-0b0a011fbc49-kube-api-access-cldzz\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:00.693004 master-0 kubenswrapper[18707]: I0320 09:08:00.693000 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6xxt\" (UniqueName: \"kubernetes.io/projected/00e361f2-f12a-4354-b907-8575930aee6b-kube-api-access-k6xxt\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:00.693107 master-0 kubenswrapper[18707]: I0320 09:08:00.693016 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2bb4fbf-4157-4051-bfbb-0b0a011fbc49-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:00.693107 master-0 kubenswrapper[18707]: I0320 09:08:00.693029 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsjh9\" (UniqueName: \"kubernetes.io/projected/322c240c-fbd4-40ea-80bf-cc8bf2611394-kube-api-access-hsjh9\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:00.693487 master-0 kubenswrapper[18707]: I0320 09:08:00.693453 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a210b09-ea3d-4841-9cb3-f86a7f93985e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a210b09-ea3d-4841-9cb3-f86a7f93985e" (UID: "2a210b09-ea3d-4841-9cb3-f86a7f93985e"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:08:00.698592 master-0 kubenswrapper[18707]: I0320 09:08:00.698524 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a210b09-ea3d-4841-9cb3-f86a7f93985e-kube-api-access-kp6fb" (OuterVolumeSpecName: "kube-api-access-kp6fb") pod "2a210b09-ea3d-4841-9cb3-f86a7f93985e" (UID: "2a210b09-ea3d-4841-9cb3-f86a7f93985e"). InnerVolumeSpecName "kube-api-access-kp6fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:08:00.795540 master-0 kubenswrapper[18707]: I0320 09:08:00.795481 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a210b09-ea3d-4841-9cb3-f86a7f93985e-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:00.795540 master-0 kubenswrapper[18707]: I0320 09:08:00.795533 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp6fb\" (UniqueName: \"kubernetes.io/projected/2a210b09-ea3d-4841-9cb3-f86a7f93985e-kube-api-access-kp6fb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:00.852349 master-0 kubenswrapper[18707]: I0320 09:08:00.852297 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.007215 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-27086-default-internal-api-0"] Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: E0320 09:08:01.007803 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bb4fbf-4157-4051-bfbb-0b0a011fbc49" containerName="mariadb-account-create-update" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.007823 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bb4fbf-4157-4051-bfbb-0b0a011fbc49" containerName="mariadb-account-create-update" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: E0320 09:08:01.007936 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a210b09-ea3d-4841-9cb3-f86a7f93985e" containerName="mariadb-database-create" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.007949 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a210b09-ea3d-4841-9cb3-f86a7f93985e" containerName="mariadb-database-create" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: E0320 09:08:01.007961 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dafeade-6a38-4134-ac1c-7331ea038aec" containerName="placement-api" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.007970 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dafeade-6a38-4134-ac1c-7331ea038aec" containerName="placement-api" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: E0320 09:08:01.007979 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322c240c-fbd4-40ea-80bf-cc8bf2611394" containerName="mariadb-account-create-update" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.007987 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="322c240c-fbd4-40ea-80bf-cc8bf2611394" 
containerName="mariadb-account-create-update" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: E0320 09:08:01.008014 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91fb20b-d4f7-4c69-b78c-458e9122718d" containerName="mariadb-database-create" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.008023 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91fb20b-d4f7-4c69-b78c-458e9122718d" containerName="mariadb-database-create" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: E0320 09:08:01.008036 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dafeade-6a38-4134-ac1c-7331ea038aec" containerName="placement-log" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.008046 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dafeade-6a38-4134-ac1c-7331ea038aec" containerName="placement-log" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: E0320 09:08:01.008072 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00e361f2-f12a-4354-b907-8575930aee6b" containerName="mariadb-account-create-update" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.008087 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="00e361f2-f12a-4354-b907-8575930aee6b" containerName="mariadb-account-create-update" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.008455 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dafeade-6a38-4134-ac1c-7331ea038aec" containerName="placement-log" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.008481 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c91fb20b-d4f7-4c69-b78c-458e9122718d" containerName="mariadb-database-create" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.008523 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bb4fbf-4157-4051-bfbb-0b0a011fbc49" 
containerName="mariadb-account-create-update" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.008544 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a210b09-ea3d-4841-9cb3-f86a7f93985e" containerName="mariadb-database-create" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.008560 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="00e361f2-f12a-4354-b907-8575930aee6b" containerName="mariadb-account-create-update" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.008580 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dafeade-6a38-4134-ac1c-7331ea038aec" containerName="placement-api" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.008587 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="322c240c-fbd4-40ea-80bf-cc8bf2611394" containerName="mariadb-account-create-update" Mar 20 09:08:01.015033 master-0 kubenswrapper[18707]: I0320 09:08:01.010139 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.024452 master-0 kubenswrapper[18707]: I0320 09:08:01.023261 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-11a2-account-create-update-gkhvg" podStartSLOduration=8.023239541 podStartE2EDuration="8.023239541s" podCreationTimestamp="2026-03-20 09:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:08:00.981152938 +0000 UTC m=+1626.137333294" watchObservedRunningTime="2026-03-20 09:08:01.023239541 +0000 UTC m=+1626.179419897" Mar 20 09:08:01.027822 master-0 kubenswrapper[18707]: I0320 09:08:01.027008 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 09:08:01.027822 master-0 kubenswrapper[18707]: I0320 09:08:01.027289 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-27086-default-internal-config-data" Mar 20 09:08:01.061767 master-0 kubenswrapper[18707]: I0320 09:08:01.061737 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-th2rs" Mar 20 09:08:01.109203 master-0 kubenswrapper[18707]: I0320 09:08:01.108738 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcee9936-db51-42d9-ada7-279cec63d27e" path="/var/lib/kubelet/pods/dcee9936-db51-42d9-ada7-279cec63d27e/volumes" Mar 20 09:08:01.249314 master-0 kubenswrapper[18707]: I0320 09:08:01.249265 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-27086-default-internal-api-0"] Mar 20 09:08:01.336128 master-0 kubenswrapper[18707]: I0320 09:08:01.336080 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a88fa32-c356-4dfe-abb9-d342391d52de-operator-scripts\") pod \"7a88fa32-c356-4dfe-abb9-d342391d52de\" (UID: \"7a88fa32-c356-4dfe-abb9-d342391d52de\") " Mar 20 09:08:01.336490 master-0 kubenswrapper[18707]: I0320 09:08:01.336473 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m42wd\" (UniqueName: \"kubernetes.io/projected/7a88fa32-c356-4dfe-abb9-d342391d52de-kube-api-access-m42wd\") pod \"7a88fa32-c356-4dfe-abb9-d342391d52de\" (UID: \"7a88fa32-c356-4dfe-abb9-d342391d52de\") " Mar 20 09:08:01.336623 master-0 kubenswrapper[18707]: I0320 09:08:01.336578 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a88fa32-c356-4dfe-abb9-d342391d52de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a88fa32-c356-4dfe-abb9-d342391d52de" (UID: "7a88fa32-c356-4dfe-abb9-d342391d52de"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:08:01.337234 master-0 kubenswrapper[18707]: I0320 09:08:01.337175 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ed7fc861-4795-473d-84e6-66068cd18122\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.337385 master-0 kubenswrapper[18707]: I0320 09:08:01.337362 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6d63cc-ca93-492b-b959-f30511f5b3d6-scripts\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.337714 master-0 kubenswrapper[18707]: I0320 09:08:01.337610 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6d63cc-ca93-492b-b959-f30511f5b3d6-config-data\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.337771 master-0 kubenswrapper[18707]: I0320 09:08:01.337712 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6d63cc-ca93-492b-b959-f30511f5b3d6-combined-ca-bundle\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.337962 master-0 kubenswrapper[18707]: I0320 09:08:01.337935 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6f6d63cc-ca93-492b-b959-f30511f5b3d6-logs\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.338007 master-0 kubenswrapper[18707]: I0320 09:08:01.337966 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59rwd\" (UniqueName: \"kubernetes.io/projected/6f6d63cc-ca93-492b-b959-f30511f5b3d6-kube-api-access-59rwd\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.338294 master-0 kubenswrapper[18707]: I0320 09:08:01.338266 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f6d63cc-ca93-492b-b959-f30511f5b3d6-httpd-run\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.338355 master-0 kubenswrapper[18707]: I0320 09:08:01.338297 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6d63cc-ca93-492b-b959-f30511f5b3d6-internal-tls-certs\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.338484 master-0 kubenswrapper[18707]: I0320 09:08:01.338464 18707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a88fa32-c356-4dfe-abb9-d342391d52de-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:01.342327 master-0 kubenswrapper[18707]: I0320 09:08:01.342291 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7a88fa32-c356-4dfe-abb9-d342391d52de-kube-api-access-m42wd" (OuterVolumeSpecName: "kube-api-access-m42wd") pod "7a88fa32-c356-4dfe-abb9-d342391d52de" (UID: "7a88fa32-c356-4dfe-abb9-d342391d52de"). InnerVolumeSpecName "kube-api-access-m42wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:08:01.440703 master-0 kubenswrapper[18707]: I0320 09:08:01.440647 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f6d63cc-ca93-492b-b959-f30511f5b3d6-logs\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.440703 master-0 kubenswrapper[18707]: I0320 09:08:01.440705 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59rwd\" (UniqueName: \"kubernetes.io/projected/6f6d63cc-ca93-492b-b959-f30511f5b3d6-kube-api-access-59rwd\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.441323 master-0 kubenswrapper[18707]: I0320 09:08:01.441275 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f6d63cc-ca93-492b-b959-f30511f5b3d6-logs\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.441378 master-0 kubenswrapper[18707]: I0320 09:08:01.441313 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f6d63cc-ca93-492b-b959-f30511f5b3d6-httpd-run\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.441427 master-0 
kubenswrapper[18707]: I0320 09:08:01.441382 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6d63cc-ca93-492b-b959-f30511f5b3d6-internal-tls-certs\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.441579 master-0 kubenswrapper[18707]: I0320 09:08:01.441551 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6d63cc-ca93-492b-b959-f30511f5b3d6-scripts\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.441629 master-0 kubenswrapper[18707]: I0320 09:08:01.441603 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6d63cc-ca93-492b-b959-f30511f5b3d6-config-data\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.441662 master-0 kubenswrapper[18707]: I0320 09:08:01.441639 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6d63cc-ca93-492b-b959-f30511f5b3d6-combined-ca-bundle\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.441858 master-0 kubenswrapper[18707]: I0320 09:08:01.441824 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m42wd\" (UniqueName: \"kubernetes.io/projected/7a88fa32-c356-4dfe-abb9-d342391d52de-kube-api-access-m42wd\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:01.441910 master-0 kubenswrapper[18707]: I0320 09:08:01.441846 
18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f6d63cc-ca93-492b-b959-f30511f5b3d6-httpd-run\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.449480 master-0 kubenswrapper[18707]: I0320 09:08:01.446363 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6d63cc-ca93-492b-b959-f30511f5b3d6-internal-tls-certs\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.449480 master-0 kubenswrapper[18707]: I0320 09:08:01.447161 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6d63cc-ca93-492b-b959-f30511f5b3d6-config-data\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.449480 master-0 kubenswrapper[18707]: I0320 09:08:01.447874 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6d63cc-ca93-492b-b959-f30511f5b3d6-scripts\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.450544 master-0 kubenswrapper[18707]: I0320 09:08:01.450518 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6d63cc-ca93-492b-b959-f30511f5b3d6-combined-ca-bundle\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.646474 master-0 kubenswrapper[18707]: 
I0320 09:08:01.646099 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ed7fc861-4795-473d-84e6-66068cd18122\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.648372 master-0 kubenswrapper[18707]: I0320 09:08:01.648326 18707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 09:08:01.648457 master-0 kubenswrapper[18707]: I0320 09:08:01.648385 18707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ed7fc861-4795-473d-84e6-66068cd18122\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c5eaa71b76f2346f9535597ed8ece2d32bc4c6547a6a7c691a392bf9719026bd/globalmount\"" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:01.670591 master-0 kubenswrapper[18707]: I0320 09:08:01.670514 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-th2rs" event={"ID":"7a88fa32-c356-4dfe-abb9-d342391d52de","Type":"ContainerDied","Data":"1e05f0b575a19ec2cd0498827fee3c523b7e8475173e1b2ea287b8886183da6d"} Mar 20 09:08:01.670591 master-0 kubenswrapper[18707]: I0320 09:08:01.670585 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e05f0b575a19ec2cd0498827fee3c523b7e8475173e1b2ea287b8886183da6d" Mar 20 09:08:01.670826 master-0 kubenswrapper[18707]: I0320 09:08:01.670608 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-th2rs" Mar 20 09:08:01.948955 master-0 kubenswrapper[18707]: I0320 09:08:01.948839 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59rwd\" (UniqueName: \"kubernetes.io/projected/6f6d63cc-ca93-492b-b959-f30511f5b3d6-kube-api-access-59rwd\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:02.513756 master-0 kubenswrapper[18707]: I0320 09:08:02.513696 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ed7fc861-4795-473d-84e6-66068cd18122\" (UniqueName: \"kubernetes.io/csi/topolvm.io^35b1c0cf-b771-4356-8b10-533d70850526\") pod \"glance-27086-default-internal-api-0\" (UID: \"6f6d63cc-ca93-492b-b959-f30511f5b3d6\") " pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:02.599178 master-0 kubenswrapper[18707]: I0320 09:08:02.599067 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-fh7jr" podStartSLOduration=9.599037941 podStartE2EDuration="9.599037941s" podCreationTimestamp="2026-03-20 09:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:08:02.591366822 +0000 UTC m=+1627.747547178" watchObservedRunningTime="2026-03-20 09:08:02.599037941 +0000 UTC m=+1627.755218317" Mar 20 09:08:02.851966 master-0 kubenswrapper[18707]: I0320 09:08:02.851915 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:02.908231 master-0 kubenswrapper[18707]: I0320 09:08:02.901629 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8b47bf878-2ftwn"] Mar 20 09:08:02.921313 master-0 kubenswrapper[18707]: I0320 09:08:02.921251 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8b47bf878-2ftwn"] Mar 20 09:08:02.936947 master-0 kubenswrapper[18707]: I0320 09:08:02.934216 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-27086-default-external-api-0"] Mar 20 09:08:03.188472 master-0 kubenswrapper[18707]: I0320 09:08:03.180071 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dafeade-6a38-4134-ac1c-7331ea038aec" path="/var/lib/kubelet/pods/2dafeade-6a38-4134-ac1c-7331ea038aec/volumes" Mar 20 09:08:03.440798 master-0 kubenswrapper[18707]: I0320 09:08:03.440694 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:08:03.444361 master-0 kubenswrapper[18707]: I0320 09:08:03.444252 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-84fcf8cbb-q9wtd" Mar 20 09:08:03.698770 master-0 kubenswrapper[18707]: I0320 09:08:03.698597 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-external-api-0" event={"ID":"9292cf08-615f-430e-a059-a2d17257282f","Type":"ContainerStarted","Data":"e508b630a4070649790709263693f4bfec42a062ed260277578ea2e70f677553"} Mar 20 09:08:04.500368 master-0 kubenswrapper[18707]: I0320 09:08:04.500280 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-27086-default-internal-api-0"] Mar 20 09:08:04.733302 master-0 kubenswrapper[18707]: I0320 09:08:04.729455 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-internal-api-0" 
event={"ID":"6f6d63cc-ca93-492b-b959-f30511f5b3d6","Type":"ContainerStarted","Data":"542b53ea3c3ac0688780a553bc6cdb31938fbffdfa3be67909209079e8fab9af"} Mar 20 09:08:05.290502 master-0 kubenswrapper[18707]: I0320 09:08:05.290452 18707 trace.go:236] Trace[1426817552]: "Calculate volume metrics of cache for pod openshift-operator-controller/operator-controller-controller-manager-57777556ff-nk2rf" (20-Mar-2026 09:08:00.769) (total time: 4519ms): Mar 20 09:08:05.290502 master-0 kubenswrapper[18707]: Trace[1426817552]: [4.519342323s] [4.519342323s] END Mar 20 09:08:05.747321 master-0 kubenswrapper[18707]: I0320 09:08:05.747266 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-internal-api-0" event={"ID":"6f6d63cc-ca93-492b-b959-f30511f5b3d6","Type":"ContainerStarted","Data":"67d8cce0c868c82c089d94eb11308c35d9e3857f41dfee0e153865a216e7bc8c"} Mar 20 09:08:05.749985 master-0 kubenswrapper[18707]: I0320 09:08:05.749934 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-external-api-0" event={"ID":"9292cf08-615f-430e-a059-a2d17257282f","Type":"ContainerStarted","Data":"a4912a464d62c3f607643198ab42322c44a5da5067eaefa0408ce2ccdfeb699d"} Mar 20 09:08:05.750094 master-0 kubenswrapper[18707]: I0320 09:08:05.749988 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-external-api-0" event={"ID":"9292cf08-615f-430e-a059-a2d17257282f","Type":"ContainerStarted","Data":"5b043c8fc5514087f27fce30f4c0f127abe032963d6510054cccbb64ca22c57a"} Mar 20 09:08:05.840006 master-0 kubenswrapper[18707]: I0320 09:08:05.839644 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-27086-default-external-api-0" podStartSLOduration=9.839618632 podStartE2EDuration="9.839618632s" podCreationTimestamp="2026-03-20 09:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 09:08:05.778924298 +0000 UTC m=+1630.935104654" watchObservedRunningTime="2026-03-20 09:08:05.839618632 +0000 UTC m=+1630.995798988" Mar 20 09:08:06.770930 master-0 kubenswrapper[18707]: I0320 09:08:06.769764 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-27086-default-internal-api-0" event={"ID":"6f6d63cc-ca93-492b-b959-f30511f5b3d6","Type":"ContainerStarted","Data":"d323b4b0f3ce930fd630d12e19ab27953bf78650dbedf3ea397cdd74098f526f"} Mar 20 09:08:07.083647 master-0 kubenswrapper[18707]: I0320 09:08:07.083549 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-27086-default-internal-api-0" podStartSLOduration=7.083525798 podStartE2EDuration="7.083525798s" podCreationTimestamp="2026-03-20 09:08:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:08:07.070508486 +0000 UTC m=+1632.226688842" watchObservedRunningTime="2026-03-20 09:08:07.083525798 +0000 UTC m=+1632.239706154" Mar 20 09:08:09.449711 master-0 kubenswrapper[18707]: I0320 09:08:09.449643 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zzsdf"] Mar 20 09:08:09.450381 master-0 kubenswrapper[18707]: E0320 09:08:09.450333 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a88fa32-c356-4dfe-abb9-d342391d52de" containerName="mariadb-database-create" Mar 20 09:08:09.450381 master-0 kubenswrapper[18707]: I0320 09:08:09.450381 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a88fa32-c356-4dfe-abb9-d342391d52de" containerName="mariadb-database-create" Mar 20 09:08:09.450726 master-0 kubenswrapper[18707]: I0320 09:08:09.450703 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a88fa32-c356-4dfe-abb9-d342391d52de" containerName="mariadb-database-create" Mar 20 09:08:09.451710 master-0 kubenswrapper[18707]: I0320 
09:08:09.451677 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:09.453973 master-0 kubenswrapper[18707]: I0320 09:08:09.453843 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 09:08:09.460459 master-0 kubenswrapper[18707]: I0320 09:08:09.460409 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 09:08:09.474338 master-0 kubenswrapper[18707]: I0320 09:08:09.473035 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zzsdf"] Mar 20 09:08:09.592554 master-0 kubenswrapper[18707]: I0320 09:08:09.590940 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-config-data\") pod \"nova-cell0-conductor-db-sync-zzsdf\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:09.592554 master-0 kubenswrapper[18707]: I0320 09:08:09.591064 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-scripts\") pod \"nova-cell0-conductor-db-sync-zzsdf\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:09.592554 master-0 kubenswrapper[18707]: I0320 09:08:09.591126 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6g7t\" (UniqueName: \"kubernetes.io/projected/84cb3618-9229-4d84-9041-e632f4e9709e-kube-api-access-d6g7t\") pod \"nova-cell0-conductor-db-sync-zzsdf\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 
20 09:08:09.592554 master-0 kubenswrapper[18707]: I0320 09:08:09.591218 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zzsdf\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:09.693578 master-0 kubenswrapper[18707]: I0320 09:08:09.693501 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zzsdf\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:09.693837 master-0 kubenswrapper[18707]: I0320 09:08:09.693699 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-config-data\") pod \"nova-cell0-conductor-db-sync-zzsdf\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:09.694359 master-0 kubenswrapper[18707]: I0320 09:08:09.694311 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-scripts\") pod \"nova-cell0-conductor-db-sync-zzsdf\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:09.694434 master-0 kubenswrapper[18707]: I0320 09:08:09.694408 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6g7t\" (UniqueName: \"kubernetes.io/projected/84cb3618-9229-4d84-9041-e632f4e9709e-kube-api-access-d6g7t\") pod \"nova-cell0-conductor-db-sync-zzsdf\" (UID: 
\"84cb3618-9229-4d84-9041-e632f4e9709e\") " pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:09.697782 master-0 kubenswrapper[18707]: I0320 09:08:09.697736 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-config-data\") pod \"nova-cell0-conductor-db-sync-zzsdf\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:09.698258 master-0 kubenswrapper[18707]: I0320 09:08:09.698177 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-scripts\") pod \"nova-cell0-conductor-db-sync-zzsdf\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:09.698418 master-0 kubenswrapper[18707]: I0320 09:08:09.698354 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zzsdf\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:09.780642 master-0 kubenswrapper[18707]: I0320 09:08:09.778902 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6g7t\" (UniqueName: \"kubernetes.io/projected/84cb3618-9229-4d84-9041-e632f4e9709e-kube-api-access-d6g7t\") pod \"nova-cell0-conductor-db-sync-zzsdf\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:09.829346 master-0 kubenswrapper[18707]: I0320 09:08:09.829284 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:10.379207 master-0 kubenswrapper[18707]: W0320 09:08:10.374797 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84cb3618_9229_4d84_9041_e632f4e9709e.slice/crio-57c46d6980eee7c5020eadb8b95d2402dc296970d589c5e83625d217df065582 WatchSource:0}: Error finding container 57c46d6980eee7c5020eadb8b95d2402dc296970d589c5e83625d217df065582: Status 404 returned error can't find the container with id 57c46d6980eee7c5020eadb8b95d2402dc296970d589c5e83625d217df065582 Mar 20 09:08:10.379207 master-0 kubenswrapper[18707]: I0320 09:08:10.377080 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zzsdf"] Mar 20 09:08:10.818572 master-0 kubenswrapper[18707]: I0320 09:08:10.818513 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zzsdf" event={"ID":"84cb3618-9229-4d84-9041-e632f4e9709e","Type":"ContainerStarted","Data":"57c46d6980eee7c5020eadb8b95d2402dc296970d589c5e83625d217df065582"} Mar 20 09:08:10.853707 master-0 kubenswrapper[18707]: I0320 09:08:10.853001 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:08:10.853929 master-0 kubenswrapper[18707]: I0320 09:08:10.853915 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:08:10.879017 master-0 kubenswrapper[18707]: I0320 09:08:10.878977 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:08:10.891602 master-0 kubenswrapper[18707]: I0320 09:08:10.891548 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:08:11.830364 master-0 
kubenswrapper[18707]: I0320 09:08:11.830274 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:08:11.830364 master-0 kubenswrapper[18707]: I0320 09:08:11.830341 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:08:12.854201 master-0 kubenswrapper[18707]: I0320 09:08:12.854123 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:12.854201 master-0 kubenswrapper[18707]: I0320 09:08:12.854199 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:12.895401 master-0 kubenswrapper[18707]: I0320 09:08:12.895324 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:12.911222 master-0 kubenswrapper[18707]: I0320 09:08:12.908024 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:13.859951 master-0 kubenswrapper[18707]: I0320 09:08:13.859876 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 09:08:13.859951 master-0 kubenswrapper[18707]: I0320 09:08:13.859924 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 09:08:13.861417 master-0 kubenswrapper[18707]: I0320 09:08:13.860394 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:13.861417 master-0 kubenswrapper[18707]: I0320 09:08:13.860897 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:15.883780 master-0 kubenswrapper[18707]: I0320 09:08:15.883717 18707 prober_manager.go:312] 
"Failed to trigger a manual run" probe="Readiness" Mar 20 09:08:15.883780 master-0 kubenswrapper[18707]: I0320 09:08:15.883759 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 09:08:22.452869 master-0 kubenswrapper[18707]: I0320 09:08:22.452780 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:08:22.453510 master-0 kubenswrapper[18707]: I0320 09:08:22.452990 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 09:08:22.462287 master-0 kubenswrapper[18707]: I0320 09:08:22.461907 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:22.462287 master-0 kubenswrapper[18707]: I0320 09:08:22.462031 18707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 09:08:22.462561 master-0 kubenswrapper[18707]: I0320 09:08:22.462436 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-27086-default-internal-api-0" Mar 20 09:08:22.463521 master-0 kubenswrapper[18707]: I0320 09:08:22.463490 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-27086-default-external-api-0" Mar 20 09:08:23.015803 master-0 kubenswrapper[18707]: I0320 09:08:23.015523 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zzsdf" event={"ID":"84cb3618-9229-4d84-9041-e632f4e9709e","Type":"ContainerStarted","Data":"196ab9ce3249d543464a89f1e5cd6ee5417006111bbdfa2c75f3f300afd473f8"} Mar 20 09:08:23.057529 master-0 kubenswrapper[18707]: I0320 09:08:23.057436 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zzsdf" podStartSLOduration=2.678833794 podStartE2EDuration="14.057406798s" podCreationTimestamp="2026-03-20 09:08:09 +0000 UTC" 
firstStartedPulling="2026-03-20 09:08:10.377636081 +0000 UTC m=+1635.533816437" lastFinishedPulling="2026-03-20 09:08:21.756209085 +0000 UTC m=+1646.912389441" observedRunningTime="2026-03-20 09:08:23.039955859 +0000 UTC m=+1648.196136215" watchObservedRunningTime="2026-03-20 09:08:23.057406798 +0000 UTC m=+1648.213587164" Mar 20 09:08:41.291757 master-0 kubenswrapper[18707]: I0320 09:08:41.291683 18707 generic.go:334] "Generic (PLEG): container finished" podID="84cb3618-9229-4d84-9041-e632f4e9709e" containerID="196ab9ce3249d543464a89f1e5cd6ee5417006111bbdfa2c75f3f300afd473f8" exitCode=0 Mar 20 09:08:41.292903 master-0 kubenswrapper[18707]: I0320 09:08:41.291797 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zzsdf" event={"ID":"84cb3618-9229-4d84-9041-e632f4e9709e","Type":"ContainerDied","Data":"196ab9ce3249d543464a89f1e5cd6ee5417006111bbdfa2c75f3f300afd473f8"} Mar 20 09:08:42.821410 master-0 kubenswrapper[18707]: I0320 09:08:42.821361 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:43.017404 master-0 kubenswrapper[18707]: I0320 09:08:43.015540 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-combined-ca-bundle\") pod \"84cb3618-9229-4d84-9041-e632f4e9709e\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " Mar 20 09:08:43.017404 master-0 kubenswrapper[18707]: I0320 09:08:43.015666 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6g7t\" (UniqueName: \"kubernetes.io/projected/84cb3618-9229-4d84-9041-e632f4e9709e-kube-api-access-d6g7t\") pod \"84cb3618-9229-4d84-9041-e632f4e9709e\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " Mar 20 09:08:43.017404 master-0 kubenswrapper[18707]: I0320 09:08:43.015751 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-config-data\") pod \"84cb3618-9229-4d84-9041-e632f4e9709e\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " Mar 20 09:08:43.017404 master-0 kubenswrapper[18707]: I0320 09:08:43.015951 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-scripts\") pod \"84cb3618-9229-4d84-9041-e632f4e9709e\" (UID: \"84cb3618-9229-4d84-9041-e632f4e9709e\") " Mar 20 09:08:43.024100 master-0 kubenswrapper[18707]: I0320 09:08:43.019666 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-scripts" (OuterVolumeSpecName: "scripts") pod "84cb3618-9229-4d84-9041-e632f4e9709e" (UID: "84cb3618-9229-4d84-9041-e632f4e9709e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:08:43.024100 master-0 kubenswrapper[18707]: I0320 09:08:43.020154 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84cb3618-9229-4d84-9041-e632f4e9709e-kube-api-access-d6g7t" (OuterVolumeSpecName: "kube-api-access-d6g7t") pod "84cb3618-9229-4d84-9041-e632f4e9709e" (UID: "84cb3618-9229-4d84-9041-e632f4e9709e"). InnerVolumeSpecName "kube-api-access-d6g7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:08:43.055821 master-0 kubenswrapper[18707]: I0320 09:08:43.055738 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84cb3618-9229-4d84-9041-e632f4e9709e" (UID: "84cb3618-9229-4d84-9041-e632f4e9709e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:08:43.076269 master-0 kubenswrapper[18707]: I0320 09:08:43.076165 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-config-data" (OuterVolumeSpecName: "config-data") pod "84cb3618-9229-4d84-9041-e632f4e9709e" (UID: "84cb3618-9229-4d84-9041-e632f4e9709e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:08:43.120038 master-0 kubenswrapper[18707]: I0320 09:08:43.119986 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:43.120848 master-0 kubenswrapper[18707]: I0320 09:08:43.120648 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:43.120848 master-0 kubenswrapper[18707]: I0320 09:08:43.120714 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cb3618-9229-4d84-9041-e632f4e9709e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:43.120848 master-0 kubenswrapper[18707]: I0320 09:08:43.120733 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6g7t\" (UniqueName: \"kubernetes.io/projected/84cb3618-9229-4d84-9041-e632f4e9709e-kube-api-access-d6g7t\") on node \"master-0\" DevicePath \"\"" Mar 20 09:08:43.379570 master-0 kubenswrapper[18707]: I0320 09:08:43.379487 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zzsdf" event={"ID":"84cb3618-9229-4d84-9041-e632f4e9709e","Type":"ContainerDied","Data":"57c46d6980eee7c5020eadb8b95d2402dc296970d589c5e83625d217df065582"} Mar 20 09:08:43.379570 master-0 kubenswrapper[18707]: I0320 09:08:43.379553 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57c46d6980eee7c5020eadb8b95d2402dc296970d589c5e83625d217df065582" Mar 20 09:08:43.379989 master-0 kubenswrapper[18707]: I0320 09:08:43.379616 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zzsdf" Mar 20 09:08:43.494411 master-0 kubenswrapper[18707]: I0320 09:08:43.494339 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 09:08:43.495008 master-0 kubenswrapper[18707]: E0320 09:08:43.494976 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84cb3618-9229-4d84-9041-e632f4e9709e" containerName="nova-cell0-conductor-db-sync" Mar 20 09:08:43.495496 master-0 kubenswrapper[18707]: I0320 09:08:43.495023 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cb3618-9229-4d84-9041-e632f4e9709e" containerName="nova-cell0-conductor-db-sync" Mar 20 09:08:43.495496 master-0 kubenswrapper[18707]: I0320 09:08:43.495444 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="84cb3618-9229-4d84-9041-e632f4e9709e" containerName="nova-cell0-conductor-db-sync" Mar 20 09:08:43.496644 master-0 kubenswrapper[18707]: I0320 09:08:43.496593 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 09:08:43.499975 master-0 kubenswrapper[18707]: I0320 09:08:43.499937 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 09:08:43.532845 master-0 kubenswrapper[18707]: I0320 09:08:43.532746 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfqjf\" (UniqueName: \"kubernetes.io/projected/bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120-kube-api-access-xfqjf\") pod \"nova-cell0-conductor-0\" (UID: \"bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:08:43.543566 master-0 kubenswrapper[18707]: I0320 09:08:43.533351 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:08:43.543566 master-0 kubenswrapper[18707]: I0320 09:08:43.533482 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:08:43.559684 master-0 kubenswrapper[18707]: I0320 09:08:43.558996 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 09:08:43.635342 master-0 kubenswrapper[18707]: I0320 09:08:43.635210 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: 
\"bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:08:43.635342 master-0 kubenswrapper[18707]: I0320 09:08:43.635285 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:08:43.636317 master-0 kubenswrapper[18707]: I0320 09:08:43.636273 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfqjf\" (UniqueName: \"kubernetes.io/projected/bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120-kube-api-access-xfqjf\") pod \"nova-cell0-conductor-0\" (UID: \"bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:08:43.639789 master-0 kubenswrapper[18707]: I0320 09:08:43.639722 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:08:43.641052 master-0 kubenswrapper[18707]: I0320 09:08:43.641000 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120\") " pod="openstack/nova-cell0-conductor-0" Mar 20 09:08:43.655769 master-0 kubenswrapper[18707]: I0320 09:08:43.655733 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfqjf\" (UniqueName: \"kubernetes.io/projected/bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120-kube-api-access-xfqjf\") pod \"nova-cell0-conductor-0\" (UID: \"bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120\") " 
pod="openstack/nova-cell0-conductor-0" Mar 20 09:08:43.826419 master-0 kubenswrapper[18707]: I0320 09:08:43.826360 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 09:08:44.337350 master-0 kubenswrapper[18707]: W0320 09:08:44.337287 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc5bbb1f_7ab6_46dd_9bd1_cdf24a6f6120.slice/crio-7eb260b4c8fea0d429a620ff7f7406e678d16d2dfaf040bc847592179d2ac43a WatchSource:0}: Error finding container 7eb260b4c8fea0d429a620ff7f7406e678d16d2dfaf040bc847592179d2ac43a: Status 404 returned error can't find the container with id 7eb260b4c8fea0d429a620ff7f7406e678d16d2dfaf040bc847592179d2ac43a Mar 20 09:08:44.340694 master-0 kubenswrapper[18707]: I0320 09:08:44.340653 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 09:08:44.393034 master-0 kubenswrapper[18707]: I0320 09:08:44.392923 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120","Type":"ContainerStarted","Data":"7eb260b4c8fea0d429a620ff7f7406e678d16d2dfaf040bc847592179d2ac43a"} Mar 20 09:08:45.411079 master-0 kubenswrapper[18707]: I0320 09:08:45.410954 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bc5bbb1f-7ab6-46dd-9bd1-cdf24a6f6120","Type":"ContainerStarted","Data":"0cafeb0b6f2519285b02f1eafb75d9126cd29f78154bb4530e315e40d53628bf"} Mar 20 09:08:45.412289 master-0 kubenswrapper[18707]: I0320 09:08:45.412236 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 09:08:45.529315 master-0 kubenswrapper[18707]: I0320 09:08:45.529150 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" 
podStartSLOduration=2.529109239 podStartE2EDuration="2.529109239s" podCreationTimestamp="2026-03-20 09:08:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:08:45.520266356 +0000 UTC m=+1670.676446732" watchObservedRunningTime="2026-03-20 09:08:45.529109239 +0000 UTC m=+1670.685289595" Mar 20 09:08:53.856409 master-0 kubenswrapper[18707]: I0320 09:08:53.856229 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 09:08:54.982547 master-0 kubenswrapper[18707]: I0320 09:08:54.982447 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-czjjp"] Mar 20 09:08:54.984628 master-0 kubenswrapper[18707]: I0320 09:08:54.984371 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:54.986908 master-0 kubenswrapper[18707]: I0320 09:08:54.986854 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 09:08:54.986992 master-0 kubenswrapper[18707]: I0320 09:08:54.986928 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 09:08:55.047476 master-0 kubenswrapper[18707]: I0320 09:08:55.047394 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-scripts\") pod \"nova-cell0-cell-mapping-czjjp\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:55.047741 master-0 kubenswrapper[18707]: I0320 09:08:55.047604 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-czjjp\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:55.047835 master-0 kubenswrapper[18707]: I0320 09:08:55.047804 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-config-data\") pod \"nova-cell0-cell-mapping-czjjp\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:55.047950 master-0 kubenswrapper[18707]: I0320 09:08:55.047926 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7s7z\" (UniqueName: \"kubernetes.io/projected/1f93103b-ad0e-4911-a6a3-003be4b823f4-kube-api-access-z7s7z\") pod \"nova-cell0-cell-mapping-czjjp\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:55.090252 master-0 kubenswrapper[18707]: I0320 09:08:55.090168 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-czjjp"] Mar 20 09:08:55.148774 master-0 kubenswrapper[18707]: I0320 09:08:55.148711 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-config-data\") pod \"nova-cell0-cell-mapping-czjjp\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:55.149067 master-0 kubenswrapper[18707]: I0320 09:08:55.148787 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7s7z\" (UniqueName: \"kubernetes.io/projected/1f93103b-ad0e-4911-a6a3-003be4b823f4-kube-api-access-z7s7z\") pod \"nova-cell0-cell-mapping-czjjp\" (UID: 
\"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:55.149136 master-0 kubenswrapper[18707]: I0320 09:08:55.149098 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-scripts\") pod \"nova-cell0-cell-mapping-czjjp\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:55.149219 master-0 kubenswrapper[18707]: I0320 09:08:55.149153 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-czjjp\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:55.151115 master-0 kubenswrapper[18707]: I0320 09:08:55.151074 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 09:08:55.152852 master-0 kubenswrapper[18707]: I0320 09:08:55.152818 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-czjjp\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:55.153194 master-0 kubenswrapper[18707]: I0320 09:08:55.153152 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 09:08:55.164071 master-0 kubenswrapper[18707]: I0320 09:08:55.164022 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-config-data\") pod \"nova-cell0-cell-mapping-czjjp\" (UID: 
\"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:55.173701 master-0 kubenswrapper[18707]: I0320 09:08:55.168935 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-scripts\") pod \"nova-cell0-cell-mapping-czjjp\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:55.261411 master-0 kubenswrapper[18707]: I0320 09:08:55.261224 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7s7z\" (UniqueName: \"kubernetes.io/projected/1f93103b-ad0e-4911-a6a3-003be4b823f4-kube-api-access-z7s7z\") pod \"nova-cell0-cell-mapping-czjjp\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:55.307247 master-0 kubenswrapper[18707]: I0320 09:08:55.307157 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:08:56.130304 master-0 kubenswrapper[18707]: I0320 09:08:56.130130 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-czjjp"] Mar 20 09:08:56.313764 master-0 kubenswrapper[18707]: I0320 09:08:56.307303 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcpbl"] Mar 20 09:08:56.313764 master-0 kubenswrapper[18707]: I0320 09:08:56.313028 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.332545 master-0 kubenswrapper[18707]: I0320 09:08:56.321425 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 09:08:56.332545 master-0 kubenswrapper[18707]: I0320 09:08:56.328102 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 09:08:56.409745 master-0 kubenswrapper[18707]: I0320 09:08:56.407780 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 09:08:56.411000 master-0 kubenswrapper[18707]: I0320 09:08:56.410945 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-config-data\") pod \"nova-cell1-conductor-db-sync-bcpbl\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.411682 master-0 kubenswrapper[18707]: I0320 09:08:56.411648 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bcpbl\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.411776 master-0 kubenswrapper[18707]: I0320 09:08:56.411699 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:08:56.412116 master-0 kubenswrapper[18707]: I0320 09:08:56.412066 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-scripts\") pod \"nova-cell1-conductor-db-sync-bcpbl\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.412283 master-0 kubenswrapper[18707]: I0320 09:08:56.412254 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnflw\" (UniqueName: \"kubernetes.io/projected/c5f8abfc-e320-410d-bb6b-b5055c9fc454-kube-api-access-dnflw\") pod \"nova-cell1-conductor-db-sync-bcpbl\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.427758 master-0 kubenswrapper[18707]: I0320 09:08:56.425588 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 09:08:56.468429 master-0 kubenswrapper[18707]: I0320 09:08:56.468105 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcpbl"] Mar 20 09:08:56.489593 master-0 kubenswrapper[18707]: I0320 09:08:56.489525 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 09:08:56.492057 master-0 kubenswrapper[18707]: I0320 09:08:56.492019 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:08:56.493899 master-0 kubenswrapper[18707]: I0320 09:08:56.493865 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 09:08:56.508069 master-0 kubenswrapper[18707]: I0320 09:08:56.508021 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:08:56.515106 master-0 kubenswrapper[18707]: I0320 09:08:56.515057 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-scripts\") pod \"nova-cell1-conductor-db-sync-bcpbl\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.515211 master-0 kubenswrapper[18707]: I0320 09:08:56.515122 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8fn8\" (UniqueName: \"kubernetes.io/projected/ff5a931d-3951-482d-88a0-eefac071090a-kube-api-access-t8fn8\") pod \"nova-api-0\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " pod="openstack/nova-api-0" Mar 20 09:08:56.515266 master-0 kubenswrapper[18707]: I0320 09:08:56.515219 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnflw\" (UniqueName: \"kubernetes.io/projected/c5f8abfc-e320-410d-bb6b-b5055c9fc454-kube-api-access-dnflw\") pod \"nova-cell1-conductor-db-sync-bcpbl\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.515266 master-0 kubenswrapper[18707]: I0320 09:08:56.515249 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5a931d-3951-482d-88a0-eefac071090a-config-data\") pod \"nova-api-0\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " 
pod="openstack/nova-api-0" Mar 20 09:08:56.515348 master-0 kubenswrapper[18707]: I0320 09:08:56.515270 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-config-data\") pod \"nova-cell1-conductor-db-sync-bcpbl\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.515348 master-0 kubenswrapper[18707]: I0320 09:08:56.515340 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff5a931d-3951-482d-88a0-eefac071090a-logs\") pod \"nova-api-0\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " pod="openstack/nova-api-0" Mar 20 09:08:56.515439 master-0 kubenswrapper[18707]: I0320 09:08:56.515417 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bcpbl\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.515528 master-0 kubenswrapper[18707]: I0320 09:08:56.515496 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5a931d-3951-482d-88a0-eefac071090a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " pod="openstack/nova-api-0" Mar 20 09:08:56.519728 master-0 kubenswrapper[18707]: I0320 09:08:56.519676 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-config-data\") pod \"nova-cell1-conductor-db-sync-bcpbl\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " 
pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.520014 master-0 kubenswrapper[18707]: I0320 09:08:56.519969 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-scripts\") pod \"nova-cell1-conductor-db-sync-bcpbl\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.520081 master-0 kubenswrapper[18707]: I0320 09:08:56.520017 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bcpbl\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.520465 master-0 kubenswrapper[18707]: I0320 09:08:56.520434 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:08:56.522286 master-0 kubenswrapper[18707]: I0320 09:08:56.522255 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:08:56.524519 master-0 kubenswrapper[18707]: I0320 09:08:56.524498 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 09:08:56.532100 master-0 kubenswrapper[18707]: I0320 09:08:56.532043 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 09:08:56.558070 master-0 kubenswrapper[18707]: I0320 09:08:56.558003 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-czjjp" event={"ID":"1f93103b-ad0e-4911-a6a3-003be4b823f4","Type":"ContainerStarted","Data":"0ca4671785f530edea35237ca9107943b49d9f5401388b2e17bc54478b9e5b74"} Mar 20 09:08:56.561874 master-0 kubenswrapper[18707]: I0320 09:08:56.560023 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:08:56.598289 master-0 kubenswrapper[18707]: I0320 09:08:56.598154 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnflw\" (UniqueName: \"kubernetes.io/projected/c5f8abfc-e320-410d-bb6b-b5055c9fc454-kube-api-access-dnflw\") pod \"nova-cell1-conductor-db-sync-bcpbl\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.619299 master-0 kubenswrapper[18707]: I0320 09:08:56.618203 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5a931d-3951-482d-88a0-eefac071090a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " pod="openstack/nova-api-0" Mar 20 09:08:56.619299 master-0 kubenswrapper[18707]: I0320 09:08:56.618265 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nd7c\" (UniqueName: 
\"kubernetes.io/projected/21d04a7e-7a88-496f-b2ed-65183508c0af-kube-api-access-9nd7c\") pod \"nova-scheduler-0\" (UID: \"21d04a7e-7a88-496f-b2ed-65183508c0af\") " pod="openstack/nova-scheduler-0" Mar 20 09:08:56.619299 master-0 kubenswrapper[18707]: I0320 09:08:56.618323 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d04a7e-7a88-496f-b2ed-65183508c0af-config-data\") pod \"nova-scheduler-0\" (UID: \"21d04a7e-7a88-496f-b2ed-65183508c0af\") " pod="openstack/nova-scheduler-0" Mar 20 09:08:56.619299 master-0 kubenswrapper[18707]: I0320 09:08:56.618413 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63076b6f-15f6-4802-bdaa-12c3d504d239-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"63076b6f-15f6-4802-bdaa-12c3d504d239\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:08:56.619299 master-0 kubenswrapper[18707]: I0320 09:08:56.618510 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8fn8\" (UniqueName: \"kubernetes.io/projected/ff5a931d-3951-482d-88a0-eefac071090a-kube-api-access-t8fn8\") pod \"nova-api-0\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " pod="openstack/nova-api-0" Mar 20 09:08:56.619299 master-0 kubenswrapper[18707]: I0320 09:08:56.618778 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5a931d-3951-482d-88a0-eefac071090a-config-data\") pod \"nova-api-0\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " pod="openstack/nova-api-0" Mar 20 09:08:56.619299 master-0 kubenswrapper[18707]: I0320 09:08:56.618834 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htdr8\" (UniqueName: 
\"kubernetes.io/projected/63076b6f-15f6-4802-bdaa-12c3d504d239-kube-api-access-htdr8\") pod \"nova-cell1-novncproxy-0\" (UID: \"63076b6f-15f6-4802-bdaa-12c3d504d239\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:08:56.619299 master-0 kubenswrapper[18707]: I0320 09:08:56.618879 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d04a7e-7a88-496f-b2ed-65183508c0af-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"21d04a7e-7a88-496f-b2ed-65183508c0af\") " pod="openstack/nova-scheduler-0" Mar 20 09:08:56.619299 master-0 kubenswrapper[18707]: I0320 09:08:56.618934 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff5a931d-3951-482d-88a0-eefac071090a-logs\") pod \"nova-api-0\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " pod="openstack/nova-api-0" Mar 20 09:08:56.619299 master-0 kubenswrapper[18707]: I0320 09:08:56.619085 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63076b6f-15f6-4802-bdaa-12c3d504d239-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"63076b6f-15f6-4802-bdaa-12c3d504d239\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:08:56.619923 master-0 kubenswrapper[18707]: I0320 09:08:56.619752 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff5a931d-3951-482d-88a0-eefac071090a-logs\") pod \"nova-api-0\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " pod="openstack/nova-api-0" Mar 20 09:08:56.622615 master-0 kubenswrapper[18707]: I0320 09:08:56.622573 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5a931d-3951-482d-88a0-eefac071090a-config-data\") pod \"nova-api-0\" (UID: 
\"ff5a931d-3951-482d-88a0-eefac071090a\") " pod="openstack/nova-api-0" Mar 20 09:08:56.623249 master-0 kubenswrapper[18707]: I0320 09:08:56.623220 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5a931d-3951-482d-88a0-eefac071090a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " pod="openstack/nova-api-0" Mar 20 09:08:56.722052 master-0 kubenswrapper[18707]: I0320 09:08:56.721975 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nd7c\" (UniqueName: \"kubernetes.io/projected/21d04a7e-7a88-496f-b2ed-65183508c0af-kube-api-access-9nd7c\") pod \"nova-scheduler-0\" (UID: \"21d04a7e-7a88-496f-b2ed-65183508c0af\") " pod="openstack/nova-scheduler-0" Mar 20 09:08:56.722286 master-0 kubenswrapper[18707]: I0320 09:08:56.722227 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d04a7e-7a88-496f-b2ed-65183508c0af-config-data\") pod \"nova-scheduler-0\" (UID: \"21d04a7e-7a88-496f-b2ed-65183508c0af\") " pod="openstack/nova-scheduler-0" Mar 20 09:08:56.722459 master-0 kubenswrapper[18707]: I0320 09:08:56.722424 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63076b6f-15f6-4802-bdaa-12c3d504d239-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"63076b6f-15f6-4802-bdaa-12c3d504d239\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:08:56.722718 master-0 kubenswrapper[18707]: I0320 09:08:56.722688 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htdr8\" (UniqueName: \"kubernetes.io/projected/63076b6f-15f6-4802-bdaa-12c3d504d239-kube-api-access-htdr8\") pod \"nova-cell1-novncproxy-0\" (UID: \"63076b6f-15f6-4802-bdaa-12c3d504d239\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:08:56.722784 master-0 kubenswrapper[18707]: I0320 09:08:56.722735 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d04a7e-7a88-496f-b2ed-65183508c0af-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"21d04a7e-7a88-496f-b2ed-65183508c0af\") " pod="openstack/nova-scheduler-0" Mar 20 09:08:56.722925 master-0 kubenswrapper[18707]: I0320 09:08:56.722895 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63076b6f-15f6-4802-bdaa-12c3d504d239-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"63076b6f-15f6-4802-bdaa-12c3d504d239\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:08:56.723575 master-0 kubenswrapper[18707]: I0320 09:08:56.723541 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8fn8\" (UniqueName: \"kubernetes.io/projected/ff5a931d-3951-482d-88a0-eefac071090a-kube-api-access-t8fn8\") pod \"nova-api-0\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " pod="openstack/nova-api-0" Mar 20 09:08:56.726942 master-0 kubenswrapper[18707]: I0320 09:08:56.726898 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:08:56.727091 master-0 kubenswrapper[18707]: I0320 09:08:56.727063 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63076b6f-15f6-4802-bdaa-12c3d504d239-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"63076b6f-15f6-4802-bdaa-12c3d504d239\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:08:56.727410 master-0 kubenswrapper[18707]: I0320 09:08:56.727386 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63076b6f-15f6-4802-bdaa-12c3d504d239-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"63076b6f-15f6-4802-bdaa-12c3d504d239\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:08:56.727544 master-0 kubenswrapper[18707]: I0320 09:08:56.727496 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d04a7e-7a88-496f-b2ed-65183508c0af-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"21d04a7e-7a88-496f-b2ed-65183508c0af\") " pod="openstack/nova-scheduler-0" Mar 20 09:08:56.727900 master-0 kubenswrapper[18707]: I0320 09:08:56.727808 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d04a7e-7a88-496f-b2ed-65183508c0af-config-data\") pod \"nova-scheduler-0\" (UID: \"21d04a7e-7a88-496f-b2ed-65183508c0af\") " pod="openstack/nova-scheduler-0" Mar 20 09:08:56.781420 master-0 kubenswrapper[18707]: I0320 09:08:56.781335 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:08:56.819716 master-0 kubenswrapper[18707]: I0320 09:08:56.819480 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:08:56.822752 master-0 kubenswrapper[18707]: I0320 09:08:56.822700 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:08:56.826282 master-0 kubenswrapper[18707]: I0320 09:08:56.826168 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 09:08:56.936484 master-0 kubenswrapper[18707]: I0320 09:08:56.932146 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f123c6c-5e29-43af-9378-b7f8ee76b81c-config-data\") pod \"nova-metadata-0\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " pod="openstack/nova-metadata-0" Mar 20 09:08:56.936484 master-0 kubenswrapper[18707]: I0320 09:08:56.932288 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f123c6c-5e29-43af-9378-b7f8ee76b81c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " pod="openstack/nova-metadata-0" Mar 20 09:08:56.936484 master-0 kubenswrapper[18707]: I0320 09:08:56.932579 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5t2r\" (UniqueName: \"kubernetes.io/projected/4f123c6c-5e29-43af-9378-b7f8ee76b81c-kube-api-access-r5t2r\") pod \"nova-metadata-0\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " pod="openstack/nova-metadata-0" Mar 20 09:08:56.936484 master-0 kubenswrapper[18707]: I0320 09:08:56.933346 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4f123c6c-5e29-43af-9378-b7f8ee76b81c-logs\") pod \"nova-metadata-0\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " pod="openstack/nova-metadata-0" Mar 20 09:08:57.049639 master-0 kubenswrapper[18707]: I0320 09:08:57.027071 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:08:57.049639 master-0 kubenswrapper[18707]: I0320 09:08:57.039232 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nd7c\" (UniqueName: \"kubernetes.io/projected/21d04a7e-7a88-496f-b2ed-65183508c0af-kube-api-access-9nd7c\") pod \"nova-scheduler-0\" (UID: \"21d04a7e-7a88-496f-b2ed-65183508c0af\") " pod="openstack/nova-scheduler-0" Mar 20 09:08:57.049639 master-0 kubenswrapper[18707]: I0320 09:08:57.042317 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f123c6c-5e29-43af-9378-b7f8ee76b81c-logs\") pod \"nova-metadata-0\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " pod="openstack/nova-metadata-0" Mar 20 09:08:57.049639 master-0 kubenswrapper[18707]: I0320 09:08:57.042613 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f123c6c-5e29-43af-9378-b7f8ee76b81c-config-data\") pod \"nova-metadata-0\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " pod="openstack/nova-metadata-0" Mar 20 09:08:57.049639 master-0 kubenswrapper[18707]: I0320 09:08:57.042665 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f123c6c-5e29-43af-9378-b7f8ee76b81c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " pod="openstack/nova-metadata-0" Mar 20 09:08:57.049639 master-0 kubenswrapper[18707]: I0320 09:08:57.042942 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r5t2r\" (UniqueName: \"kubernetes.io/projected/4f123c6c-5e29-43af-9378-b7f8ee76b81c-kube-api-access-r5t2r\") pod \"nova-metadata-0\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " pod="openstack/nova-metadata-0" Mar 20 09:08:57.049639 master-0 kubenswrapper[18707]: I0320 09:08:57.043793 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htdr8\" (UniqueName: \"kubernetes.io/projected/63076b6f-15f6-4802-bdaa-12c3d504d239-kube-api-access-htdr8\") pod \"nova-cell1-novncproxy-0\" (UID: \"63076b6f-15f6-4802-bdaa-12c3d504d239\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:08:57.049639 master-0 kubenswrapper[18707]: I0320 09:08:57.046069 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f123c6c-5e29-43af-9378-b7f8ee76b81c-logs\") pod \"nova-metadata-0\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " pod="openstack/nova-metadata-0" Mar 20 09:08:57.050864 master-0 kubenswrapper[18707]: I0320 09:08:57.050820 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f123c6c-5e29-43af-9378-b7f8ee76b81c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " pod="openstack/nova-metadata-0" Mar 20 09:08:57.065432 master-0 kubenswrapper[18707]: I0320 09:08:57.062455 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f123c6c-5e29-43af-9378-b7f8ee76b81c-config-data\") pod \"nova-metadata-0\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " pod="openstack/nova-metadata-0" Mar 20 09:08:57.116886 master-0 kubenswrapper[18707]: I0320 09:08:57.097323 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5t2r\" (UniqueName: \"kubernetes.io/projected/4f123c6c-5e29-43af-9378-b7f8ee76b81c-kube-api-access-r5t2r\") pod 
\"nova-metadata-0\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " pod="openstack/nova-metadata-0" Mar 20 09:08:57.118682 master-0 kubenswrapper[18707]: I0320 09:08:57.117723 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:08:57.185030 master-0 kubenswrapper[18707]: I0320 09:08:57.162066 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:08:57.267029 master-0 kubenswrapper[18707]: I0320 09:08:57.266944 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:08:57.600951 master-0 kubenswrapper[18707]: I0320 09:08:57.600871 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-czjjp" event={"ID":"1f93103b-ad0e-4911-a6a3-003be4b823f4","Type":"ContainerStarted","Data":"fe7fd02d21ebaa61cbde631fde06c10b7a747b5bcbd963dbb611920c73fc8549"} Mar 20 09:08:57.658213 master-0 kubenswrapper[18707]: I0320 09:08:57.646397 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcpbl"] Mar 20 09:08:57.669763 master-0 kubenswrapper[18707]: W0320 09:08:57.669690 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5f8abfc_e320_410d_bb6b_b5055c9fc454.slice/crio-6c60649a627301428da55fe8fdaa398612a2de266b036fe068c5fbc04e5752f8 WatchSource:0}: Error finding container 6c60649a627301428da55fe8fdaa398612a2de266b036fe068c5fbc04e5752f8: Status 404 returned error can't find the container with id 6c60649a627301428da55fe8fdaa398612a2de266b036fe068c5fbc04e5752f8 Mar 20 09:08:57.676372 master-0 kubenswrapper[18707]: I0320 09:08:57.676302 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75597d7cbf-d9w8g"] Mar 20 09:08:57.682976 master-0 kubenswrapper[18707]: I0320 09:08:57.682935 18707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.686686 master-0 kubenswrapper[18707]: I0320 09:08:57.686618 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:08:57.702756 master-0 kubenswrapper[18707]: I0320 09:08:57.702619 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-czjjp" podStartSLOduration=3.702604428 podStartE2EDuration="3.702604428s" podCreationTimestamp="2026-03-20 09:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:08:57.701812795 +0000 UTC m=+1682.857993151" watchObservedRunningTime="2026-03-20 09:08:57.702604428 +0000 UTC m=+1682.858784784" Mar 20 09:08:57.763916 master-0 kubenswrapper[18707]: I0320 09:08:57.763827 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75597d7cbf-d9w8g"] Mar 20 09:08:57.827785 master-0 kubenswrapper[18707]: I0320 09:08:57.827715 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-edpm-b\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.828210 master-0 kubenswrapper[18707]: I0320 09:08:57.828154 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-config\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.828339 master-0 kubenswrapper[18707]: I0320 09:08:57.828320 18707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-dns-svc\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.828642 master-0 kubenswrapper[18707]: I0320 09:08:57.828620 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-edpm-a\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.828764 master-0 kubenswrapper[18707]: I0320 09:08:57.828742 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-dns-swift-storage-0\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.829047 master-0 kubenswrapper[18707]: I0320 09:08:57.829023 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-ovsdbserver-sb\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.829403 master-0 kubenswrapper[18707]: I0320 09:08:57.829324 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggxp9\" (UniqueName: \"kubernetes.io/projected/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-kube-api-access-ggxp9\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 
20 09:08:57.829725 master-0 kubenswrapper[18707]: I0320 09:08:57.829698 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-ovsdbserver-nb\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.934243 master-0 kubenswrapper[18707]: I0320 09:08:57.934149 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggxp9\" (UniqueName: \"kubernetes.io/projected/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-kube-api-access-ggxp9\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.934500 master-0 kubenswrapper[18707]: I0320 09:08:57.934341 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-ovsdbserver-nb\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.934926 master-0 kubenswrapper[18707]: I0320 09:08:57.934890 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-edpm-b\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.935346 master-0 kubenswrapper[18707]: I0320 09:08:57.935265 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-config\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " 
pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.935481 master-0 kubenswrapper[18707]: I0320 09:08:57.935453 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-dns-svc\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.935990 master-0 kubenswrapper[18707]: I0320 09:08:57.935927 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-ovsdbserver-nb\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.936281 master-0 kubenswrapper[18707]: I0320 09:08:57.936167 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-edpm-a\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.936505 master-0 kubenswrapper[18707]: I0320 09:08:57.936459 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-dns-swift-storage-0\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.937166 master-0 kubenswrapper[18707]: I0320 09:08:57.937123 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-ovsdbserver-sb\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " 
pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.937418 master-0 kubenswrapper[18707]: I0320 09:08:57.937384 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-config\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.937511 master-0 kubenswrapper[18707]: I0320 09:08:57.936565 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-edpm-b\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.938023 master-0 kubenswrapper[18707]: I0320 09:08:57.937993 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-dns-svc\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.939795 master-0 kubenswrapper[18707]: I0320 09:08:57.939745 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-dns-swift-storage-0\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.941412 master-0 kubenswrapper[18707]: I0320 09:08:57.941380 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-ovsdbserver-sb\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 
09:08:57.963166 master-0 kubenswrapper[18707]: I0320 09:08:57.950446 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-edpm-a\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.970936 master-0 kubenswrapper[18707]: I0320 09:08:57.969639 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggxp9\" (UniqueName: \"kubernetes.io/projected/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-kube-api-access-ggxp9\") pod \"dnsmasq-dns-75597d7cbf-d9w8g\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:57.972000 master-0 kubenswrapper[18707]: I0320 09:08:57.971911 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:08:58.050657 master-0 kubenswrapper[18707]: I0320 09:08:58.050586 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:08:58.101606 master-0 kubenswrapper[18707]: I0320 09:08:58.101548 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 09:08:58.294667 master-0 kubenswrapper[18707]: W0320 09:08:58.294605 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21d04a7e_7a88_496f_b2ed_65183508c0af.slice/crio-1dc7e271690697646640142269b3ddbfea37abb8641f1e4a912389d098f4a1e3 WatchSource:0}: Error finding container 1dc7e271690697646640142269b3ddbfea37abb8641f1e4a912389d098f4a1e3: Status 404 returned error can't find the container with id 1dc7e271690697646640142269b3ddbfea37abb8641f1e4a912389d098f4a1e3 Mar 20 09:08:58.310388 master-0 kubenswrapper[18707]: I0320 09:08:58.304589 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:08:58.641836 master-0 kubenswrapper[18707]: I0320 09:08:58.640166 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bcpbl" event={"ID":"c5f8abfc-e320-410d-bb6b-b5055c9fc454","Type":"ContainerStarted","Data":"b49b9299c3c3b8fbbf401298ef6ad785f7952b5e47ff00f1fd406349be7a18b2"} Mar 20 09:08:58.641836 master-0 kubenswrapper[18707]: I0320 09:08:58.640275 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bcpbl" event={"ID":"c5f8abfc-e320-410d-bb6b-b5055c9fc454","Type":"ContainerStarted","Data":"6c60649a627301428da55fe8fdaa398612a2de266b036fe068c5fbc04e5752f8"} Mar 20 09:08:58.652141 master-0 kubenswrapper[18707]: I0320 09:08:58.652032 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"21d04a7e-7a88-496f-b2ed-65183508c0af","Type":"ContainerStarted","Data":"1dc7e271690697646640142269b3ddbfea37abb8641f1e4a912389d098f4a1e3"} Mar 20 09:08:58.663177 master-0 
kubenswrapper[18707]: I0320 09:08:58.663107 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f123c6c-5e29-43af-9378-b7f8ee76b81c","Type":"ContainerStarted","Data":"2ac4bf6a41d2e3a09e8eae5764b40c29c2aa296ebeb97630e3f4c5e3833809bb"} Mar 20 09:08:58.671602 master-0 kubenswrapper[18707]: I0320 09:08:58.671518 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff5a931d-3951-482d-88a0-eefac071090a","Type":"ContainerStarted","Data":"7cba5c4f3a057c2ea26d1dbec2730e5d9fc72d0f59d4817550c3702d5952856f"} Mar 20 09:08:58.673691 master-0 kubenswrapper[18707]: I0320 09:08:58.673642 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"63076b6f-15f6-4802-bdaa-12c3d504d239","Type":"ContainerStarted","Data":"a565abb05da1fa95796369f48ed3181a754751b03d1b5568c51f5655146f88bd"} Mar 20 09:08:58.704264 master-0 kubenswrapper[18707]: I0320 09:08:58.699848 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bcpbl" podStartSLOduration=2.699820454 podStartE2EDuration="2.699820454s" podCreationTimestamp="2026-03-20 09:08:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:08:58.68950398 +0000 UTC m=+1683.845684336" watchObservedRunningTime="2026-03-20 09:08:58.699820454 +0000 UTC m=+1683.856000810" Mar 20 09:08:58.933370 master-0 kubenswrapper[18707]: I0320 09:08:58.933130 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75597d7cbf-d9w8g"] Mar 20 09:08:59.691151 master-0 kubenswrapper[18707]: I0320 09:08:59.691094 18707 generic.go:334] "Generic (PLEG): container finished" podID="d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" containerID="61db0defc56b28a464d5616caeffb9183f0071b2d7a89a8035cb4767a9f835c8" exitCode=0 Mar 20 09:08:59.692728 master-0 
kubenswrapper[18707]: I0320 09:08:59.692696 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" event={"ID":"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340","Type":"ContainerDied","Data":"61db0defc56b28a464d5616caeffb9183f0071b2d7a89a8035cb4767a9f835c8"} Mar 20 09:08:59.692899 master-0 kubenswrapper[18707]: I0320 09:08:59.692735 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" event={"ID":"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340","Type":"ContainerStarted","Data":"fc50cdebf51cd7d051e634fd551cd14c4521e6a5b9d2081bfd63c41b1c0a6082"} Mar 20 09:09:00.724606 master-0 kubenswrapper[18707]: I0320 09:09:00.724539 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" event={"ID":"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340","Type":"ContainerStarted","Data":"36c08af42bc218440f37751dca61d756d06093a6314c88456efff4cd6b669eb4"} Mar 20 09:09:00.725513 master-0 kubenswrapper[18707]: I0320 09:09:00.725008 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:09:04.796794 master-0 kubenswrapper[18707]: I0320 09:09:04.793368 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff5a931d-3951-482d-88a0-eefac071090a","Type":"ContainerStarted","Data":"9b66d44c659d2aeda44895720ee5568310584c0e8b0240ec21d2621b8e7d8f46"} Mar 20 09:09:04.796794 master-0 kubenswrapper[18707]: I0320 09:09:04.794981 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"63076b6f-15f6-4802-bdaa-12c3d504d239","Type":"ContainerStarted","Data":"94ddf8292a7594d86823c17e01a86c565ba77c45bc280d391d285c5d478688bb"} Mar 20 09:09:04.807737 master-0 kubenswrapper[18707]: I0320 09:09:04.797081 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"21d04a7e-7a88-496f-b2ed-65183508c0af","Type":"ContainerStarted","Data":"c0a20a321758b89a7197046f590fbf478d8bb8ff29a8be0633a162d93309aede"} Mar 20 09:09:04.807737 master-0 kubenswrapper[18707]: I0320 09:09:04.798884 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f123c6c-5e29-43af-9378-b7f8ee76b81c","Type":"ContainerStarted","Data":"1aed233d8dd33f8e88f726b178e898cd6aa9c5d65d1656f00f71ca241561a719"} Mar 20 09:09:05.824135 master-0 kubenswrapper[18707]: I0320 09:09:05.824048 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f123c6c-5e29-43af-9378-b7f8ee76b81c","Type":"ContainerStarted","Data":"bab17a153100b5d15c2495320b986a4d103813d15314176f773c1ff3132efa42"} Mar 20 09:09:05.827976 master-0 kubenswrapper[18707]: I0320 09:09:05.827936 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff5a931d-3951-482d-88a0-eefac071090a","Type":"ContainerStarted","Data":"b82cd12eb0349521e54c4271e337d2fa464ce2aca64ca51c414337a3711ae639"} Mar 20 09:09:06.782338 master-0 kubenswrapper[18707]: I0320 09:09:06.782239 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 09:09:06.783241 master-0 kubenswrapper[18707]: I0320 09:09:06.783222 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 09:09:06.847579 master-0 kubenswrapper[18707]: I0320 09:09:06.847485 18707 generic.go:334] "Generic (PLEG): container finished" podID="1f93103b-ad0e-4911-a6a3-003be4b823f4" containerID="fe7fd02d21ebaa61cbde631fde06c10b7a747b5bcbd963dbb611920c73fc8549" exitCode=0 Mar 20 09:09:06.848227 master-0 kubenswrapper[18707]: I0320 09:09:06.848022 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-czjjp" 
event={"ID":"1f93103b-ad0e-4911-a6a3-003be4b823f4","Type":"ContainerDied","Data":"fe7fd02d21ebaa61cbde631fde06c10b7a747b5bcbd963dbb611920c73fc8549"} Mar 20 09:09:07.119354 master-0 kubenswrapper[18707]: I0320 09:09:07.119252 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:07.119649 master-0 kubenswrapper[18707]: I0320 09:09:07.119482 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:07.150602 master-0 kubenswrapper[18707]: I0320 09:09:07.148536 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:07.171396 master-0 kubenswrapper[18707]: I0320 09:09:07.163886 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 09:09:07.171396 master-0 kubenswrapper[18707]: I0320 09:09:07.163952 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 09:09:07.268511 master-0 kubenswrapper[18707]: I0320 09:09:07.268444 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 09:09:07.268874 master-0 kubenswrapper[18707]: I0320 09:09:07.268850 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 09:09:07.300399 master-0 kubenswrapper[18707]: I0320 09:09:07.300318 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 09:09:07.869684 master-0 kubenswrapper[18707]: I0320 09:09:07.869587 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ff5a931d-3951-482d-88a0-eefac071090a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.0.251:8774/\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Mar 20 09:09:07.870640 master-0 kubenswrapper[18707]: I0320 09:09:07.869764 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ff5a931d-3951-482d-88a0-eefac071090a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.0.251:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 09:09:07.887696 master-0 kubenswrapper[18707]: I0320 09:09:07.887619 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:07.910420 master-0 kubenswrapper[18707]: I0320 09:09:07.910338 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 09:09:08.053655 master-0 kubenswrapper[18707]: I0320 09:09:08.053496 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:09:08.246777 master-0 kubenswrapper[18707]: I0320 09:09:08.246550 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.128.0.254:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 09:09:08.247062 master-0 kubenswrapper[18707]: I0320 09:09:08.246859 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.128.0.254:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 09:09:08.457376 master-0 kubenswrapper[18707]: I0320 09:09:08.457317 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:09:08.883257 master-0 kubenswrapper[18707]: I0320 09:09:08.883145 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-czjjp" Mar 20 09:09:08.883731 master-0 kubenswrapper[18707]: I0320 09:09:08.883450 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-czjjp" event={"ID":"1f93103b-ad0e-4911-a6a3-003be4b823f4","Type":"ContainerDied","Data":"0ca4671785f530edea35237ca9107943b49d9f5401388b2e17bc54478b9e5b74"} Mar 20 09:09:08.883731 master-0 kubenswrapper[18707]: I0320 09:09:08.883479 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ca4671785f530edea35237ca9107943b49d9f5401388b2e17bc54478b9e5b74" Mar 20 09:09:10.523016 master-0 kubenswrapper[18707]: E0320 09:09:10.522915 18707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:09:00Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:09:00Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:09:00Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:09:00Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 09:09:10.906570 master-0 kubenswrapper[18707]: I0320 09:09:10.906500 18707 generic.go:334] "Generic (PLEG): container finished" podID="e0736ead-ff56-4b8e-b654-e8b3f5d1f702" containerID="3d2f0c99734566cf7cfee7855e2c1019b1aa135fa80b03a461c0c6e99f8f3a81" exitCode=1 Mar 20 09:09:10.906795 master-0 kubenswrapper[18707]: I0320 09:09:10.906593 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48" event={"ID":"e0736ead-ff56-4b8e-b654-e8b3f5d1f702","Type":"ContainerDied","Data":"3d2f0c99734566cf7cfee7855e2c1019b1aa135fa80b03a461c0c6e99f8f3a81"} Mar 20 09:09:10.907658 master-0 kubenswrapper[18707]: I0320 09:09:10.907605 18707 scope.go:117] "RemoveContainer" containerID="3d2f0c99734566cf7cfee7855e2c1019b1aa135fa80b03a461c0c6e99f8f3a81" Mar 20 09:09:10.911066 master-0 kubenswrapper[18707]: I0320 09:09:10.911030 18707 generic.go:334] "Generic (PLEG): container finished" podID="c5f8abfc-e320-410d-bb6b-b5055c9fc454" containerID="b49b9299c3c3b8fbbf401298ef6ad785f7952b5e47ff00f1fd406349be7a18b2" exitCode=0 Mar 20 09:09:10.911129 master-0 kubenswrapper[18707]: I0320 09:09:10.911062 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bcpbl" event={"ID":"c5f8abfc-e320-410d-bb6b-b5055c9fc454","Type":"ContainerDied","Data":"b49b9299c3c3b8fbbf401298ef6ad785f7952b5e47ff00f1fd406349be7a18b2"} Mar 20 09:09:11.691500 master-0 kubenswrapper[18707]: I0320 09:09:11.691376 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" podUID="109b7903-cd46-4a38-93c2-87253251c130" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.155:8081/healthz\": dial tcp 10.128.0.155:8081: connect: connection refused" Mar 20 
09:09:11.691500 master-0 kubenswrapper[18707]: I0320 09:09:11.691376 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" podUID="109b7903-cd46-4a38-93c2-87253251c130" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.155:8081/readyz\": dial tcp 10.128.0.155:8081: connect: connection refused" Mar 20 09:09:11.943338 master-0 kubenswrapper[18707]: I0320 09:09:11.943267 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48" event={"ID":"e0736ead-ff56-4b8e-b654-e8b3f5d1f702","Type":"ContainerStarted","Data":"6d1b78c8df3d556ed0f64707a8c8928f04e5e1cdd99bc847171e787a53547d44"} Mar 20 09:09:12.370196 master-0 kubenswrapper[18707]: I0320 09:09:12.363466 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" podUID="24bbc2db-37fd-4bae-a9e5-9edb02f2c783" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.161:8081/readyz\": dial tcp 10.128.0.161:8081: connect: connection refused" Mar 20 09:09:12.370196 master-0 kubenswrapper[18707]: I0320 09:09:12.363697 18707 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" podUID="24bbc2db-37fd-4bae-a9e5-9edb02f2c783" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.161:8081/healthz\": dial tcp 10.128.0.161:8081: connect: connection refused" Mar 20 09:09:12.481093 master-0 kubenswrapper[18707]: I0320 09:09:12.481025 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:09:12.965950 master-0 kubenswrapper[18707]: I0320 09:09:12.965537 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bcpbl" Mar 20 09:09:12.965950 master-0 kubenswrapper[18707]: I0320 09:09:12.965561 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bcpbl" event={"ID":"c5f8abfc-e320-410d-bb6b-b5055c9fc454","Type":"ContainerDied","Data":"6c60649a627301428da55fe8fdaa398612a2de266b036fe068c5fbc04e5752f8"} Mar 20 09:09:12.965950 master-0 kubenswrapper[18707]: I0320 09:09:12.965619 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c60649a627301428da55fe8fdaa398612a2de266b036fe068c5fbc04e5752f8" Mar 20 09:09:12.972433 master-0 kubenswrapper[18707]: I0320 09:09:12.972369 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0e7a3622f1a5180efe08fda88825b245/kube-controller-manager/0.log" Mar 20 09:09:12.972433 master-0 kubenswrapper[18707]: I0320 09:09:12.972435 18707 generic.go:334] "Generic (PLEG): container finished" podID="0e7a3622f1a5180efe08fda88825b245" containerID="c4be5b1afaa795d923771de5c6a9f6ff1e511f892a95ee852af0f37c42ad2e9a" exitCode=1 Mar 20 09:09:12.972902 master-0 kubenswrapper[18707]: I0320 09:09:12.972507 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0e7a3622f1a5180efe08fda88825b245","Type":"ContainerDied","Data":"c4be5b1afaa795d923771de5c6a9f6ff1e511f892a95ee852af0f37c42ad2e9a"} Mar 20 09:09:12.973373 master-0 kubenswrapper[18707]: I0320 09:09:12.973326 18707 scope.go:117] "RemoveContainer" containerID="c4be5b1afaa795d923771de5c6a9f6ff1e511f892a95ee852af0f37c42ad2e9a" Mar 20 09:09:12.977925 master-0 kubenswrapper[18707]: I0320 09:09:12.977884 18707 generic.go:334] "Generic (PLEG): container finished" podID="b9627b82-4e5a-4ceb-a906-0657397e66e9" containerID="f53c606f4e16a36fa27913165478c4d3fb9a888f7b6c436d260e86fcf335ac1a" exitCode=1 Mar 20 
09:09:12.978008 master-0 kubenswrapper[18707]: I0320 09:09:12.977948 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2" event={"ID":"b9627b82-4e5a-4ceb-a906-0657397e66e9","Type":"ContainerDied","Data":"f53c606f4e16a36fa27913165478c4d3fb9a888f7b6c436d260e86fcf335ac1a"} Mar 20 09:09:12.978701 master-0 kubenswrapper[18707]: I0320 09:09:12.978484 18707 scope.go:117] "RemoveContainer" containerID="f53c606f4e16a36fa27913165478c4d3fb9a888f7b6c436d260e86fcf335ac1a" Mar 20 09:09:12.983608 master-0 kubenswrapper[18707]: I0320 09:09:12.983569 18707 generic.go:334] "Generic (PLEG): container finished" podID="24bbc2db-37fd-4bae-a9e5-9edb02f2c783" containerID="e6482c6356b9a5da416c9d5e6c0c378863aa5a586768f387dea975a4671d40f8" exitCode=1 Mar 20 09:09:12.983666 master-0 kubenswrapper[18707]: I0320 09:09:12.983633 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" event={"ID":"24bbc2db-37fd-4bae-a9e5-9edb02f2c783","Type":"ContainerDied","Data":"e6482c6356b9a5da416c9d5e6c0c378863aa5a586768f387dea975a4671d40f8"} Mar 20 09:09:12.984113 master-0 kubenswrapper[18707]: I0320 09:09:12.984081 18707 scope.go:117] "RemoveContainer" containerID="e6482c6356b9a5da416c9d5e6c0c378863aa5a586768f387dea975a4671d40f8" Mar 20 09:09:12.985352 master-0 kubenswrapper[18707]: I0320 09:09:12.985317 18707 generic.go:334] "Generic (PLEG): container finished" podID="109b7903-cd46-4a38-93c2-87253251c130" containerID="86d8f99eccf592938e10dac5be1c4825de6d5200d9e172d031c96eaa477ba53b" exitCode=1 Mar 20 09:09:12.985417 master-0 kubenswrapper[18707]: I0320 09:09:12.985378 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" event={"ID":"109b7903-cd46-4a38-93c2-87253251c130","Type":"ContainerDied","Data":"86d8f99eccf592938e10dac5be1c4825de6d5200d9e172d031c96eaa477ba53b"} Mar 
20 09:09:12.985719 master-0 kubenswrapper[18707]: I0320 09:09:12.985688 18707 scope.go:117] "RemoveContainer" containerID="86d8f99eccf592938e10dac5be1c4825de6d5200d9e172d031c96eaa477ba53b" Mar 20 09:09:12.987870 master-0 kubenswrapper[18707]: I0320 09:09:12.987799 18707 generic.go:334] "Generic (PLEG): container finished" podID="8053f444-54ad-4a79-8bac-8ee78d1d081b" containerID="891dfc58f84db38966e061317bdfe7ab0aa0a40372ac95b82f623fb3383a9447" exitCode=1 Mar 20 09:09:12.988654 master-0 kubenswrapper[18707]: I0320 09:09:12.988619 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" event={"ID":"8053f444-54ad-4a79-8bac-8ee78d1d081b","Type":"ContainerDied","Data":"891dfc58f84db38966e061317bdfe7ab0aa0a40372ac95b82f623fb3383a9447"} Mar 20 09:09:12.988767 master-0 kubenswrapper[18707]: I0320 09:09:12.988733 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48" Mar 20 09:09:12.989287 master-0 kubenswrapper[18707]: I0320 09:09:12.989016 18707 scope.go:117] "RemoveContainer" containerID="891dfc58f84db38966e061317bdfe7ab0aa0a40372ac95b82f623fb3383a9447" Mar 20 09:09:13.035496 master-0 kubenswrapper[18707]: E0320 09:09:13.035412 18707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 09:09:13.641560 master-0 kubenswrapper[18707]: I0320 09:09:13.636276 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:09:14.009390 master-0 kubenswrapper[18707]: I0320 09:09:14.007905 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2" event={"ID":"b9627b82-4e5a-4ceb-a906-0657397e66e9","Type":"ContainerStarted","Data":"748d831dc963a885b594c46f7654bdae86735abb4c0f9c3f7c3553e44dafe84e"} Mar 20 09:09:14.009390 master-0 kubenswrapper[18707]: I0320 09:09:14.008173 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2" Mar 20 09:09:14.010465 master-0 kubenswrapper[18707]: I0320 09:09:14.010435 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" event={"ID":"24bbc2db-37fd-4bae-a9e5-9edb02f2c783","Type":"ContainerStarted","Data":"0745b31065c0dba74c210b229bb6eb83c04ebbb0acaf8aa6defeb96f7c6f4729"} Mar 20 09:09:14.010760 master-0 kubenswrapper[18707]: I0320 09:09:14.010732 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" Mar 20 09:09:14.012512 master-0 kubenswrapper[18707]: I0320 09:09:14.012464 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" event={"ID":"109b7903-cd46-4a38-93c2-87253251c130","Type":"ContainerStarted","Data":"0fcbf24ed85dc63ba289234f9dc53181e57b2d9fde8380580b7657143f0235a7"} Mar 20 09:09:14.012694 master-0 kubenswrapper[18707]: I0320 09:09:14.012677 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" Mar 20 09:09:14.015326 master-0 kubenswrapper[18707]: I0320 09:09:14.015282 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" event={"ID":"8053f444-54ad-4a79-8bac-8ee78d1d081b","Type":"ContainerStarted","Data":"62ff9f626a236e010c468de7cdca7e3779613a4d1063dd775b54e576ccb20f74"} 
Mar 20 09:09:14.015549 master-0 kubenswrapper[18707]: I0320 09:09:14.015529 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:09:14.568217 master-0 kubenswrapper[18707]: E0320 09:09:14.568049 18707 controller.go:195] "Failed to update lease" err="Operation cannot be fulfilled on leases.coordination.k8s.io \"master-0\": the object has been modified; please apply your changes to the latest version and try again" Mar 20 09:09:14.783119 master-0 kubenswrapper[18707]: I0320 09:09:14.782082 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 09:09:14.783119 master-0 kubenswrapper[18707]: I0320 09:09:14.782148 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 09:09:15.033658 master-0 kubenswrapper[18707]: I0320 09:09:15.033452 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0e7a3622f1a5180efe08fda88825b245/kube-controller-manager/0.log" Mar 20 09:09:15.035106 master-0 kubenswrapper[18707]: I0320 09:09:15.035050 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0e7a3622f1a5180efe08fda88825b245","Type":"ContainerStarted","Data":"bd0333db25b67ff2f4ede507be8edeb92570d2d78308e3835068aca0e4a7311d"} Mar 20 09:09:15.164015 master-0 kubenswrapper[18707]: I0320 09:09:15.163960 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 09:09:15.164316 master-0 kubenswrapper[18707]: I0320 09:09:15.164293 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 09:09:15.630215 master-0 kubenswrapper[18707]: I0320 09:09:15.630014 18707 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" podStartSLOduration=18.629993621 podStartE2EDuration="18.629993621s" podCreationTimestamp="2026-03-20 09:08:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:09:15.608127556 +0000 UTC m=+1700.764307922" watchObservedRunningTime="2026-03-20 09:09:15.629993621 +0000 UTC m=+1700.786173977" Mar 20 09:09:16.413114 master-0 kubenswrapper[18707]: I0320 09:09:16.412994 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=14.302646365 podStartE2EDuration="20.412965165s" podCreationTimestamp="2026-03-20 09:08:56 +0000 UTC" firstStartedPulling="2026-03-20 09:08:57.680405633 +0000 UTC m=+1682.836585989" lastFinishedPulling="2026-03-20 09:09:03.790724433 +0000 UTC m=+1688.946904789" observedRunningTime="2026-03-20 09:09:16.397881834 +0000 UTC m=+1701.554062190" watchObservedRunningTime="2026-03-20 09:09:16.412965165 +0000 UTC m=+1701.569145521" Mar 20 09:09:16.431378 master-0 kubenswrapper[18707]: I0320 09:09:16.431334 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-scripts\") pod \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " Mar 20 09:09:16.431566 master-0 kubenswrapper[18707]: I0320 09:09:16.431552 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-scripts\") pod \"1f93103b-ad0e-4911-a6a3-003be4b823f4\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " Mar 20 09:09:16.431907 master-0 kubenswrapper[18707]: I0320 09:09:16.431891 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-config-data\") pod \"1f93103b-ad0e-4911-a6a3-003be4b823f4\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " Mar 20 09:09:16.432055 master-0 kubenswrapper[18707]: I0320 09:09:16.432041 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-config-data\") pod \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " Mar 20 09:09:16.432244 master-0 kubenswrapper[18707]: I0320 09:09:16.432224 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7s7z\" (UniqueName: \"kubernetes.io/projected/1f93103b-ad0e-4911-a6a3-003be4b823f4-kube-api-access-z7s7z\") pod \"1f93103b-ad0e-4911-a6a3-003be4b823f4\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " Mar 20 09:09:16.432363 master-0 kubenswrapper[18707]: I0320 09:09:16.432345 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-combined-ca-bundle\") pod \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\" (UID: \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " Mar 20 09:09:16.432469 master-0 kubenswrapper[18707]: I0320 09:09:16.432454 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-combined-ca-bundle\") pod \"1f93103b-ad0e-4911-a6a3-003be4b823f4\" (UID: \"1f93103b-ad0e-4911-a6a3-003be4b823f4\") " Mar 20 09:09:16.432550 master-0 kubenswrapper[18707]: I0320 09:09:16.432539 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnflw\" (UniqueName: \"kubernetes.io/projected/c5f8abfc-e320-410d-bb6b-b5055c9fc454-kube-api-access-dnflw\") pod \"c5f8abfc-e320-410d-bb6b-b5055c9fc454\" (UID: 
\"c5f8abfc-e320-410d-bb6b-b5055c9fc454\") " Mar 20 09:09:16.435715 master-0 kubenswrapper[18707]: I0320 09:09:16.435675 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-scripts" (OuterVolumeSpecName: "scripts") pod "c5f8abfc-e320-410d-bb6b-b5055c9fc454" (UID: "c5f8abfc-e320-410d-bb6b-b5055c9fc454"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:16.437008 master-0 kubenswrapper[18707]: I0320 09:09:16.436800 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-scripts" (OuterVolumeSpecName: "scripts") pod "1f93103b-ad0e-4911-a6a3-003be4b823f4" (UID: "1f93103b-ad0e-4911-a6a3-003be4b823f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:16.437587 master-0 kubenswrapper[18707]: I0320 09:09:16.437516 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:16.437587 master-0 kubenswrapper[18707]: I0320 09:09:16.437537 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:16.441902 master-0 kubenswrapper[18707]: I0320 09:09:16.441833 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f8abfc-e320-410d-bb6b-b5055c9fc454-kube-api-access-dnflw" (OuterVolumeSpecName: "kube-api-access-dnflw") pod "c5f8abfc-e320-410d-bb6b-b5055c9fc454" (UID: "c5f8abfc-e320-410d-bb6b-b5055c9fc454"). InnerVolumeSpecName "kube-api-access-dnflw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:09:16.442794 master-0 kubenswrapper[18707]: I0320 09:09:16.442642 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f93103b-ad0e-4911-a6a3-003be4b823f4-kube-api-access-z7s7z" (OuterVolumeSpecName: "kube-api-access-z7s7z") pod "1f93103b-ad0e-4911-a6a3-003be4b823f4" (UID: "1f93103b-ad0e-4911-a6a3-003be4b823f4"). InnerVolumeSpecName "kube-api-access-z7s7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:09:16.467566 master-0 kubenswrapper[18707]: I0320 09:09:16.467480 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5f8abfc-e320-410d-bb6b-b5055c9fc454" (UID: "c5f8abfc-e320-410d-bb6b-b5055c9fc454"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:16.468710 master-0 kubenswrapper[18707]: I0320 09:09:16.468650 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-config-data" (OuterVolumeSpecName: "config-data") pod "1f93103b-ad0e-4911-a6a3-003be4b823f4" (UID: "1f93103b-ad0e-4911-a6a3-003be4b823f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:16.469347 master-0 kubenswrapper[18707]: I0320 09:09:16.469306 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f93103b-ad0e-4911-a6a3-003be4b823f4" (UID: "1f93103b-ad0e-4911-a6a3-003be4b823f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:16.482887 master-0 kubenswrapper[18707]: I0320 09:09:16.481881 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-config-data" (OuterVolumeSpecName: "config-data") pod "c5f8abfc-e320-410d-bb6b-b5055c9fc454" (UID: "c5f8abfc-e320-410d-bb6b-b5055c9fc454"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:16.543654 master-0 kubenswrapper[18707]: I0320 09:09:16.543597 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:16.543654 master-0 kubenswrapper[18707]: I0320 09:09:16.543636 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:16.543654 master-0 kubenswrapper[18707]: I0320 09:09:16.543646 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f8abfc-e320-410d-bb6b-b5055c9fc454-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:16.543654 master-0 kubenswrapper[18707]: I0320 09:09:16.543657 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7s7z\" (UniqueName: \"kubernetes.io/projected/1f93103b-ad0e-4911-a6a3-003be4b823f4-kube-api-access-z7s7z\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:16.543871 master-0 kubenswrapper[18707]: I0320 09:09:16.543668 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f93103b-ad0e-4911-a6a3-003be4b823f4-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:16.543871 master-0 kubenswrapper[18707]: I0320 
09:09:16.543678 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnflw\" (UniqueName: \"kubernetes.io/projected/c5f8abfc-e320-410d-bb6b-b5055c9fc454-kube-api-access-dnflw\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:16.787062 master-0 kubenswrapper[18707]: I0320 09:09:16.787008 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 09:09:16.787669 master-0 kubenswrapper[18707]: I0320 09:09:16.787622 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 09:09:16.789991 master-0 kubenswrapper[18707]: I0320 09:09:16.789965 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 09:09:17.071605 master-0 kubenswrapper[18707]: I0320 09:09:17.071549 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 09:09:17.166092 master-0 kubenswrapper[18707]: I0320 09:09:17.166032 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 09:09:17.166468 master-0 kubenswrapper[18707]: I0320 09:09:17.166427 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 09:09:17.168959 master-0 kubenswrapper[18707]: I0320 09:09:17.168913 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 09:09:17.317210 master-0 kubenswrapper[18707]: I0320 09:09:17.316088 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=15.442594551 podStartE2EDuration="21.316057262s" podCreationTimestamp="2026-03-20 09:08:56 +0000 UTC" firstStartedPulling="2026-03-20 09:08:58.296706815 +0000 UTC m=+1683.452887171" lastFinishedPulling="2026-03-20 09:09:04.170169526 +0000 UTC m=+1689.326349882" observedRunningTime="2026-03-20 
09:09:17.303012299 +0000 UTC m=+1702.459192685" watchObservedRunningTime="2026-03-20 09:09:17.316057262 +0000 UTC m=+1702.472237628" Mar 20 09:09:17.550110 master-0 kubenswrapper[18707]: I0320 09:09:17.550010 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=15.396732641 podStartE2EDuration="21.549985417s" podCreationTimestamp="2026-03-20 09:08:56 +0000 UTC" firstStartedPulling="2026-03-20 09:08:58.010698932 +0000 UTC m=+1683.166879288" lastFinishedPulling="2026-03-20 09:09:04.163951708 +0000 UTC m=+1689.320132064" observedRunningTime="2026-03-20 09:09:17.53645638 +0000 UTC m=+1702.692636756" watchObservedRunningTime="2026-03-20 09:09:17.549985417 +0000 UTC m=+1702.706165773" Mar 20 09:09:17.583481 master-0 kubenswrapper[18707]: I0320 09:09:17.583356 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=15.535173855 podStartE2EDuration="21.583331489s" podCreationTimestamp="2026-03-20 09:08:56 +0000 UTC" firstStartedPulling="2026-03-20 09:08:58.116366611 +0000 UTC m=+1683.272546967" lastFinishedPulling="2026-03-20 09:09:04.164524245 +0000 UTC m=+1689.320704601" observedRunningTime="2026-03-20 09:09:17.572563912 +0000 UTC m=+1702.728744268" watchObservedRunningTime="2026-03-20 09:09:17.583331489 +0000 UTC m=+1702.739511865" Mar 20 09:09:18.078909 master-0 kubenswrapper[18707]: I0320 09:09:18.078848 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 09:09:19.229269 master-0 kubenswrapper[18707]: I0320 09:09:19.229107 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 09:09:19.229869 master-0 kubenswrapper[18707]: I0320 09:09:19.229283 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 09:09:19.233649 master-0 kubenswrapper[18707]: I0320 09:09:19.233604 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 09:09:21.480176 master-0 kubenswrapper[18707]: I0320 09:09:21.480097 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-dq5f2" Mar 20 09:09:21.693205 master-0 kubenswrapper[18707]: I0320 09:09:21.693137 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-fr295" Mar 20 09:09:22.367202 master-0 kubenswrapper[18707]: I0320 09:09:22.366897 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nf9vn" Mar 20 09:09:23.646294 master-0 kubenswrapper[18707]: I0320 09:09:23.646232 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899s6vn4" Mar 20 09:09:29.234577 master-0 kubenswrapper[18707]: I0320 09:09:29.234494 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 20 09:09:31.834396 master-0 kubenswrapper[18707]: I0320 09:09:31.834331 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 09:09:31.834909 master-0 kubenswrapper[18707]: I0320 09:09:31.834566 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="63076b6f-15f6-4802-bdaa-12c3d504d239" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://94ddf8292a7594d86823c17e01a86c565ba77c45bc280d391d285c5d478688bb" 
gracePeriod=30 Mar 20 09:09:31.956101 master-0 kubenswrapper[18707]: I0320 09:09:31.955105 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:09:31.956101 master-0 kubenswrapper[18707]: I0320 09:09:31.955514 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" containerName="nova-metadata-log" containerID="cri-o://1aed233d8dd33f8e88f726b178e898cd6aa9c5d65d1656f00f71ca241561a719" gracePeriod=30 Mar 20 09:09:31.956417 master-0 kubenswrapper[18707]: I0320 09:09:31.956115 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" containerName="nova-metadata-metadata" containerID="cri-o://bab17a153100b5d15c2495320b986a4d103813d15314176f773c1ff3132efa42" gracePeriod=30 Mar 20 09:09:32.049256 master-0 kubenswrapper[18707]: I0320 09:09:32.039832 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 09:09:32.049256 master-0 kubenswrapper[18707]: E0320 09:09:32.040524 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f8abfc-e320-410d-bb6b-b5055c9fc454" containerName="nova-cell1-conductor-db-sync" Mar 20 09:09:32.049256 master-0 kubenswrapper[18707]: I0320 09:09:32.040543 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f8abfc-e320-410d-bb6b-b5055c9fc454" containerName="nova-cell1-conductor-db-sync" Mar 20 09:09:32.049256 master-0 kubenswrapper[18707]: E0320 09:09:32.040570 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f93103b-ad0e-4911-a6a3-003be4b823f4" containerName="nova-manage" Mar 20 09:09:32.049256 master-0 kubenswrapper[18707]: I0320 09:09:32.040579 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f93103b-ad0e-4911-a6a3-003be4b823f4" containerName="nova-manage" Mar 20 09:09:32.067297 master-0 
kubenswrapper[18707]: I0320 09:09:32.065506 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f93103b-ad0e-4911-a6a3-003be4b823f4" containerName="nova-manage" Mar 20 09:09:32.067297 master-0 kubenswrapper[18707]: I0320 09:09:32.065584 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f8abfc-e320-410d-bb6b-b5055c9fc454" containerName="nova-cell1-conductor-db-sync" Mar 20 09:09:32.102232 master-0 kubenswrapper[18707]: I0320 09:09:32.098657 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 09:09:32.104222 master-0 kubenswrapper[18707]: I0320 09:09:32.102539 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 09:09:32.138099 master-0 kubenswrapper[18707]: I0320 09:09:32.138042 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="63076b6f-15f6-4802-bdaa-12c3d504d239" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"http://10.128.0.252:6080/vnc_lite.html\": dial tcp 10.128.0.252:6080: connect: connection refused" Mar 20 09:09:32.138522 master-0 kubenswrapper[18707]: I0320 09:09:32.138347 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 09:09:32.168341 master-0 kubenswrapper[18707]: I0320 09:09:32.152271 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cd5f68b57-cbttm"] Mar 20 09:09:32.168341 master-0 kubenswrapper[18707]: I0320 09:09:32.152577 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" podUID="4428e0d9-0da9-4aa4-8422-8baa68054f53" containerName="dnsmasq-dns" containerID="cri-o://b50f09fc604465b033012e195b4d71b6f8106ade9dd4d689d929e810eae17345" gracePeriod=10 Mar 20 09:09:32.202256 master-0 kubenswrapper[18707]: I0320 09:09:32.198273 
18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7a89cf6-6326-4a53-b489-1b8c454df190-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c7a89cf6-6326-4a53-b489-1b8c454df190\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:09:32.202256 master-0 kubenswrapper[18707]: I0320 09:09:32.198338 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsnxr\" (UniqueName: \"kubernetes.io/projected/c7a89cf6-6326-4a53-b489-1b8c454df190-kube-api-access-zsnxr\") pod \"nova-cell1-conductor-0\" (UID: \"c7a89cf6-6326-4a53-b489-1b8c454df190\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:09:32.202256 master-0 kubenswrapper[18707]: I0320 09:09:32.198558 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a89cf6-6326-4a53-b489-1b8c454df190-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c7a89cf6-6326-4a53-b489-1b8c454df190\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:09:32.307249 master-0 kubenswrapper[18707]: I0320 09:09:32.300461 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a89cf6-6326-4a53-b489-1b8c454df190-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c7a89cf6-6326-4a53-b489-1b8c454df190\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:09:32.307249 master-0 kubenswrapper[18707]: I0320 09:09:32.300581 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7a89cf6-6326-4a53-b489-1b8c454df190-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c7a89cf6-6326-4a53-b489-1b8c454df190\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:09:32.307249 master-0 kubenswrapper[18707]: 
I0320 09:09:32.300606 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsnxr\" (UniqueName: \"kubernetes.io/projected/c7a89cf6-6326-4a53-b489-1b8c454df190-kube-api-access-zsnxr\") pod \"nova-cell1-conductor-0\" (UID: \"c7a89cf6-6326-4a53-b489-1b8c454df190\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:09:32.314347 master-0 kubenswrapper[18707]: I0320 09:09:32.313528 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a89cf6-6326-4a53-b489-1b8c454df190-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c7a89cf6-6326-4a53-b489-1b8c454df190\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:09:32.322034 master-0 kubenswrapper[18707]: I0320 09:09:32.320336 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7a89cf6-6326-4a53-b489-1b8c454df190-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c7a89cf6-6326-4a53-b489-1b8c454df190\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:09:32.365257 master-0 kubenswrapper[18707]: I0320 09:09:32.361403 18707 generic.go:334] "Generic (PLEG): container finished" podID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" containerID="1aed233d8dd33f8e88f726b178e898cd6aa9c5d65d1656f00f71ca241561a719" exitCode=143 Mar 20 09:09:32.365257 master-0 kubenswrapper[18707]: I0320 09:09:32.361456 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f123c6c-5e29-43af-9378-b7f8ee76b81c","Type":"ContainerDied","Data":"1aed233d8dd33f8e88f726b178e898cd6aa9c5d65d1656f00f71ca241561a719"} Mar 20 09:09:32.396047 master-0 kubenswrapper[18707]: I0320 09:09:32.393590 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsnxr\" (UniqueName: \"kubernetes.io/projected/c7a89cf6-6326-4a53-b489-1b8c454df190-kube-api-access-zsnxr\") pod 
\"nova-cell1-conductor-0\" (UID: \"c7a89cf6-6326-4a53-b489-1b8c454df190\") " pod="openstack/nova-cell1-conductor-0" Mar 20 09:09:32.421271 master-0 kubenswrapper[18707]: I0320 09:09:32.416557 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:09:32.421271 master-0 kubenswrapper[18707]: I0320 09:09:32.416807 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff5a931d-3951-482d-88a0-eefac071090a" containerName="nova-api-log" containerID="cri-o://9b66d44c659d2aeda44895720ee5568310584c0e8b0240ec21d2621b8e7d8f46" gracePeriod=30 Mar 20 09:09:32.421271 master-0 kubenswrapper[18707]: I0320 09:09:32.417420 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff5a931d-3951-482d-88a0-eefac071090a" containerName="nova-api-api" containerID="cri-o://b82cd12eb0349521e54c4271e337d2fa464ce2aca64ca51c414337a3711ae639" gracePeriod=30 Mar 20 09:09:32.462003 master-0 kubenswrapper[18707]: I0320 09:09:32.454263 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cc86b957c-nqjxf"] Mar 20 09:09:32.486823 master-0 kubenswrapper[18707]: I0320 09:09:32.480136 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.487140 master-0 kubenswrapper[18707]: I0320 09:09:32.487012 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc86b957c-nqjxf"] Mar 20 09:09:32.546657 master-0 kubenswrapper[18707]: I0320 09:09:32.520459 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 09:09:32.653892 master-0 kubenswrapper[18707]: I0320 09:09:32.653142 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-dns-svc\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.653892 master-0 kubenswrapper[18707]: I0320 09:09:32.653225 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-edpm-b\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.679237 master-0 kubenswrapper[18707]: I0320 09:09:32.673489 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.679237 master-0 kubenswrapper[18707]: I0320 09:09:32.673595 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.679237 master-0 kubenswrapper[18707]: I0320 09:09:32.673659 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-config\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.679237 master-0 kubenswrapper[18707]: I0320 09:09:32.674124 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-edpm-a\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.679237 master-0 kubenswrapper[18707]: I0320 09:09:32.674753 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.679237 master-0 kubenswrapper[18707]: I0320 09:09:32.674800 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr62h\" (UniqueName: \"kubernetes.io/projected/fba929f9-2dc3-42ed-a9d6-f8785399db16-kube-api-access-lr62h\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.860416 master-0 kubenswrapper[18707]: I0320 09:09:32.860365 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.861029 master-0 kubenswrapper[18707]: I0320 09:09:32.860460 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lr62h\" (UniqueName: \"kubernetes.io/projected/fba929f9-2dc3-42ed-a9d6-f8785399db16-kube-api-access-lr62h\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.861029 master-0 kubenswrapper[18707]: I0320 09:09:32.860633 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-dns-svc\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.861029 master-0 kubenswrapper[18707]: I0320 09:09:32.860652 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-edpm-b\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.861029 master-0 kubenswrapper[18707]: I0320 09:09:32.860814 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.861029 master-0 kubenswrapper[18707]: I0320 09:09:32.860856 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.861029 master-0 kubenswrapper[18707]: I0320 09:09:32.860896 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-config\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.863771 master-0 kubenswrapper[18707]: I0320 09:09:32.861326 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-edpm-a\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.863771 master-0 kubenswrapper[18707]: I0320 09:09:32.862287 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.863771 master-0 kubenswrapper[18707]: I0320 09:09:32.863369 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.863985 master-0 kubenswrapper[18707]: I0320 09:09:32.863955 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-edpm-b\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.866786 master-0 kubenswrapper[18707]: I0320 09:09:32.866353 18707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-edpm-a\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.867149 master-0 kubenswrapper[18707]: I0320 09:09:32.867057 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-config\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.875822 master-0 kubenswrapper[18707]: I0320 09:09:32.875758 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.876383 master-0 kubenswrapper[18707]: I0320 09:09:32.876355 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fba929f9-2dc3-42ed-a9d6-f8785399db16-dns-svc\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.937206 master-0 kubenswrapper[18707]: I0320 09:09:32.932090 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr62h\" (UniqueName: \"kubernetes.io/projected/fba929f9-2dc3-42ed-a9d6-f8785399db16-kube-api-access-lr62h\") pod \"dnsmasq-dns-5cc86b957c-nqjxf\" (UID: \"fba929f9-2dc3-42ed-a9d6-f8785399db16\") " pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:32.981227 master-0 kubenswrapper[18707]: I0320 09:09:32.974796 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:33.314414 master-0 kubenswrapper[18707]: E0320 09:09:33.305964 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4428e0d9_0da9_4aa4_8422_8baa68054f53.slice/crio-b50f09fc604465b033012e195b4d71b6f8106ade9dd4d689d929e810eae17345.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff5a931d_3951_482d_88a0_eefac071090a.slice/crio-9b66d44c659d2aeda44895720ee5568310584c0e8b0240ec21d2621b8e7d8f46.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4428e0d9_0da9_4aa4_8422_8baa68054f53.slice/crio-conmon-b50f09fc604465b033012e195b4d71b6f8106ade9dd4d689d929e810eae17345.scope\": RecentStats: unable to find data in memory cache]" Mar 20 09:09:33.436548 master-0 kubenswrapper[18707]: I0320 09:09:33.436480 18707 generic.go:334] "Generic (PLEG): container finished" podID="4428e0d9-0da9-4aa4-8422-8baa68054f53" containerID="b50f09fc604465b033012e195b4d71b6f8106ade9dd4d689d929e810eae17345" exitCode=0 Mar 20 09:09:33.436760 master-0 kubenswrapper[18707]: I0320 09:09:33.436588 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" event={"ID":"4428e0d9-0da9-4aa4-8422-8baa68054f53","Type":"ContainerDied","Data":"b50f09fc604465b033012e195b4d71b6f8106ade9dd4d689d929e810eae17345"} Mar 20 09:09:33.499063 master-0 kubenswrapper[18707]: I0320 09:09:33.498955 18707 generic.go:334] "Generic (PLEG): container finished" podID="ff5a931d-3951-482d-88a0-eefac071090a" containerID="9b66d44c659d2aeda44895720ee5568310584c0e8b0240ec21d2621b8e7d8f46" exitCode=143 Mar 20 09:09:33.499063 master-0 kubenswrapper[18707]: I0320 09:09:33.499050 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"ff5a931d-3951-482d-88a0-eefac071090a","Type":"ContainerDied","Data":"9b66d44c659d2aeda44895720ee5568310584c0e8b0240ec21d2621b8e7d8f46"} Mar 20 09:09:33.716633 master-0 kubenswrapper[18707]: I0320 09:09:33.711563 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 09:09:33.762768 master-0 kubenswrapper[18707]: I0320 09:09:33.761252 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:09:33.856129 master-0 kubenswrapper[18707]: I0320 09:09:33.855971 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-edpm-b\") pod \"4428e0d9-0da9-4aa4-8422-8baa68054f53\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " Mar 20 09:09:33.856129 master-0 kubenswrapper[18707]: I0320 09:09:33.856102 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-dns-swift-storage-0\") pod \"4428e0d9-0da9-4aa4-8422-8baa68054f53\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " Mar 20 09:09:33.856441 master-0 kubenswrapper[18707]: I0320 09:09:33.856211 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-edpm-a\") pod \"4428e0d9-0da9-4aa4-8422-8baa68054f53\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " Mar 20 09:09:33.856441 master-0 kubenswrapper[18707]: I0320 09:09:33.856263 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-ovsdbserver-sb\") pod \"4428e0d9-0da9-4aa4-8422-8baa68054f53\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") 
" Mar 20 09:09:33.856441 master-0 kubenswrapper[18707]: I0320 09:09:33.856306 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz429\" (UniqueName: \"kubernetes.io/projected/4428e0d9-0da9-4aa4-8422-8baa68054f53-kube-api-access-lz429\") pod \"4428e0d9-0da9-4aa4-8422-8baa68054f53\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " Mar 20 09:09:33.856441 master-0 kubenswrapper[18707]: I0320 09:09:33.856380 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-ovsdbserver-nb\") pod \"4428e0d9-0da9-4aa4-8422-8baa68054f53\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " Mar 20 09:09:33.856441 master-0 kubenswrapper[18707]: I0320 09:09:33.856435 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-config\") pod \"4428e0d9-0da9-4aa4-8422-8baa68054f53\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " Mar 20 09:09:33.856630 master-0 kubenswrapper[18707]: I0320 09:09:33.856534 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-dns-svc\") pod \"4428e0d9-0da9-4aa4-8422-8baa68054f53\" (UID: \"4428e0d9-0da9-4aa4-8422-8baa68054f53\") " Mar 20 09:09:33.868448 master-0 kubenswrapper[18707]: I0320 09:09:33.864805 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4428e0d9-0da9-4aa4-8422-8baa68054f53-kube-api-access-lz429" (OuterVolumeSpecName: "kube-api-access-lz429") pod "4428e0d9-0da9-4aa4-8422-8baa68054f53" (UID: "4428e0d9-0da9-4aa4-8422-8baa68054f53"). InnerVolumeSpecName "kube-api-access-lz429". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:09:33.965446 master-0 kubenswrapper[18707]: I0320 09:09:33.960316 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz429\" (UniqueName: \"kubernetes.io/projected/4428e0d9-0da9-4aa4-8422-8baa68054f53-kube-api-access-lz429\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:34.059636 master-0 kubenswrapper[18707]: I0320 09:09:34.058584 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4428e0d9-0da9-4aa4-8422-8baa68054f53" (UID: "4428e0d9-0da9-4aa4-8422-8baa68054f53"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:34.064955 master-0 kubenswrapper[18707]: I0320 09:09:34.064209 18707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:34.066740 master-0 kubenswrapper[18707]: I0320 09:09:34.066571 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "4428e0d9-0da9-4aa4-8422-8baa68054f53" (UID: "4428e0d9-0da9-4aa4-8422-8baa68054f53"). InnerVolumeSpecName "edpm-a". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:34.072823 master-0 kubenswrapper[18707]: I0320 09:09:34.069601 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-edpm-b" (OuterVolumeSpecName: "edpm-b") pod "4428e0d9-0da9-4aa4-8422-8baa68054f53" (UID: "4428e0d9-0da9-4aa4-8422-8baa68054f53"). InnerVolumeSpecName "edpm-b". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:34.098379 master-0 kubenswrapper[18707]: I0320 09:09:34.074833 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4428e0d9-0da9-4aa4-8422-8baa68054f53" (UID: "4428e0d9-0da9-4aa4-8422-8baa68054f53"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:34.107351 master-0 kubenswrapper[18707]: I0320 09:09:34.099674 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4428e0d9-0da9-4aa4-8422-8baa68054f53" (UID: "4428e0d9-0da9-4aa4-8422-8baa68054f53"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:34.133805 master-0 kubenswrapper[18707]: I0320 09:09:34.129883 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc86b957c-nqjxf"] Mar 20 09:09:34.169059 master-0 kubenswrapper[18707]: I0320 09:09:34.168998 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:34.169494 master-0 kubenswrapper[18707]: I0320 09:09:34.169417 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:34.169494 master-0 kubenswrapper[18707]: I0320 09:09:34.169459 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:34.169494 master-0 kubenswrapper[18707]: I0320 
09:09:34.169471 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:34.169732 master-0 kubenswrapper[18707]: I0320 09:09:34.169625 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4428e0d9-0da9-4aa4-8422-8baa68054f53" (UID: "4428e0d9-0da9-4aa4-8422-8baa68054f53"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:34.173504 master-0 kubenswrapper[18707]: I0320 09:09:34.173416 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:09:34.173809 master-0 kubenswrapper[18707]: I0320 09:09:34.173732 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="21d04a7e-7a88-496f-b2ed-65183508c0af" containerName="nova-scheduler-scheduler" containerID="cri-o://c0a20a321758b89a7197046f590fbf478d8bb8ff29a8be0633a162d93309aede" gracePeriod=30 Mar 20 09:09:34.189934 master-0 kubenswrapper[18707]: I0320 09:09:34.189861 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-config" (OuterVolumeSpecName: "config") pod "4428e0d9-0da9-4aa4-8422-8baa68054f53" (UID: "4428e0d9-0da9-4aa4-8422-8baa68054f53"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:34.271951 master-0 kubenswrapper[18707]: I0320 09:09:34.271915 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:34.272725 master-0 kubenswrapper[18707]: I0320 09:09:34.272707 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4428e0d9-0da9-4aa4-8422-8baa68054f53-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:34.382429 master-0 kubenswrapper[18707]: I0320 09:09:34.382375 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:34.478656 master-0 kubenswrapper[18707]: I0320 09:09:34.478528 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63076b6f-15f6-4802-bdaa-12c3d504d239-config-data\") pod \"63076b6f-15f6-4802-bdaa-12c3d504d239\" (UID: \"63076b6f-15f6-4802-bdaa-12c3d504d239\") " Mar 20 09:09:34.478656 master-0 kubenswrapper[18707]: I0320 09:09:34.478612 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63076b6f-15f6-4802-bdaa-12c3d504d239-combined-ca-bundle\") pod \"63076b6f-15f6-4802-bdaa-12c3d504d239\" (UID: \"63076b6f-15f6-4802-bdaa-12c3d504d239\") " Mar 20 09:09:34.478975 master-0 kubenswrapper[18707]: I0320 09:09:34.478811 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htdr8\" (UniqueName: \"kubernetes.io/projected/63076b6f-15f6-4802-bdaa-12c3d504d239-kube-api-access-htdr8\") pod \"63076b6f-15f6-4802-bdaa-12c3d504d239\" (UID: \"63076b6f-15f6-4802-bdaa-12c3d504d239\") " Mar 20 09:09:34.487731 master-0 kubenswrapper[18707]: I0320 
09:09:34.487494 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63076b6f-15f6-4802-bdaa-12c3d504d239-kube-api-access-htdr8" (OuterVolumeSpecName: "kube-api-access-htdr8") pod "63076b6f-15f6-4802-bdaa-12c3d504d239" (UID: "63076b6f-15f6-4802-bdaa-12c3d504d239"). InnerVolumeSpecName "kube-api-access-htdr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:09:34.511624 master-0 kubenswrapper[18707]: I0320 09:09:34.511374 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63076b6f-15f6-4802-bdaa-12c3d504d239-config-data" (OuterVolumeSpecName: "config-data") pod "63076b6f-15f6-4802-bdaa-12c3d504d239" (UID: "63076b6f-15f6-4802-bdaa-12c3d504d239"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:34.513261 master-0 kubenswrapper[18707]: I0320 09:09:34.513170 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" event={"ID":"fba929f9-2dc3-42ed-a9d6-f8785399db16","Type":"ContainerStarted","Data":"3ea1aafc15e91b883f5789a5e473f9da1f848cbd45d3c9174f4efc500c84987b"} Mar 20 09:09:34.515520 master-0 kubenswrapper[18707]: I0320 09:09:34.514553 18707 generic.go:334] "Generic (PLEG): container finished" podID="63076b6f-15f6-4802-bdaa-12c3d504d239" containerID="94ddf8292a7594d86823c17e01a86c565ba77c45bc280d391d285c5d478688bb" exitCode=0 Mar 20 09:09:34.515520 master-0 kubenswrapper[18707]: I0320 09:09:34.514603 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"63076b6f-15f6-4802-bdaa-12c3d504d239","Type":"ContainerDied","Data":"94ddf8292a7594d86823c17e01a86c565ba77c45bc280d391d285c5d478688bb"} Mar 20 09:09:34.515520 master-0 kubenswrapper[18707]: I0320 09:09:34.514623 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"63076b6f-15f6-4802-bdaa-12c3d504d239","Type":"ContainerDied","Data":"a565abb05da1fa95796369f48ed3181a754751b03d1b5568c51f5655146f88bd"} Mar 20 09:09:34.515520 master-0 kubenswrapper[18707]: I0320 09:09:34.514647 18707 scope.go:117] "RemoveContainer" containerID="94ddf8292a7594d86823c17e01a86c565ba77c45bc280d391d285c5d478688bb" Mar 20 09:09:34.515520 master-0 kubenswrapper[18707]: I0320 09:09:34.514790 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:34.518337 master-0 kubenswrapper[18707]: I0320 09:09:34.518268 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" event={"ID":"4428e0d9-0da9-4aa4-8422-8baa68054f53","Type":"ContainerDied","Data":"064a3ef553ed7694a421ad2695f971db9b8daecf098d7155dfed831c8060c5a9"} Mar 20 09:09:34.519477 master-0 kubenswrapper[18707]: I0320 09:09:34.519431 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" Mar 20 09:09:34.520176 master-0 kubenswrapper[18707]: I0320 09:09:34.520138 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c7a89cf6-6326-4a53-b489-1b8c454df190","Type":"ContainerStarted","Data":"3d979f745d295698e8ba71b6b47cd5a38f9a9dd14e312565cf28aa8f8cc77c78"} Mar 20 09:09:34.520279 master-0 kubenswrapper[18707]: I0320 09:09:34.520202 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c7a89cf6-6326-4a53-b489-1b8c454df190","Type":"ContainerStarted","Data":"5dfd788537a8d10e2298f70a6a074195e0a86af0677b882a6a5c817f6ddb9958"} Mar 20 09:09:34.520477 master-0 kubenswrapper[18707]: I0320 09:09:34.520404 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 09:09:34.520742 master-0 kubenswrapper[18707]: I0320 09:09:34.520688 18707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63076b6f-15f6-4802-bdaa-12c3d504d239-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63076b6f-15f6-4802-bdaa-12c3d504d239" (UID: "63076b6f-15f6-4802-bdaa-12c3d504d239"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:34.567485 master-0 kubenswrapper[18707]: I0320 09:09:34.567442 18707 scope.go:117] "RemoveContainer" containerID="94ddf8292a7594d86823c17e01a86c565ba77c45bc280d391d285c5d478688bb" Mar 20 09:09:34.567862 master-0 kubenswrapper[18707]: E0320 09:09:34.567834 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94ddf8292a7594d86823c17e01a86c565ba77c45bc280d391d285c5d478688bb\": container with ID starting with 94ddf8292a7594d86823c17e01a86c565ba77c45bc280d391d285c5d478688bb not found: ID does not exist" containerID="94ddf8292a7594d86823c17e01a86c565ba77c45bc280d391d285c5d478688bb" Mar 20 09:09:34.567947 master-0 kubenswrapper[18707]: I0320 09:09:34.567863 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ddf8292a7594d86823c17e01a86c565ba77c45bc280d391d285c5d478688bb"} err="failed to get container status \"94ddf8292a7594d86823c17e01a86c565ba77c45bc280d391d285c5d478688bb\": rpc error: code = NotFound desc = could not find container \"94ddf8292a7594d86823c17e01a86c565ba77c45bc280d391d285c5d478688bb\": container with ID starting with 94ddf8292a7594d86823c17e01a86c565ba77c45bc280d391d285c5d478688bb not found: ID does not exist" Mar 20 09:09:34.567947 master-0 kubenswrapper[18707]: I0320 09:09:34.567887 18707 scope.go:117] "RemoveContainer" containerID="b50f09fc604465b033012e195b4d71b6f8106ade9dd4d689d929e810eae17345" Mar 20 09:09:34.583899 master-0 kubenswrapper[18707]: I0320 09:09:34.583846 18707 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-htdr8\" (UniqueName: \"kubernetes.io/projected/63076b6f-15f6-4802-bdaa-12c3d504d239-kube-api-access-htdr8\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:34.584219 master-0 kubenswrapper[18707]: I0320 09:09:34.584144 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63076b6f-15f6-4802-bdaa-12c3d504d239-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:34.584219 master-0 kubenswrapper[18707]: I0320 09:09:34.584164 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63076b6f-15f6-4802-bdaa-12c3d504d239-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:34.619427 master-0 kubenswrapper[18707]: I0320 09:09:34.619392 18707 scope.go:117] "RemoveContainer" containerID="83d1b48c509ce4f31cd1091e9d10c0ec9cca32614658750f6cffebdf56b92619" Mar 20 09:09:34.949167 master-0 kubenswrapper[18707]: I0320 09:09:34.949050 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cd5f68b57-cbttm"] Mar 20 09:09:35.118343 master-0 kubenswrapper[18707]: I0320 09:09:35.118212 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cd5f68b57-cbttm"] Mar 20 09:09:35.334663 master-0 kubenswrapper[18707]: I0320 09:09:35.334563 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=4.334541561 podStartE2EDuration="4.334541561s" podCreationTimestamp="2026-03-20 09:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:09:35.321625606 +0000 UTC m=+1720.477805992" watchObservedRunningTime="2026-03-20 09:09:35.334541561 +0000 UTC m=+1720.490721917" Mar 20 09:09:35.478035 master-0 kubenswrapper[18707]: I0320 09:09:35.477964 18707 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/nova-metadata-0" podUID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.128.0.254:8775/\": read tcp 10.128.0.2:43860->10.128.0.254:8775: read: connection reset by peer" Mar 20 09:09:35.478337 master-0 kubenswrapper[18707]: I0320 09:09:35.478306 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.128.0.254:8775/\": read tcp 10.128.0.2:43876->10.128.0.254:8775: read: connection reset by peer" Mar 20 09:09:35.539384 master-0 kubenswrapper[18707]: I0320 09:09:35.539300 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" event={"ID":"fba929f9-2dc3-42ed-a9d6-f8785399db16","Type":"ContainerDied","Data":"27d52f72e4e3015e841caf6eb399de2f3e315ba1e3eaa0c1988e9e5961bef9c5"} Mar 20 09:09:35.540643 master-0 kubenswrapper[18707]: I0320 09:09:35.539169 18707 generic.go:334] "Generic (PLEG): container finished" podID="fba929f9-2dc3-42ed-a9d6-f8785399db16" containerID="27d52f72e4e3015e841caf6eb399de2f3e315ba1e3eaa0c1988e9e5961bef9c5" exitCode=0 Mar 20 09:09:35.541242 master-0 kubenswrapper[18707]: I0320 09:09:35.541200 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 09:09:35.645670 master-0 kubenswrapper[18707]: I0320 09:09:35.645387 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 09:09:36.084624 master-0 kubenswrapper[18707]: I0320 09:09:36.084542 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="ff5a931d-3951-482d-88a0-eefac071090a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.0.251:8774/\": read tcp 10.128.0.2:41056->10.128.0.251:8774: read: connection reset by peer" Mar 20 09:09:36.085119 
master-0 kubenswrapper[18707]: I0320 09:09:36.084759 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="ff5a931d-3951-482d-88a0-eefac071090a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.0.251:8774/\": read tcp 10.128.0.2:41068->10.128.0.251:8774: read: connection reset by peer" Mar 20 09:09:36.567288 master-0 kubenswrapper[18707]: I0320 09:09:36.567217 18707 generic.go:334] "Generic (PLEG): container finished" podID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" containerID="bab17a153100b5d15c2495320b986a4d103813d15314176f773c1ff3132efa42" exitCode=0 Mar 20 09:09:36.567538 master-0 kubenswrapper[18707]: I0320 09:09:36.567311 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f123c6c-5e29-43af-9378-b7f8ee76b81c","Type":"ContainerDied","Data":"bab17a153100b5d15c2495320b986a4d103813d15314176f773c1ff3132efa42"} Mar 20 09:09:36.569706 master-0 kubenswrapper[18707]: I0320 09:09:36.569662 18707 generic.go:334] "Generic (PLEG): container finished" podID="ff5a931d-3951-482d-88a0-eefac071090a" containerID="b82cd12eb0349521e54c4271e337d2fa464ce2aca64ca51c414337a3711ae639" exitCode=0 Mar 20 09:09:36.569779 master-0 kubenswrapper[18707]: I0320 09:09:36.569724 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff5a931d-3951-482d-88a0-eefac071090a","Type":"ContainerDied","Data":"b82cd12eb0349521e54c4271e337d2fa464ce2aca64ca51c414337a3711ae639"} Mar 20 09:09:36.801313 master-0 kubenswrapper[18707]: I0320 09:09:36.801256 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:09:36.960096 master-0 kubenswrapper[18707]: I0320 09:09:36.960021 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f123c6c-5e29-43af-9378-b7f8ee76b81c-config-data\") pod \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " Mar 20 09:09:36.960096 master-0 kubenswrapper[18707]: I0320 09:09:36.960083 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5t2r\" (UniqueName: \"kubernetes.io/projected/4f123c6c-5e29-43af-9378-b7f8ee76b81c-kube-api-access-r5t2r\") pod \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " Mar 20 09:09:36.960342 master-0 kubenswrapper[18707]: I0320 09:09:36.960122 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f123c6c-5e29-43af-9378-b7f8ee76b81c-combined-ca-bundle\") pod \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " Mar 20 09:09:36.960342 master-0 kubenswrapper[18707]: I0320 09:09:36.960292 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f123c6c-5e29-43af-9378-b7f8ee76b81c-logs\") pod \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\" (UID: \"4f123c6c-5e29-43af-9378-b7f8ee76b81c\") " Mar 20 09:09:36.961349 master-0 kubenswrapper[18707]: I0320 09:09:36.961310 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f123c6c-5e29-43af-9378-b7f8ee76b81c-logs" (OuterVolumeSpecName: "logs") pod "4f123c6c-5e29-43af-9378-b7f8ee76b81c" (UID: "4f123c6c-5e29-43af-9378-b7f8ee76b81c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:09:36.965738 master-0 kubenswrapper[18707]: I0320 09:09:36.965701 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f123c6c-5e29-43af-9378-b7f8ee76b81c-kube-api-access-r5t2r" (OuterVolumeSpecName: "kube-api-access-r5t2r") pod "4f123c6c-5e29-43af-9378-b7f8ee76b81c" (UID: "4f123c6c-5e29-43af-9378-b7f8ee76b81c"). InnerVolumeSpecName "kube-api-access-r5t2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:09:36.994095 master-0 kubenswrapper[18707]: I0320 09:09:36.994043 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f123c6c-5e29-43af-9378-b7f8ee76b81c-config-data" (OuterVolumeSpecName: "config-data") pod "4f123c6c-5e29-43af-9378-b7f8ee76b81c" (UID: "4f123c6c-5e29-43af-9378-b7f8ee76b81c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:37.046134 master-0 kubenswrapper[18707]: I0320 09:09:37.046072 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f123c6c-5e29-43af-9378-b7f8ee76b81c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f123c6c-5e29-43af-9378-b7f8ee76b81c" (UID: "4f123c6c-5e29-43af-9378-b7f8ee76b81c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:37.065529 master-0 kubenswrapper[18707]: I0320 09:09:37.065393 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f123c6c-5e29-43af-9378-b7f8ee76b81c-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:37.065529 master-0 kubenswrapper[18707]: I0320 09:09:37.065461 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5t2r\" (UniqueName: \"kubernetes.io/projected/4f123c6c-5e29-43af-9378-b7f8ee76b81c-kube-api-access-r5t2r\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:37.065529 master-0 kubenswrapper[18707]: I0320 09:09:37.065479 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f123c6c-5e29-43af-9378-b7f8ee76b81c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:37.065529 master-0 kubenswrapper[18707]: I0320 09:09:37.065491 18707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f123c6c-5e29-43af-9378-b7f8ee76b81c-logs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:37.110825 master-0 kubenswrapper[18707]: I0320 09:09:37.110731 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4428e0d9-0da9-4aa4-8422-8baa68054f53" path="/var/lib/kubelet/pods/4428e0d9-0da9-4aa4-8422-8baa68054f53/volumes" Mar 20 09:09:37.111757 master-0 kubenswrapper[18707]: I0320 09:09:37.111716 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63076b6f-15f6-4802-bdaa-12c3d504d239" path="/var/lib/kubelet/pods/63076b6f-15f6-4802-bdaa-12c3d504d239/volumes" Mar 20 09:09:37.268699 master-0 kubenswrapper[18707]: E0320 09:09:37.268608 18707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
c0a20a321758b89a7197046f590fbf478d8bb8ff29a8be0633a162d93309aede is running failed: container process not found" containerID="c0a20a321758b89a7197046f590fbf478d8bb8ff29a8be0633a162d93309aede" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 09:09:37.269178 master-0 kubenswrapper[18707]: E0320 09:09:37.269151 18707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0a20a321758b89a7197046f590fbf478d8bb8ff29a8be0633a162d93309aede is running failed: container process not found" containerID="c0a20a321758b89a7197046f590fbf478d8bb8ff29a8be0633a162d93309aede" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 09:09:37.269504 master-0 kubenswrapper[18707]: E0320 09:09:37.269467 18707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0a20a321758b89a7197046f590fbf478d8bb8ff29a8be0633a162d93309aede is running failed: container process not found" containerID="c0a20a321758b89a7197046f590fbf478d8bb8ff29a8be0633a162d93309aede" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 09:09:37.269556 master-0 kubenswrapper[18707]: E0320 09:09:37.269508 18707 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c0a20a321758b89a7197046f590fbf478d8bb8ff29a8be0633a162d93309aede is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="21d04a7e-7a88-496f-b2ed-65183508c0af" containerName="nova-scheduler-scheduler" Mar 20 09:09:37.618399 master-0 kubenswrapper[18707]: I0320 09:09:37.617467 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f123c6c-5e29-43af-9378-b7f8ee76b81c","Type":"ContainerDied","Data":"2ac4bf6a41d2e3a09e8eae5764b40c29c2aa296ebeb97630e3f4c5e3833809bb"} Mar 20 09:09:37.618399 
master-0 kubenswrapper[18707]: I0320 09:09:37.617541 18707 scope.go:117] "RemoveContainer" containerID="bab17a153100b5d15c2495320b986a4d103813d15314176f773c1ff3132efa42" Mar 20 09:09:37.618399 master-0 kubenswrapper[18707]: I0320 09:09:37.617697 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:09:37.644895 master-0 kubenswrapper[18707]: I0320 09:09:37.642592 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" event={"ID":"fba929f9-2dc3-42ed-a9d6-f8785399db16","Type":"ContainerStarted","Data":"d1948cd025b36662905be18dbed106a1b802d859eb449c96f006a262da56b413"} Mar 20 09:09:37.644895 master-0 kubenswrapper[18707]: I0320 09:09:37.643854 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:37.648963 master-0 kubenswrapper[18707]: I0320 09:09:37.647642 18707 generic.go:334] "Generic (PLEG): container finished" podID="21d04a7e-7a88-496f-b2ed-65183508c0af" containerID="c0a20a321758b89a7197046f590fbf478d8bb8ff29a8be0633a162d93309aede" exitCode=0 Mar 20 09:09:37.648963 master-0 kubenswrapper[18707]: I0320 09:09:37.647683 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"21d04a7e-7a88-496f-b2ed-65183508c0af","Type":"ContainerDied","Data":"c0a20a321758b89a7197046f590fbf478d8bb8ff29a8be0633a162d93309aede"} Mar 20 09:09:37.673560 master-0 kubenswrapper[18707]: I0320 09:09:37.673492 18707 scope.go:117] "RemoveContainer" containerID="1aed233d8dd33f8e88f726b178e898cd6aa9c5d65d1656f00f71ca241561a719" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: I0320 09:09:37.785443 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: E0320 09:09:37.787448 18707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="63076b6f-15f6-4802-bdaa-12c3d504d239" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: I0320 09:09:37.787472 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="63076b6f-15f6-4802-bdaa-12c3d504d239" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: E0320 09:09:37.787486 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" containerName="nova-metadata-log" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: I0320 09:09:37.787493 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" containerName="nova-metadata-log" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: E0320 09:09:37.787540 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4428e0d9-0da9-4aa4-8422-8baa68054f53" containerName="init" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: I0320 09:09:37.787549 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4428e0d9-0da9-4aa4-8422-8baa68054f53" containerName="init" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: E0320 09:09:37.787564 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4428e0d9-0da9-4aa4-8422-8baa68054f53" containerName="dnsmasq-dns" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: I0320 09:09:37.787571 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4428e0d9-0da9-4aa4-8422-8baa68054f53" containerName="dnsmasq-dns" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: E0320 09:09:37.787610 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" containerName="nova-metadata-metadata" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: I0320 09:09:37.787618 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" 
containerName="nova-metadata-metadata" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: I0320 09:09:37.787893 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" containerName="nova-metadata-log" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: I0320 09:09:37.787917 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" containerName="nova-metadata-metadata" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: I0320 09:09:37.787948 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4428e0d9-0da9-4aa4-8422-8baa68054f53" containerName="dnsmasq-dns" Mar 20 09:09:37.788346 master-0 kubenswrapper[18707]: I0320 09:09:37.787963 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="63076b6f-15f6-4802-bdaa-12c3d504d239" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 09:09:37.793144 master-0 kubenswrapper[18707]: I0320 09:09:37.790446 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:37.799206 master-0 kubenswrapper[18707]: I0320 09:09:37.795563 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 09:09:37.799206 master-0 kubenswrapper[18707]: I0320 09:09:37.795993 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 09:09:37.799206 master-0 kubenswrapper[18707]: I0320 09:09:37.796275 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 09:09:37.906920 master-0 kubenswrapper[18707]: I0320 09:09:37.906879 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxjwd\" (UniqueName: \"kubernetes.io/projected/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-kube-api-access-kxjwd\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:37.909337 master-0 kubenswrapper[18707]: I0320 09:09:37.909220 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:37.909337 master-0 kubenswrapper[18707]: I0320 09:09:37.909290 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:37.909437 master-0 kubenswrapper[18707]: I0320 09:09:37.909356 
18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:37.909854 master-0 kubenswrapper[18707]: I0320 09:09:37.909826 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:37.978462 master-0 kubenswrapper[18707]: I0320 09:09:37.976160 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 09:09:38.012883 master-0 kubenswrapper[18707]: I0320 09:09:38.012799 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:38.013321 master-0 kubenswrapper[18707]: I0320 09:09:38.012912 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxjwd\" (UniqueName: \"kubernetes.io/projected/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-kube-api-access-kxjwd\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:38.013321 master-0 kubenswrapper[18707]: I0320 09:09:38.012956 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-vencrypt-tls-certs\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:38.013321 master-0 kubenswrapper[18707]: I0320 09:09:38.012980 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:38.013321 master-0 kubenswrapper[18707]: I0320 09:09:38.013008 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:38.016368 master-0 kubenswrapper[18707]: I0320 09:09:38.016334 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:38.017211 master-0 kubenswrapper[18707]: I0320 09:09:38.017155 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:38.017899 master-0 kubenswrapper[18707]: I0320 09:09:38.017834 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:38.019667 master-0 kubenswrapper[18707]: I0320 09:09:38.019631 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:38.069364 master-0 kubenswrapper[18707]: I0320 09:09:38.069299 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:09:38.075063 master-0 kubenswrapper[18707]: I0320 09:09:38.075015 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:09:38.106794 master-0 kubenswrapper[18707]: I0320 09:09:38.106724 18707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7cd5f68b57-cbttm" podUID="4428e0d9-0da9-4aa4-8422-8baa68054f53" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.227:5353: i/o timeout" Mar 20 09:09:38.215900 master-0 kubenswrapper[18707]: I0320 09:09:38.215752 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5a931d-3951-482d-88a0-eefac071090a-config-data\") pod \"ff5a931d-3951-482d-88a0-eefac071090a\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " Mar 20 09:09:38.215900 master-0 kubenswrapper[18707]: I0320 09:09:38.215845 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d04a7e-7a88-496f-b2ed-65183508c0af-combined-ca-bundle\") pod \"21d04a7e-7a88-496f-b2ed-65183508c0af\" (UID: \"21d04a7e-7a88-496f-b2ed-65183508c0af\") " Mar 20 09:09:38.216634 master-0 kubenswrapper[18707]: I0320 09:09:38.215977 18707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8fn8\" (UniqueName: \"kubernetes.io/projected/ff5a931d-3951-482d-88a0-eefac071090a-kube-api-access-t8fn8\") pod \"ff5a931d-3951-482d-88a0-eefac071090a\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " Mar 20 09:09:38.216634 master-0 kubenswrapper[18707]: I0320 09:09:38.216157 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d04a7e-7a88-496f-b2ed-65183508c0af-config-data\") pod \"21d04a7e-7a88-496f-b2ed-65183508c0af\" (UID: \"21d04a7e-7a88-496f-b2ed-65183508c0af\") " Mar 20 09:09:38.216634 master-0 kubenswrapper[18707]: I0320 09:09:38.216361 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nd7c\" (UniqueName: \"kubernetes.io/projected/21d04a7e-7a88-496f-b2ed-65183508c0af-kube-api-access-9nd7c\") pod \"21d04a7e-7a88-496f-b2ed-65183508c0af\" (UID: \"21d04a7e-7a88-496f-b2ed-65183508c0af\") " Mar 20 09:09:38.216634 master-0 kubenswrapper[18707]: I0320 09:09:38.216477 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff5a931d-3951-482d-88a0-eefac071090a-logs\") pod \"ff5a931d-3951-482d-88a0-eefac071090a\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " Mar 20 09:09:38.216634 master-0 kubenswrapper[18707]: I0320 09:09:38.216546 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5a931d-3951-482d-88a0-eefac071090a-combined-ca-bundle\") pod \"ff5a931d-3951-482d-88a0-eefac071090a\" (UID: \"ff5a931d-3951-482d-88a0-eefac071090a\") " Mar 20 09:09:38.219962 master-0 kubenswrapper[18707]: I0320 09:09:38.219880 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5a931d-3951-482d-88a0-eefac071090a-logs" 
(OuterVolumeSpecName: "logs") pod "ff5a931d-3951-482d-88a0-eefac071090a" (UID: "ff5a931d-3951-482d-88a0-eefac071090a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:09:38.221640 master-0 kubenswrapper[18707]: I0320 09:09:38.221590 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d04a7e-7a88-496f-b2ed-65183508c0af-kube-api-access-9nd7c" (OuterVolumeSpecName: "kube-api-access-9nd7c") pod "21d04a7e-7a88-496f-b2ed-65183508c0af" (UID: "21d04a7e-7a88-496f-b2ed-65183508c0af"). InnerVolumeSpecName "kube-api-access-9nd7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:09:38.222632 master-0 kubenswrapper[18707]: I0320 09:09:38.222584 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5a931d-3951-482d-88a0-eefac071090a-kube-api-access-t8fn8" (OuterVolumeSpecName: "kube-api-access-t8fn8") pod "ff5a931d-3951-482d-88a0-eefac071090a" (UID: "ff5a931d-3951-482d-88a0-eefac071090a"). InnerVolumeSpecName "kube-api-access-t8fn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:09:38.245296 master-0 kubenswrapper[18707]: I0320 09:09:38.245211 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5a931d-3951-482d-88a0-eefac071090a-config-data" (OuterVolumeSpecName: "config-data") pod "ff5a931d-3951-482d-88a0-eefac071090a" (UID: "ff5a931d-3951-482d-88a0-eefac071090a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:38.246050 master-0 kubenswrapper[18707]: I0320 09:09:38.245979 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d04a7e-7a88-496f-b2ed-65183508c0af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21d04a7e-7a88-496f-b2ed-65183508c0af" (UID: "21d04a7e-7a88-496f-b2ed-65183508c0af"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:38.247308 master-0 kubenswrapper[18707]: I0320 09:09:38.247145 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5a931d-3951-482d-88a0-eefac071090a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff5a931d-3951-482d-88a0-eefac071090a" (UID: "ff5a931d-3951-482d-88a0-eefac071090a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:38.254393 master-0 kubenswrapper[18707]: I0320 09:09:38.254341 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d04a7e-7a88-496f-b2ed-65183508c0af-config-data" (OuterVolumeSpecName: "config-data") pod "21d04a7e-7a88-496f-b2ed-65183508c0af" (UID: "21d04a7e-7a88-496f-b2ed-65183508c0af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:09:38.323533 master-0 kubenswrapper[18707]: I0320 09:09:38.323460 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5a931d-3951-482d-88a0-eefac071090a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:38.323917 master-0 kubenswrapper[18707]: I0320 09:09:38.323903 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5a931d-3951-482d-88a0-eefac071090a-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:38.324023 master-0 kubenswrapper[18707]: I0320 09:09:38.324010 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d04a7e-7a88-496f-b2ed-65183508c0af-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:38.324139 master-0 kubenswrapper[18707]: I0320 09:09:38.324122 18707 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-t8fn8\" (UniqueName: \"kubernetes.io/projected/ff5a931d-3951-482d-88a0-eefac071090a-kube-api-access-t8fn8\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:38.324262 master-0 kubenswrapper[18707]: I0320 09:09:38.324248 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21d04a7e-7a88-496f-b2ed-65183508c0af-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:38.324368 master-0 kubenswrapper[18707]: I0320 09:09:38.324354 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nd7c\" (UniqueName: \"kubernetes.io/projected/21d04a7e-7a88-496f-b2ed-65183508c0af-kube-api-access-9nd7c\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:38.324450 master-0 kubenswrapper[18707]: I0320 09:09:38.324437 18707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff5a931d-3951-482d-88a0-eefac071090a-logs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:38.577799 master-0 kubenswrapper[18707]: I0320 09:09:38.577734 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxjwd\" (UniqueName: \"kubernetes.io/projected/1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a-kube-api-access-kxjwd\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:38.661697 master-0 kubenswrapper[18707]: I0320 09:09:38.661586 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"21d04a7e-7a88-496f-b2ed-65183508c0af","Type":"ContainerDied","Data":"1dc7e271690697646640142269b3ddbfea37abb8641f1e4a912389d098f4a1e3"} Mar 20 09:09:38.661697 master-0 kubenswrapper[18707]: I0320 09:09:38.661714 18707 scope.go:117] "RemoveContainer" containerID="c0a20a321758b89a7197046f590fbf478d8bb8ff29a8be0633a162d93309aede" Mar 20 09:09:38.662416 master-0 kubenswrapper[18707]: I0320 09:09:38.662341 18707 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:09:38.665498 master-0 kubenswrapper[18707]: I0320 09:09:38.665440 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff5a931d-3951-482d-88a0-eefac071090a","Type":"ContainerDied","Data":"7cba5c4f3a057c2ea26d1dbec2730e5d9fc72d0f59d4817550c3702d5952856f"} Mar 20 09:09:38.665498 master-0 kubenswrapper[18707]: I0320 09:09:38.665489 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:09:38.687776 master-0 kubenswrapper[18707]: I0320 09:09:38.685790 18707 scope.go:117] "RemoveContainer" containerID="b82cd12eb0349521e54c4271e337d2fa464ce2aca64ca51c414337a3711ae639" Mar 20 09:09:38.716361 master-0 kubenswrapper[18707]: I0320 09:09:38.716179 18707 scope.go:117] "RemoveContainer" containerID="9b66d44c659d2aeda44895720ee5568310584c0e8b0240ec21d2621b8e7d8f46" Mar 20 09:09:38.721115 master-0 kubenswrapper[18707]: I0320 09:09:38.721048 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:42.564143 master-0 kubenswrapper[18707]: I0320 09:09:42.564081 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 09:09:42.908434 master-0 kubenswrapper[18707]: I0320 09:09:42.908367 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:09:42.914237 master-0 kubenswrapper[18707]: W0320 09:09:42.914145 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b3cdd0a_7fd1_4aad_94e0_65bcc1921f1a.slice/crio-38e7a33da31f90f72f7dacba0e655efcef847fa008a63aebf6857aa3bdb25e6b WatchSource:0}: Error finding container 38e7a33da31f90f72f7dacba0e655efcef847fa008a63aebf6857aa3bdb25e6b: Status 404 returned error can't find the container with id 38e7a33da31f90f72f7dacba0e655efcef847fa008a63aebf6857aa3bdb25e6b Mar 20 09:09:42.946728 master-0 kubenswrapper[18707]: I0320 09:09:42.946621 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 09:09:42.977413 master-0 kubenswrapper[18707]: I0320 09:09:42.977358 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" Mar 20 09:09:43.753467 master-0 kubenswrapper[18707]: I0320 09:09:43.753383 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a","Type":"ContainerStarted","Data":"38e7a33da31f90f72f7dacba0e655efcef847fa008a63aebf6857aa3bdb25e6b"} Mar 20 09:09:44.775273 master-0 kubenswrapper[18707]: I0320 09:09:44.771029 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:09:44.801441 master-0 kubenswrapper[18707]: I0320 09:09:44.800001 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b3cdd0a-7fd1-4aad-94e0-65bcc1921f1a","Type":"ContainerStarted","Data":"f0559883c39c00af8319e19b10e2bbbdd0d4468d1a67fdf78c8c3fc5a662164e"} Mar 20 09:09:45.113211 master-0 kubenswrapper[18707]: I0320 09:09:45.112702 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f123c6c-5e29-43af-9378-b7f8ee76b81c" path="/var/lib/kubelet/pods/4f123c6c-5e29-43af-9378-b7f8ee76b81c/volumes" Mar 20 09:09:45.506897 master-0 kubenswrapper[18707]: I0320 09:09:45.506752 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:09:45.507435 master-0 kubenswrapper[18707]: E0320 09:09:45.507391 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5a931d-3951-482d-88a0-eefac071090a" containerName="nova-api-log" Mar 20 09:09:45.507435 master-0 kubenswrapper[18707]: I0320 09:09:45.507417 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5a931d-3951-482d-88a0-eefac071090a" containerName="nova-api-log" Mar 20 09:09:45.507543 master-0 kubenswrapper[18707]: E0320 09:09:45.507450 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5a931d-3951-482d-88a0-eefac071090a" containerName="nova-api-api" Mar 20 09:09:45.507543 master-0 kubenswrapper[18707]: I0320 09:09:45.507458 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5a931d-3951-482d-88a0-eefac071090a" containerName="nova-api-api" Mar 20 09:09:45.507543 master-0 kubenswrapper[18707]: E0320 09:09:45.507478 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d04a7e-7a88-496f-b2ed-65183508c0af" containerName="nova-scheduler-scheduler" Mar 20 09:09:45.507543 master-0 kubenswrapper[18707]: I0320 09:09:45.507487 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d04a7e-7a88-496f-b2ed-65183508c0af" containerName="nova-scheduler-scheduler" Mar 20 09:09:45.507758 master-0 kubenswrapper[18707]: I0320 09:09:45.507736 18707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5a931d-3951-482d-88a0-eefac071090a" containerName="nova-api-api" Mar 20 09:09:45.507808 master-0 kubenswrapper[18707]: I0320 09:09:45.507787 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5a931d-3951-482d-88a0-eefac071090a" containerName="nova-api-log" Mar 20 09:09:45.507843 master-0 kubenswrapper[18707]: I0320 09:09:45.507815 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d04a7e-7a88-496f-b2ed-65183508c0af" containerName="nova-scheduler-scheduler" Mar 20 09:09:45.509272 master-0 kubenswrapper[18707]: I0320 09:09:45.509236 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:09:45.511920 master-0 kubenswrapper[18707]: I0320 09:09:45.511860 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 09:09:45.513556 master-0 kubenswrapper[18707]: I0320 09:09:45.513513 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 09:09:45.673202 master-0 kubenswrapper[18707]: I0320 09:09:45.673113 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-64ff997757-xgs48" Mar 20 09:09:46.150050 master-0 kubenswrapper[18707]: I0320 09:09:46.149957 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-config-data\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.151081 master-0 kubenswrapper[18707]: I0320 09:09:46.150077 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-logs\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.151081 master-0 kubenswrapper[18707]: I0320 09:09:46.150120 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.151081 master-0 kubenswrapper[18707]: I0320 09:09:46.150228 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.151081 master-0 kubenswrapper[18707]: I0320 09:09:46.150298 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcjdg\" (UniqueName: \"kubernetes.io/projected/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-kube-api-access-lcjdg\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.253932 master-0 kubenswrapper[18707]: I0320 09:09:46.253852 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.254623 master-0 kubenswrapper[18707]: I0320 09:09:46.254572 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcjdg\" (UniqueName: 
\"kubernetes.io/projected/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-kube-api-access-lcjdg\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.255274 master-0 kubenswrapper[18707]: I0320 09:09:46.255216 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-config-data\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.255708 master-0 kubenswrapper[18707]: I0320 09:09:46.255660 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-logs\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.255977 master-0 kubenswrapper[18707]: I0320 09:09:46.255936 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.256324 master-0 kubenswrapper[18707]: I0320 09:09:46.256265 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-logs\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.270123 master-0 kubenswrapper[18707]: I0320 09:09:46.259041 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.270123 master-0 kubenswrapper[18707]: I0320 09:09:46.259101 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-config-data\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.270123 master-0 kubenswrapper[18707]: I0320 09:09:46.259517 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.335876 master-0 kubenswrapper[18707]: I0320 09:09:46.334512 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cc86b957c-nqjxf" podStartSLOduration=14.33447694 podStartE2EDuration="14.33447694s" podCreationTimestamp="2026-03-20 09:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:09:46.324515349 +0000 UTC m=+1731.480695715" watchObservedRunningTime="2026-03-20 09:09:46.33447694 +0000 UTC m=+1731.490657316" Mar 20 09:09:46.434901 master-0 kubenswrapper[18707]: I0320 09:09:46.434731 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:09:46.592163 master-0 kubenswrapper[18707]: I0320 09:09:46.592110 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcjdg\" (UniqueName: \"kubernetes.io/projected/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-kube-api-access-lcjdg\") pod \"nova-metadata-0\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " pod="openstack/nova-metadata-0" Mar 20 09:09:46.728961 master-0 kubenswrapper[18707]: 
I0320 09:09:46.728816 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:09:48.696138 master-0 kubenswrapper[18707]: I0320 09:09:48.696080 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:09:48.725476 master-0 kubenswrapper[18707]: I0320 09:09:48.725383 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:48.725476 master-0 kubenswrapper[18707]: I0320 09:09:48.725480 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:48.798564 master-0 kubenswrapper[18707]: I0320 09:09:48.798510 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:49.267009 master-0 kubenswrapper[18707]: I0320 09:09:49.266883 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a","Type":"ContainerStarted","Data":"13a16d3634815ccc54488a78dc7d418e72710f1e274875789138ecbeeb4db79e"} Mar 20 09:09:49.303430 master-0 kubenswrapper[18707]: I0320 09:09:49.303326 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 09:09:49.732456 master-0 kubenswrapper[18707]: I0320 09:09:49.732339 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=14.73232007 podStartE2EDuration="14.73232007s" podCreationTimestamp="2026-03-20 09:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:09:49.708936749 +0000 UTC m=+1734.865117105" watchObservedRunningTime="2026-03-20 09:09:49.73232007 +0000 UTC m=+1734.888500426" Mar 20 09:09:50.187192 master-0 
kubenswrapper[18707]: I0320 09:09:50.187097 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:09:50.282807 master-0 kubenswrapper[18707]: I0320 09:09:50.282725 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a","Type":"ContainerStarted","Data":"b524c9b66f9e6cf08139a92b4cf4c4961912ac8abc1448b58ebe3efd6a9f8075"} Mar 20 09:09:50.600480 master-0 kubenswrapper[18707]: I0320 09:09:50.600401 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:09:50.654212 master-0 kubenswrapper[18707]: I0320 09:09:50.653085 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 09:09:50.679696 master-0 kubenswrapper[18707]: I0320 09:09:50.678636 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:09:50.700287 master-0 kubenswrapper[18707]: I0320 09:09:50.695827 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 09:09:50.706281 master-0 kubenswrapper[18707]: I0320 09:09:50.705372 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b8f74d-3043-4582-a645-333b12c6e32b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") " pod="openstack/nova-api-0" Mar 20 09:09:50.706281 master-0 kubenswrapper[18707]: I0320 09:09:50.705433 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4b8f74d-3043-4582-a645-333b12c6e32b-logs\") pod \"nova-api-0\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") " pod="openstack/nova-api-0" Mar 20 09:09:50.706281 master-0 kubenswrapper[18707]: I0320 09:09:50.705477 18707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4b8f74d-3043-4582-a645-333b12c6e32b-config-data\") pod \"nova-api-0\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") " pod="openstack/nova-api-0" Mar 20 09:09:50.706281 master-0 kubenswrapper[18707]: I0320 09:09:50.705550 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ngcw\" (UniqueName: \"kubernetes.io/projected/c4b8f74d-3043-4582-a645-333b12c6e32b-kube-api-access-8ngcw\") pod \"nova-api-0\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") " pod="openstack/nova-api-0" Mar 20 09:09:50.710879 master-0 kubenswrapper[18707]: I0320 09:09:50.710832 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:09:50.770235 master-0 kubenswrapper[18707]: I0320 09:09:50.761297 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:09:50.790409 master-0 kubenswrapper[18707]: I0320 09:09:50.789713 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:09:50.819079 master-0 kubenswrapper[18707]: I0320 09:09:50.819030 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b8f74d-3043-4582-a645-333b12c6e32b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") " pod="openstack/nova-api-0" Mar 20 09:09:50.819357 master-0 kubenswrapper[18707]: I0320 09:09:50.819339 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4b8f74d-3043-4582-a645-333b12c6e32b-logs\") pod \"nova-api-0\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") " pod="openstack/nova-api-0" Mar 20 09:09:50.819478 master-0 kubenswrapper[18707]: I0320 09:09:50.819465 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4b8f74d-3043-4582-a645-333b12c6e32b-config-data\") pod \"nova-api-0\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") " pod="openstack/nova-api-0" Mar 20 09:09:50.824665 master-0 kubenswrapper[18707]: I0320 09:09:50.824637 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ngcw\" (UniqueName: \"kubernetes.io/projected/c4b8f74d-3043-4582-a645-333b12c6e32b-kube-api-access-8ngcw\") pod \"nova-api-0\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") " pod="openstack/nova-api-0" Mar 20 09:09:50.835758 master-0 kubenswrapper[18707]: I0320 09:09:50.835721 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b8f74d-3043-4582-a645-333b12c6e32b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") " pod="openstack/nova-api-0" Mar 20 09:09:50.842869 master-0 kubenswrapper[18707]: I0320 09:09:50.836245 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4b8f74d-3043-4582-a645-333b12c6e32b-logs\") pod \"nova-api-0\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") " pod="openstack/nova-api-0" Mar 20 09:09:50.843156 master-0 kubenswrapper[18707]: I0320 09:09:50.839733 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:09:50.843453 master-0 kubenswrapper[18707]: I0320 09:09:50.843417 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4b8f74d-3043-4582-a645-333b12c6e32b-config-data\") pod \"nova-api-0\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") " pod="openstack/nova-api-0" Mar 20 09:09:50.845170 master-0 kubenswrapper[18707]: I0320 09:09:50.845145 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:09:50.849419 master-0 kubenswrapper[18707]: I0320 09:09:50.849399 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 09:09:50.878146 master-0 kubenswrapper[18707]: I0320 09:09:50.877009 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:09:51.002346 master-0 kubenswrapper[18707]: I0320 09:09:50.999557 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1297de32-c43b-4300-942c-2fc5cbd197f9-config-data\") pod \"nova-scheduler-0\" (UID: \"1297de32-c43b-4300-942c-2fc5cbd197f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:09:51.002346 master-0 kubenswrapper[18707]: I0320 09:09:50.999675 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92h9p\" (UniqueName: \"kubernetes.io/projected/1297de32-c43b-4300-942c-2fc5cbd197f9-kube-api-access-92h9p\") pod \"nova-scheduler-0\" (UID: \"1297de32-c43b-4300-942c-2fc5cbd197f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:09:51.002346 master-0 kubenswrapper[18707]: I0320 09:09:50.999820 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1297de32-c43b-4300-942c-2fc5cbd197f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1297de32-c43b-4300-942c-2fc5cbd197f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:09:51.044105 master-0 kubenswrapper[18707]: I0320 09:09:51.043961 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ngcw\" (UniqueName: \"kubernetes.io/projected/c4b8f74d-3043-4582-a645-333b12c6e32b-kube-api-access-8ngcw\") pod \"nova-api-0\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") " pod="openstack/nova-api-0" Mar 20 
09:09:51.077973 master-0 kubenswrapper[18707]: I0320 09:09:51.074384 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75597d7cbf-d9w8g"] Mar 20 09:09:51.077973 master-0 kubenswrapper[18707]: I0320 09:09:51.074705 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" podUID="d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" containerName="dnsmasq-dns" containerID="cri-o://36c08af42bc218440f37751dca61d756d06093a6314c88456efff4cd6b669eb4" gracePeriod=10 Mar 20 09:09:51.106574 master-0 kubenswrapper[18707]: I0320 09:09:51.104827 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:09:51.106574 master-0 kubenswrapper[18707]: I0320 09:09:51.105107 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1297de32-c43b-4300-942c-2fc5cbd197f9-config-data\") pod \"nova-scheduler-0\" (UID: \"1297de32-c43b-4300-942c-2fc5cbd197f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:09:51.106574 master-0 kubenswrapper[18707]: I0320 09:09:51.105253 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92h9p\" (UniqueName: \"kubernetes.io/projected/1297de32-c43b-4300-942c-2fc5cbd197f9-kube-api-access-92h9p\") pod \"nova-scheduler-0\" (UID: \"1297de32-c43b-4300-942c-2fc5cbd197f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:09:51.106574 master-0 kubenswrapper[18707]: I0320 09:09:51.105467 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1297de32-c43b-4300-942c-2fc5cbd197f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1297de32-c43b-4300-942c-2fc5cbd197f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:09:51.131298 master-0 kubenswrapper[18707]: I0320 09:09:51.126999 18707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1297de32-c43b-4300-942c-2fc5cbd197f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1297de32-c43b-4300-942c-2fc5cbd197f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:09:51.140680 master-0 kubenswrapper[18707]: I0320 09:09:51.139625 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1297de32-c43b-4300-942c-2fc5cbd197f9-config-data\") pod \"nova-scheduler-0\" (UID: \"1297de32-c43b-4300-942c-2fc5cbd197f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:09:51.191623 master-0 kubenswrapper[18707]: I0320 09:09:51.186290 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92h9p\" (UniqueName: \"kubernetes.io/projected/1297de32-c43b-4300-942c-2fc5cbd197f9-kube-api-access-92h9p\") pod \"nova-scheduler-0\" (UID: \"1297de32-c43b-4300-942c-2fc5cbd197f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:09:51.200409 master-0 kubenswrapper[18707]: I0320 09:09:51.197500 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21d04a7e-7a88-496f-b2ed-65183508c0af" path="/var/lib/kubelet/pods/21d04a7e-7a88-496f-b2ed-65183508c0af/volumes" Mar 20 09:09:51.200409 master-0 kubenswrapper[18707]: I0320 09:09:51.198422 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5a931d-3951-482d-88a0-eefac071090a" path="/var/lib/kubelet/pods/ff5a931d-3951-482d-88a0-eefac071090a/volumes" Mar 20 09:09:51.377768 master-0 kubenswrapper[18707]: I0320 09:09:51.377698 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a","Type":"ContainerStarted","Data":"63c14ed08e319b62a1467190401ecc88231b554637467824da26ce44b98481b6"} Mar 20 09:09:51.384382 master-0 kubenswrapper[18707]: I0320 09:09:51.384159 18707 generic.go:334] "Generic (PLEG): container finished" 
podID="d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" containerID="36c08af42bc218440f37751dca61d756d06093a6314c88456efff4cd6b669eb4" exitCode=0 Mar 20 09:09:51.384382 master-0 kubenswrapper[18707]: I0320 09:09:51.384245 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" event={"ID":"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340","Type":"ContainerDied","Data":"36c08af42bc218440f37751dca61d756d06093a6314c88456efff4cd6b669eb4"} Mar 20 09:09:51.385212 master-0 kubenswrapper[18707]: I0320 09:09:51.385094 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:09:51.822602 master-0 kubenswrapper[18707]: I0320 09:09:51.820333 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=7.820302119 podStartE2EDuration="7.820302119s" podCreationTimestamp="2026-03-20 09:09:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:09:51.80404536 +0000 UTC m=+1736.960225726" watchObservedRunningTime="2026-03-20 09:09:51.820302119 +0000 UTC m=+1736.976482475" Mar 20 09:09:52.038120 master-0 kubenswrapper[18707]: I0320 09:09:52.038064 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" Mar 20 09:09:52.163835 master-0 kubenswrapper[18707]: I0320 09:09:52.163731 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-edpm-b\") pod \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " Mar 20 09:09:52.164129 master-0 kubenswrapper[18707]: I0320 09:09:52.163893 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-dns-swift-storage-0\") pod \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " Mar 20 09:09:52.164129 master-0 kubenswrapper[18707]: I0320 09:09:52.164106 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-ovsdbserver-nb\") pod \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " Mar 20 09:09:52.164297 master-0 kubenswrapper[18707]: I0320 09:09:52.164242 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggxp9\" (UniqueName: \"kubernetes.io/projected/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-kube-api-access-ggxp9\") pod \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " Mar 20 09:09:52.164297 master-0 kubenswrapper[18707]: I0320 09:09:52.164272 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-config\") pod \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " Mar 20 09:09:52.164388 master-0 kubenswrapper[18707]: I0320 09:09:52.164340 18707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-dns-svc\") pod \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " Mar 20 09:09:52.164388 master-0 kubenswrapper[18707]: I0320 09:09:52.164366 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-edpm-a\") pod \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " Mar 20 09:09:52.164476 master-0 kubenswrapper[18707]: I0320 09:09:52.164400 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-ovsdbserver-sb\") pod \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\" (UID: \"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340\") " Mar 20 09:09:52.179735 master-0 kubenswrapper[18707]: I0320 09:09:52.179606 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-kube-api-access-ggxp9" (OuterVolumeSpecName: "kube-api-access-ggxp9") pod "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" (UID: "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340"). InnerVolumeSpecName "kube-api-access-ggxp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:09:52.246460 master-0 kubenswrapper[18707]: I0320 09:09:52.244338 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-edpm-a" (OuterVolumeSpecName: "edpm-a") pod "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" (UID: "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340"). InnerVolumeSpecName "edpm-a". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:52.254682 master-0 kubenswrapper[18707]: I0320 09:09:52.254624 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-edpm-b" (OuterVolumeSpecName: "edpm-b") pod "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" (UID: "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340"). InnerVolumeSpecName "edpm-b". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:52.262768 master-0 kubenswrapper[18707]: I0320 09:09:52.262640 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-config" (OuterVolumeSpecName: "config") pod "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" (UID: "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:52.267103 master-0 kubenswrapper[18707]: I0320 09:09:52.267049 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" (UID: "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:52.267844 master-0 kubenswrapper[18707]: I0320 09:09:52.267812 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:52.267906 master-0 kubenswrapper[18707]: I0320 09:09:52.267845 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggxp9\" (UniqueName: \"kubernetes.io/projected/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-kube-api-access-ggxp9\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:52.267906 master-0 kubenswrapper[18707]: I0320 09:09:52.267859 18707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-config\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:52.267906 master-0 kubenswrapper[18707]: I0320 09:09:52.267870 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-a\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:52.267906 master-0 kubenswrapper[18707]: I0320 09:09:52.267879 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-b\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:52.273873 master-0 kubenswrapper[18707]: I0320 09:09:52.273806 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" (UID: "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:52.278558 master-0 kubenswrapper[18707]: I0320 09:09:52.278378 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" (UID: "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:52.293566 master-0 kubenswrapper[18707]: I0320 09:09:52.293514 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" (UID: "d9e9f7b0-8bf4-410a-8a1e-030abfbbf340"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:09:52.371180 master-0 kubenswrapper[18707]: I0320 09:09:52.371099 18707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:52.371180 master-0 kubenswrapper[18707]: I0320 09:09:52.371178 18707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:52.371180 master-0 kubenswrapper[18707]: I0320 09:09:52.371224 18707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:09:52.402381 master-0 kubenswrapper[18707]: I0320 09:09:52.402300 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g" 
event={"ID":"d9e9f7b0-8bf4-410a-8a1e-030abfbbf340","Type":"ContainerDied","Data":"fc50cdebf51cd7d051e634fd551cd14c4521e6a5b9d2081bfd63c41b1c0a6082"}
Mar 20 09:09:52.402381 master-0 kubenswrapper[18707]: I0320 09:09:52.402388 18707 scope.go:117] "RemoveContainer" containerID="36c08af42bc218440f37751dca61d756d06093a6314c88456efff4cd6b669eb4"
Mar 20 09:09:52.402687 master-0 kubenswrapper[18707]: I0320 09:09:52.402467 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75597d7cbf-d9w8g"
Mar 20 09:09:52.422965 master-0 kubenswrapper[18707]: I0320 09:09:52.422917 18707 scope.go:117] "RemoveContainer" containerID="61db0defc56b28a464d5616caeffb9183f0071b2d7a89a8035cb4767a9f835c8"
Mar 20 09:09:52.883497 master-0 kubenswrapper[18707]: I0320 09:09:52.883444 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-4h8z9"]
Mar 20 09:09:52.884105 master-0 kubenswrapper[18707]: E0320 09:09:52.884009 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" containerName="init"
Mar 20 09:09:52.884105 master-0 kubenswrapper[18707]: I0320 09:09:52.884025 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" containerName="init"
Mar 20 09:09:52.884105 master-0 kubenswrapper[18707]: E0320 09:09:52.884058 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" containerName="dnsmasq-dns"
Mar 20 09:09:52.884105 master-0 kubenswrapper[18707]: I0320 09:09:52.884064 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" containerName="dnsmasq-dns"
Mar 20 09:09:52.884347 master-0 kubenswrapper[18707]: I0320 09:09:52.884323 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" containerName="dnsmasq-dns"
Mar 20 09:09:52.885357 master-0 kubenswrapper[18707]: I0320 09:09:52.885317 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:52.887491 master-0 kubenswrapper[18707]: I0320 09:09:52.887457 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 20 09:09:52.887731 master-0 kubenswrapper[18707]: I0320 09:09:52.887698 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 20 09:09:52.986240 master-0 kubenswrapper[18707]: I0320 09:09:52.986150 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj2rj\" (UniqueName: \"kubernetes.io/projected/48343162-37bc-41e1-96cb-e5c2b7914c76-kube-api-access-bj2rj\") pod \"nova-cell1-cell-mapping-4h8z9\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") " pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:52.986240 master-0 kubenswrapper[18707]: I0320 09:09:52.986230 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4h8z9\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") " pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:52.986578 master-0 kubenswrapper[18707]: I0320 09:09:52.986422 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-scripts\") pod \"nova-cell1-cell-mapping-4h8z9\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") " pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:52.986578 master-0 kubenswrapper[18707]: I0320 09:09:52.986496 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-config-data\") pod \"nova-cell1-cell-mapping-4h8z9\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") " pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:53.091086 master-0 kubenswrapper[18707]: I0320 09:09:53.090967 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-scripts\") pod \"nova-cell1-cell-mapping-4h8z9\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") " pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:53.091334 master-0 kubenswrapper[18707]: I0320 09:09:53.091098 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-config-data\") pod \"nova-cell1-cell-mapping-4h8z9\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") " pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:53.091334 master-0 kubenswrapper[18707]: I0320 09:09:53.091253 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj2rj\" (UniqueName: \"kubernetes.io/projected/48343162-37bc-41e1-96cb-e5c2b7914c76-kube-api-access-bj2rj\") pod \"nova-cell1-cell-mapping-4h8z9\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") " pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:53.091334 master-0 kubenswrapper[18707]: I0320 09:09:53.091283 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4h8z9\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") " pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:53.101344 master-0 kubenswrapper[18707]: I0320 09:09:53.097319 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-config-data\") pod \"nova-cell1-cell-mapping-4h8z9\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") " pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:53.106734 master-0 kubenswrapper[18707]: I0320 09:09:53.106065 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4h8z9\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") " pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:53.148442 master-0 kubenswrapper[18707]: W0320 09:09:53.135481 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4b8f74d_3043_4582_a645_333b12c6e32b.slice/crio-c556cec17d5c2a19a3eb62a32a01e330f0d8fd0a4db4505c562f71b6561c1cd8 WatchSource:0}: Error finding container c556cec17d5c2a19a3eb62a32a01e330f0d8fd0a4db4505c562f71b6561c1cd8: Status 404 returned error can't find the container with id c556cec17d5c2a19a3eb62a32a01e330f0d8fd0a4db4505c562f71b6561c1cd8
Mar 20 09:09:53.148442 master-0 kubenswrapper[18707]: I0320 09:09:53.136436 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-scripts\") pod \"nova-cell1-cell-mapping-4h8z9\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") " pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:53.191981 master-0 kubenswrapper[18707]: I0320 09:09:53.191941 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj2rj\" (UniqueName: \"kubernetes.io/projected/48343162-37bc-41e1-96cb-e5c2b7914c76-kube-api-access-bj2rj\") pod \"nova-cell1-cell-mapping-4h8z9\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") " pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:53.196166 master-0 kubenswrapper[18707]: I0320 09:09:53.195365 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 09:09:53.210437 master-0 kubenswrapper[18707]: I0320 09:09:53.210258 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 09:09:53.249210 master-0 kubenswrapper[18707]: I0320 09:09:53.242217 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:09:53.269618 master-0 kubenswrapper[18707]: I0320 09:09:53.269525 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4h8z9"]
Mar 20 09:09:53.351651 master-0 kubenswrapper[18707]: I0320 09:09:53.351570 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75597d7cbf-d9w8g"]
Mar 20 09:09:53.382326 master-0 kubenswrapper[18707]: I0320 09:09:53.381581 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75597d7cbf-d9w8g"]
Mar 20 09:09:53.446388 master-0 kubenswrapper[18707]: I0320 09:09:53.446280 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4b8f74d-3043-4582-a645-333b12c6e32b","Type":"ContainerStarted","Data":"c556cec17d5c2a19a3eb62a32a01e330f0d8fd0a4db4505c562f71b6561c1cd8"}
Mar 20 09:09:53.453884 master-0 kubenswrapper[18707]: I0320 09:09:53.452956 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1297de32-c43b-4300-942c-2fc5cbd197f9","Type":"ContainerStarted","Data":"01a21a89ee874b93366b59cf363848c1d35f05f7e175c9ad67e730350080f05f"}
Mar 20 09:09:53.945723 master-0 kubenswrapper[18707]: I0320 09:09:53.945584 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4h8z9"]
Mar 20 09:09:54.466308 master-0 kubenswrapper[18707]: I0320 09:09:54.466240 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1297de32-c43b-4300-942c-2fc5cbd197f9","Type":"ContainerStarted","Data":"bb9e82a7de0e5a54084f101aa506a942c723b618c5aa6fddf0bf6079de9023d7"}
Mar 20 09:09:54.468801 master-0 kubenswrapper[18707]: I0320 09:09:54.468741 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4b8f74d-3043-4582-a645-333b12c6e32b","Type":"ContainerStarted","Data":"805241a2503d4483a46b854e5daec31e6fb1032139b609fcd9dcefe34730eebd"}
Mar 20 09:09:54.468939 master-0 kubenswrapper[18707]: I0320 09:09:54.468814 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4b8f74d-3043-4582-a645-333b12c6e32b","Type":"ContainerStarted","Data":"73eb01fede8e386bcdce6070c6565dc1db2764b6021c348df17ae684fb9a401d"}
Mar 20 09:09:54.470346 master-0 kubenswrapper[18707]: I0320 09:09:54.470298 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4h8z9" event={"ID":"48343162-37bc-41e1-96cb-e5c2b7914c76","Type":"ContainerStarted","Data":"8469ccfcd66b36fd78da4cb8fb0d53915013a1b482985cc685a4b5d6c7455171"}
Mar 20 09:09:54.470444 master-0 kubenswrapper[18707]: I0320 09:09:54.470349 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4h8z9" event={"ID":"48343162-37bc-41e1-96cb-e5c2b7914c76","Type":"ContainerStarted","Data":"d9757b5af5ae65ef4db4b7e0ffdff27431eab6f29a55f78cb0df55565e243b6d"}
Mar 20 09:09:54.507388 master-0 kubenswrapper[18707]: I0320 09:09:54.501413 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.501391794 podStartE2EDuration="4.501391794s" podCreationTimestamp="2026-03-20 09:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:09:54.493367537 +0000 UTC m=+1739.649547893" watchObservedRunningTime="2026-03-20 09:09:54.501391794 +0000 UTC m=+1739.657572150"
Mar 20 09:09:54.525409 master-0 kubenswrapper[18707]: I0320 09:09:54.525331 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-4h8z9" podStartSLOduration=3.5253118089999997 podStartE2EDuration="3.525311809s" podCreationTimestamp="2026-03-20 09:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:09:54.518863097 +0000 UTC m=+1739.675043453" watchObservedRunningTime="2026-03-20 09:09:54.525311809 +0000 UTC m=+1739.681492165"
Mar 20 09:09:54.594558 master-0 kubenswrapper[18707]: I0320 09:09:54.594475 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.594453062 podStartE2EDuration="4.594453062s" podCreationTimestamp="2026-03-20 09:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:09:54.572876403 +0000 UTC m=+1739.729056749" watchObservedRunningTime="2026-03-20 09:09:54.594453062 +0000 UTC m=+1739.750633418"
Mar 20 09:09:55.147896 master-0 kubenswrapper[18707]: I0320 09:09:55.147837 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9e9f7b0-8bf4-410a-8a1e-030abfbbf340" path="/var/lib/kubelet/pods/d9e9f7b0-8bf4-410a-8a1e-030abfbbf340/volumes"
Mar 20 09:09:55.514681 master-0 kubenswrapper[18707]: I0320 09:09:55.514526 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 09:09:56.385679 master-0 kubenswrapper[18707]: I0320 09:09:56.385606 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 20 09:09:56.500164 master-0 kubenswrapper[18707]: I0320 09:09:56.500083 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c4b8f74d-3043-4582-a645-333b12c6e32b" containerName="nova-api-log" containerID="cri-o://73eb01fede8e386bcdce6070c6565dc1db2764b6021c348df17ae684fb9a401d" gracePeriod=30
Mar 20 09:09:56.500164 master-0 kubenswrapper[18707]: I0320 09:09:56.500132 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c4b8f74d-3043-4582-a645-333b12c6e32b" containerName="nova-api-api" containerID="cri-o://805241a2503d4483a46b854e5daec31e6fb1032139b609fcd9dcefe34730eebd" gracePeriod=30
Mar 20 09:09:56.729758 master-0 kubenswrapper[18707]: I0320 09:09:56.729657 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 09:09:56.730003 master-0 kubenswrapper[18707]: I0320 09:09:56.729770 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 09:09:57.515597 master-0 kubenswrapper[18707]: I0320 09:09:57.515493 18707 generic.go:334] "Generic (PLEG): container finished" podID="c4b8f74d-3043-4582-a645-333b12c6e32b" containerID="73eb01fede8e386bcdce6070c6565dc1db2764b6021c348df17ae684fb9a401d" exitCode=143
Mar 20 09:09:57.516151 master-0 kubenswrapper[18707]: I0320 09:09:57.515590 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4b8f74d-3043-4582-a645-333b12c6e32b","Type":"ContainerDied","Data":"73eb01fede8e386bcdce6070c6565dc1db2764b6021c348df17ae684fb9a401d"}
Mar 20 09:09:57.772537 master-0 kubenswrapper[18707]: I0320 09:09:57.772420 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.3:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 09:09:57.772537 master-0 kubenswrapper[18707]: I0320 09:09:57.772429 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.3:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 09:09:59.559230 master-0 kubenswrapper[18707]: I0320 09:09:59.559143 18707 generic.go:334] "Generic (PLEG): container finished" podID="c4b8f74d-3043-4582-a645-333b12c6e32b" containerID="805241a2503d4483a46b854e5daec31e6fb1032139b609fcd9dcefe34730eebd" exitCode=0
Mar 20 09:09:59.560021 master-0 kubenswrapper[18707]: I0320 09:09:59.559237 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4b8f74d-3043-4582-a645-333b12c6e32b","Type":"ContainerDied","Data":"805241a2503d4483a46b854e5daec31e6fb1032139b609fcd9dcefe34730eebd"}
Mar 20 09:10:01.138334 master-0 kubenswrapper[18707]: I0320 09:10:01.138289 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 09:10:01.250089 master-0 kubenswrapper[18707]: I0320 09:10:01.250030 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b8f74d-3043-4582-a645-333b12c6e32b-combined-ca-bundle\") pod \"c4b8f74d-3043-4582-a645-333b12c6e32b\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") "
Mar 20 09:10:01.250522 master-0 kubenswrapper[18707]: I0320 09:10:01.250489 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4b8f74d-3043-4582-a645-333b12c6e32b-logs\") pod \"c4b8f74d-3043-4582-a645-333b12c6e32b\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") "
Mar 20 09:10:01.250840 master-0 kubenswrapper[18707]: I0320 09:10:01.250809 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4b8f74d-3043-4582-a645-333b12c6e32b-config-data\") pod \"c4b8f74d-3043-4582-a645-333b12c6e32b\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") "
Mar 20 09:10:01.251097 master-0 kubenswrapper[18707]: I0320 09:10:01.251073 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ngcw\" (UniqueName: \"kubernetes.io/projected/c4b8f74d-3043-4582-a645-333b12c6e32b-kube-api-access-8ngcw\") pod \"c4b8f74d-3043-4582-a645-333b12c6e32b\" (UID: \"c4b8f74d-3043-4582-a645-333b12c6e32b\") "
Mar 20 09:10:01.251275 master-0 kubenswrapper[18707]: I0320 09:10:01.250886 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4b8f74d-3043-4582-a645-333b12c6e32b-logs" (OuterVolumeSpecName: "logs") pod "c4b8f74d-3043-4582-a645-333b12c6e32b" (UID: "c4b8f74d-3043-4582-a645-333b12c6e32b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:10:01.252652 master-0 kubenswrapper[18707]: I0320 09:10:01.252622 18707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4b8f74d-3043-4582-a645-333b12c6e32b-logs\") on node \"master-0\" DevicePath \"\""
Mar 20 09:10:01.282110 master-0 kubenswrapper[18707]: I0320 09:10:01.281509 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4b8f74d-3043-4582-a645-333b12c6e32b-kube-api-access-8ngcw" (OuterVolumeSpecName: "kube-api-access-8ngcw") pod "c4b8f74d-3043-4582-a645-333b12c6e32b" (UID: "c4b8f74d-3043-4582-a645-333b12c6e32b"). InnerVolumeSpecName "kube-api-access-8ngcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:10:01.331259 master-0 kubenswrapper[18707]: I0320 09:10:01.331159 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4b8f74d-3043-4582-a645-333b12c6e32b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4b8f74d-3043-4582-a645-333b12c6e32b" (UID: "c4b8f74d-3043-4582-a645-333b12c6e32b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:10:01.336386 master-0 kubenswrapper[18707]: I0320 09:10:01.336308 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4b8f74d-3043-4582-a645-333b12c6e32b-config-data" (OuterVolumeSpecName: "config-data") pod "c4b8f74d-3043-4582-a645-333b12c6e32b" (UID: "c4b8f74d-3043-4582-a645-333b12c6e32b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:10:01.356474 master-0 kubenswrapper[18707]: I0320 09:10:01.356365 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4b8f74d-3043-4582-a645-333b12c6e32b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:10:01.356474 master-0 kubenswrapper[18707]: I0320 09:10:01.356462 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4b8f74d-3043-4582-a645-333b12c6e32b-config-data\") on node \"master-0\" DevicePath \"\""
Mar 20 09:10:01.356474 master-0 kubenswrapper[18707]: I0320 09:10:01.356479 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ngcw\" (UniqueName: \"kubernetes.io/projected/c4b8f74d-3043-4582-a645-333b12c6e32b-kube-api-access-8ngcw\") on node \"master-0\" DevicePath \"\""
Mar 20 09:10:01.596779 master-0 kubenswrapper[18707]: I0320 09:10:01.388790 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 20 09:10:01.597868 master-0 kubenswrapper[18707]: I0320 09:10:01.597779 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c4b8f74d-3043-4582-a645-333b12c6e32b","Type":"ContainerDied","Data":"c556cec17d5c2a19a3eb62a32a01e330f0d8fd0a4db4505c562f71b6561c1cd8"}
Mar 20 09:10:01.598004 master-0 kubenswrapper[18707]: I0320 09:10:01.597913 18707 scope.go:117] "RemoveContainer" containerID="805241a2503d4483a46b854e5daec31e6fb1032139b609fcd9dcefe34730eebd"
Mar 20 09:10:01.598239 master-0 kubenswrapper[18707]: I0320 09:10:01.598160 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 09:10:01.664929 master-0 kubenswrapper[18707]: I0320 09:10:01.644225 18707 scope.go:117] "RemoveContainer" containerID="73eb01fede8e386bcdce6070c6565dc1db2764b6021c348df17ae684fb9a401d"
Mar 20 09:10:01.664929 master-0 kubenswrapper[18707]: I0320 09:10:01.648609 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 20 09:10:01.698466 master-0 kubenswrapper[18707]: I0320 09:10:01.698398 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 20 09:10:04.728991 master-0 kubenswrapper[18707]: I0320 09:10:04.728910 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 09:10:04.730117 master-0 kubenswrapper[18707]: I0320 09:10:04.729646 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 09:10:07.738608 master-0 kubenswrapper[18707]: I0320 09:10:07.738501 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.3:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 09:10:07.739506 master-0 kubenswrapper[18707]: I0320 09:10:07.738530 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.3:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 09:10:09.076517 master-0 kubenswrapper[18707]: I0320 09:10:09.075960 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 09:10:10.099716 master-0 kubenswrapper[18707]: I0320 09:10:10.099582 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 09:10:11.114118 master-0 kubenswrapper[18707]: I0320 09:10:11.114028 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4b8f74d-3043-4582-a645-333b12c6e32b" path="/var/lib/kubelet/pods/c4b8f74d-3043-4582-a645-333b12c6e32b/volumes"
Mar 20 09:10:15.517242 master-0 kubenswrapper[18707]: I0320 09:10:15.517106 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 09:10:15.518362 master-0 kubenswrapper[18707]: E0320 09:10:15.517709 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b8f74d-3043-4582-a645-333b12c6e32b" containerName="nova-api-log"
Mar 20 09:10:15.518362 master-0 kubenswrapper[18707]: I0320 09:10:15.517725 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b8f74d-3043-4582-a645-333b12c6e32b" containerName="nova-api-log"
Mar 20 09:10:15.518362 master-0 kubenswrapper[18707]: E0320 09:10:15.517773 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b8f74d-3043-4582-a645-333b12c6e32b" containerName="nova-api-api"
Mar 20 09:10:15.518362 master-0 kubenswrapper[18707]: I0320 09:10:15.517780 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b8f74d-3043-4582-a645-333b12c6e32b" containerName="nova-api-api"
Mar 20 09:10:15.518362 master-0 kubenswrapper[18707]: I0320 09:10:15.518025 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b8f74d-3043-4582-a645-333b12c6e32b" containerName="nova-api-api"
Mar 20 09:10:15.518362 master-0 kubenswrapper[18707]: I0320 09:10:15.518051 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b8f74d-3043-4582-a645-333b12c6e32b" containerName="nova-api-log"
Mar 20 09:10:15.519857 master-0 kubenswrapper[18707]: I0320 09:10:15.519825 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 09:10:15.525809 master-0 kubenswrapper[18707]: I0320 09:10:15.525734 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 20 09:10:15.526468 master-0 kubenswrapper[18707]: I0320 09:10:15.526418 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 20 09:10:15.550612 master-0 kubenswrapper[18707]: I0320 09:10:15.550523 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.551226 master-0 kubenswrapper[18707]: I0320 09:10:15.550686 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqmcs\" (UniqueName: \"kubernetes.io/projected/837bb015-07a4-40ec-8588-7597a150a99c-kube-api-access-lqmcs\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.551226 master-0 kubenswrapper[18707]: I0320 09:10:15.550804 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-public-tls-certs\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.551226 master-0 kubenswrapper[18707]: I0320 09:10:15.550867 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.551226 master-0 kubenswrapper[18707]: I0320 09:10:15.550902 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-config-data\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.551226 master-0 kubenswrapper[18707]: I0320 09:10:15.550938 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837bb015-07a4-40ec-8588-7597a150a99c-logs\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.561869 master-0 kubenswrapper[18707]: I0320 09:10:15.561535 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 09:10:15.653862 master-0 kubenswrapper[18707]: I0320 09:10:15.653737 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqmcs\" (UniqueName: \"kubernetes.io/projected/837bb015-07a4-40ec-8588-7597a150a99c-kube-api-access-lqmcs\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.654402 master-0 kubenswrapper[18707]: I0320 09:10:15.654088 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-public-tls-certs\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.654402 master-0 kubenswrapper[18707]: I0320 09:10:15.654358 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.654546 master-0 kubenswrapper[18707]: I0320 09:10:15.654441 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-config-data\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.654546 master-0 kubenswrapper[18707]: I0320 09:10:15.654508 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837bb015-07a4-40ec-8588-7597a150a99c-logs\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.655009 master-0 kubenswrapper[18707]: I0320 09:10:15.654981 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.655170 master-0 kubenswrapper[18707]: I0320 09:10:15.655137 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837bb015-07a4-40ec-8588-7597a150a99c-logs\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.671197 master-0 kubenswrapper[18707]: I0320 09:10:15.671138 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-public-tls-certs\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.671918 master-0 kubenswrapper[18707]: I0320 09:10:15.671852 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.671987 master-0 kubenswrapper[18707]: I0320 09:10:15.671872 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-config-data\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:15.672532 master-0 kubenswrapper[18707]: I0320 09:10:15.672475 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:16.022217 master-0 kubenswrapper[18707]: I0320 09:10:16.007521 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 09:10:16.734864 master-0 kubenswrapper[18707]: I0320 09:10:16.734773 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 09:10:16.734864 master-0 kubenswrapper[18707]: I0320 09:10:16.734912 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 09:10:16.739132 master-0 kubenswrapper[18707]: I0320 09:10:16.739041 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 09:10:16.739643 master-0 kubenswrapper[18707]: I0320 09:10:16.739607 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 09:10:16.839434 master-0 kubenswrapper[18707]: I0320 09:10:16.839373 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqmcs\" (UniqueName: \"kubernetes.io/projected/837bb015-07a4-40ec-8588-7597a150a99c-kube-api-access-lqmcs\") pod \"nova-api-0\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " pod="openstack/nova-api-0"
Mar 20 09:10:17.048880 master-0 kubenswrapper[18707]: I0320 09:10:17.048656 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 09:10:18.842316 master-0 kubenswrapper[18707]: I0320 09:10:18.817383 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 09:10:18.903970 master-0 kubenswrapper[18707]: I0320 09:10:18.903810 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"837bb015-07a4-40ec-8588-7597a150a99c","Type":"ContainerStarted","Data":"f3274786ae8cead644eb8e804dedabad4dce33ac22de428763aa6e0c097a86d3"}
Mar 20 09:10:19.919555 master-0 kubenswrapper[18707]: I0320 09:10:19.919344 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"837bb015-07a4-40ec-8588-7597a150a99c","Type":"ContainerStarted","Data":"8c1cf568eb0fe34a3686363391b9338b6464092bbbe03c5eb1a757730002aed3"}
Mar 20 09:10:20.942008 master-0 kubenswrapper[18707]: I0320 09:10:20.941930 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"837bb015-07a4-40ec-8588-7597a150a99c","Type":"ContainerStarted","Data":"257c45d3bbbe846aac4e51856e841a917ef6940a7502335cdaf002cba0995846"}
Mar 20 09:10:24.688414 master-0 kubenswrapper[18707]: I0320 09:10:24.687531 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=12.6875046 podStartE2EDuration="12.6875046s" podCreationTimestamp="2026-03-20 09:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:10:24.6783159 +0000 UTC m=+1769.834496266" watchObservedRunningTime="2026-03-20 09:10:24.6875046 +0000 UTC m=+1769.843684976"
Mar 20 09:10:27.049465 master-0 kubenswrapper[18707]: I0320 09:10:27.049397 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 09:10:27.049465 master-0 kubenswrapper[18707]: I0320 09:10:27.049459 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 09:10:27.114636 master-0 kubenswrapper[18707]: I0320 09:10:27.054592 18707 generic.go:334] "Generic (PLEG): container finished" podID="48343162-37bc-41e1-96cb-e5c2b7914c76" containerID="8469ccfcd66b36fd78da4cb8fb0d53915013a1b482985cc685a4b5d6c7455171" exitCode=0
Mar 20 09:10:27.114636 master-0 kubenswrapper[18707]: I0320 09:10:27.054651 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4h8z9" event={"ID":"48343162-37bc-41e1-96cb-e5c2b7914c76","Type":"ContainerDied","Data":"8469ccfcd66b36fd78da4cb8fb0d53915013a1b482985cc685a4b5d6c7455171"}
Mar 20 09:10:28.064534 master-0 kubenswrapper[18707]: I0320 09:10:28.064436 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="837bb015-07a4-40ec-8588-7597a150a99c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.7:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 09:10:28.065317 master-0 kubenswrapper[18707]: I0320 09:10:28.064470 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="837bb015-07a4-40ec-8588-7597a150a99c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.7:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 09:10:28.535914 master-0 kubenswrapper[18707]: I0320 09:10:28.535846 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:10:29.100445 master-0 kubenswrapper[18707]: I0320 09:10:29.100373 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4h8z9"
Mar 20 09:10:29.116096 master-0 kubenswrapper[18707]: I0320 09:10:29.116041 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4h8z9" event={"ID":"48343162-37bc-41e1-96cb-e5c2b7914c76","Type":"ContainerDied","Data":"d9757b5af5ae65ef4db4b7e0ffdff27431eab6f29a55f78cb0df55565e243b6d"}
Mar 20 09:10:29.116096 master-0 kubenswrapper[18707]: I0320 09:10:29.116086 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9757b5af5ae65ef4db4b7e0ffdff27431eab6f29a55f78cb0df55565e243b6d"
Mar 20 09:10:29.234616 master-0 kubenswrapper[18707]: I0320 09:10:29.234549 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-combined-ca-bundle\") pod \"48343162-37bc-41e1-96cb-e5c2b7914c76\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") "
Mar 20 09:10:29.234875 master-0 kubenswrapper[18707]: I0320 09:10:29.234779 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-config-data\") pod \"48343162-37bc-41e1-96cb-e5c2b7914c76\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") "
Mar 20 09:10:29.235015 master-0 kubenswrapper[18707]: I0320 09:10:29.234987 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-scripts\") pod \"48343162-37bc-41e1-96cb-e5c2b7914c76\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") "
Mar 20 09:10:29.235082 master-0 kubenswrapper[18707]: I0320 09:10:29.235020 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj2rj\" (UniqueName: \"kubernetes.io/projected/48343162-37bc-41e1-96cb-e5c2b7914c76-kube-api-access-bj2rj\") pod \"48343162-37bc-41e1-96cb-e5c2b7914c76\" (UID: \"48343162-37bc-41e1-96cb-e5c2b7914c76\") "
Mar 20 09:10:29.245641 master-0 kubenswrapper[18707]: I0320 09:10:29.245554 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48343162-37bc-41e1-96cb-e5c2b7914c76-kube-api-access-bj2rj" (OuterVolumeSpecName: "kube-api-access-bj2rj") pod "48343162-37bc-41e1-96cb-e5c2b7914c76" (UID: "48343162-37bc-41e1-96cb-e5c2b7914c76"). InnerVolumeSpecName "kube-api-access-bj2rj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:10:29.250579 master-0 kubenswrapper[18707]: I0320 09:10:29.250516 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-scripts" (OuterVolumeSpecName: "scripts") pod "48343162-37bc-41e1-96cb-e5c2b7914c76" (UID: "48343162-37bc-41e1-96cb-e5c2b7914c76"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:10:29.264806 master-0 kubenswrapper[18707]: I0320 09:10:29.264734 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-config-data" (OuterVolumeSpecName: "config-data") pod "48343162-37bc-41e1-96cb-e5c2b7914c76" (UID: "48343162-37bc-41e1-96cb-e5c2b7914c76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:10:29.273356 master-0 kubenswrapper[18707]: I0320 09:10:29.273315 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48343162-37bc-41e1-96cb-e5c2b7914c76" (UID: "48343162-37bc-41e1-96cb-e5c2b7914c76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:10:29.341586 master-0 kubenswrapper[18707]: I0320 09:10:29.339859 18707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-scripts\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:29.341586 master-0 kubenswrapper[18707]: I0320 09:10:29.340218 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj2rj\" (UniqueName: \"kubernetes.io/projected/48343162-37bc-41e1-96cb-e5c2b7914c76-kube-api-access-bj2rj\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:29.341586 master-0 kubenswrapper[18707]: I0320 09:10:29.340238 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:29.341586 master-0 kubenswrapper[18707]: I0320 09:10:29.340311 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/48343162-37bc-41e1-96cb-e5c2b7914c76-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:32.779388 master-0 kubenswrapper[18707]: I0320 09:10:32.779265 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:10:32.784456 master-0 kubenswrapper[18707]: I0320 09:10:32.779597 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="837bb015-07a4-40ec-8588-7597a150a99c" containerName="nova-api-log" containerID="cri-o://8c1cf568eb0fe34a3686363391b9338b6464092bbbe03c5eb1a757730002aed3" gracePeriod=30 Mar 20 09:10:32.784456 master-0 kubenswrapper[18707]: I0320 09:10:32.779694 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="837bb015-07a4-40ec-8588-7597a150a99c" containerName="nova-api-api" containerID="cri-o://257c45d3bbbe846aac4e51856e841a917ef6940a7502335cdaf002cba0995846" gracePeriod=30 Mar 20 09:10:32.902849 master-0 kubenswrapper[18707]: I0320 09:10:32.902796 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:10:32.903080 master-0 kubenswrapper[18707]: I0320 09:10:32.903029 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1297de32-c43b-4300-942c-2fc5cbd197f9" containerName="nova-scheduler-scheduler" containerID="cri-o://bb9e82a7de0e5a54084f101aa506a942c723b618c5aa6fddf0bf6079de9023d7" gracePeriod=30 Mar 20 09:10:33.036494 master-0 kubenswrapper[18707]: I0320 09:10:33.036376 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:10:33.036677 master-0 kubenswrapper[18707]: I0320 09:10:33.036614 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerName="nova-metadata-log" 
containerID="cri-o://b524c9b66f9e6cf08139a92b4cf4c4961912ac8abc1448b58ebe3efd6a9f8075" gracePeriod=30 Mar 20 09:10:33.036831 master-0 kubenswrapper[18707]: I0320 09:10:33.036741 18707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerName="nova-metadata-metadata" containerID="cri-o://63c14ed08e319b62a1467190401ecc88231b554637467824da26ce44b98481b6" gracePeriod=30 Mar 20 09:10:33.185369 master-0 kubenswrapper[18707]: I0320 09:10:33.185303 18707 generic.go:334] "Generic (PLEG): container finished" podID="837bb015-07a4-40ec-8588-7597a150a99c" containerID="8c1cf568eb0fe34a3686363391b9338b6464092bbbe03c5eb1a757730002aed3" exitCode=143 Mar 20 09:10:33.185586 master-0 kubenswrapper[18707]: I0320 09:10:33.185386 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"837bb015-07a4-40ec-8588-7597a150a99c","Type":"ContainerDied","Data":"8c1cf568eb0fe34a3686363391b9338b6464092bbbe03c5eb1a757730002aed3"} Mar 20 09:10:33.187587 master-0 kubenswrapper[18707]: I0320 09:10:33.187548 18707 generic.go:334] "Generic (PLEG): container finished" podID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerID="b524c9b66f9e6cf08139a92b4cf4c4961912ac8abc1448b58ebe3efd6a9f8075" exitCode=143 Mar 20 09:10:33.187587 master-0 kubenswrapper[18707]: I0320 09:10:33.187580 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a","Type":"ContainerDied","Data":"b524c9b66f9e6cf08139a92b4cf4c4961912ac8abc1448b58ebe3efd6a9f8075"} Mar 20 09:10:35.051767 master-0 kubenswrapper[18707]: I0320 09:10:35.051704 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 09:10:35.051767 master-0 kubenswrapper[18707]: I0320 09:10:35.051772 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 
09:10:35.213796 master-0 kubenswrapper[18707]: I0320 09:10:35.213731 18707 generic.go:334] "Generic (PLEG): container finished" podID="1297de32-c43b-4300-942c-2fc5cbd197f9" containerID="bb9e82a7de0e5a54084f101aa506a942c723b618c5aa6fddf0bf6079de9023d7" exitCode=0 Mar 20 09:10:35.213796 master-0 kubenswrapper[18707]: I0320 09:10:35.213790 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1297de32-c43b-4300-942c-2fc5cbd197f9","Type":"ContainerDied","Data":"bb9e82a7de0e5a54084f101aa506a942c723b618c5aa6fddf0bf6079de9023d7"} Mar 20 09:10:35.214057 master-0 kubenswrapper[18707]: I0320 09:10:35.213821 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1297de32-c43b-4300-942c-2fc5cbd197f9","Type":"ContainerDied","Data":"01a21a89ee874b93366b59cf363848c1d35f05f7e175c9ad67e730350080f05f"} Mar 20 09:10:35.214057 master-0 kubenswrapper[18707]: I0320 09:10:35.213835 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01a21a89ee874b93366b59cf363848c1d35f05f7e175c9ad67e730350080f05f" Mar 20 09:10:35.283408 master-0 kubenswrapper[18707]: I0320 09:10:35.283341 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:10:35.419632 master-0 kubenswrapper[18707]: I0320 09:10:35.419230 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1297de32-c43b-4300-942c-2fc5cbd197f9-combined-ca-bundle\") pod \"1297de32-c43b-4300-942c-2fc5cbd197f9\" (UID: \"1297de32-c43b-4300-942c-2fc5cbd197f9\") " Mar 20 09:10:35.419632 master-0 kubenswrapper[18707]: I0320 09:10:35.419372 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92h9p\" (UniqueName: \"kubernetes.io/projected/1297de32-c43b-4300-942c-2fc5cbd197f9-kube-api-access-92h9p\") pod \"1297de32-c43b-4300-942c-2fc5cbd197f9\" (UID: \"1297de32-c43b-4300-942c-2fc5cbd197f9\") " Mar 20 09:10:35.419632 master-0 kubenswrapper[18707]: I0320 09:10:35.419480 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1297de32-c43b-4300-942c-2fc5cbd197f9-config-data\") pod \"1297de32-c43b-4300-942c-2fc5cbd197f9\" (UID: \"1297de32-c43b-4300-942c-2fc5cbd197f9\") " Mar 20 09:10:35.423850 master-0 kubenswrapper[18707]: I0320 09:10:35.423815 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1297de32-c43b-4300-942c-2fc5cbd197f9-kube-api-access-92h9p" (OuterVolumeSpecName: "kube-api-access-92h9p") pod "1297de32-c43b-4300-942c-2fc5cbd197f9" (UID: "1297de32-c43b-4300-942c-2fc5cbd197f9"). InnerVolumeSpecName "kube-api-access-92h9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:10:35.462416 master-0 kubenswrapper[18707]: I0320 09:10:35.462341 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1297de32-c43b-4300-942c-2fc5cbd197f9-config-data" (OuterVolumeSpecName: "config-data") pod "1297de32-c43b-4300-942c-2fc5cbd197f9" (UID: "1297de32-c43b-4300-942c-2fc5cbd197f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:10:35.477871 master-0 kubenswrapper[18707]: I0320 09:10:35.477804 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1297de32-c43b-4300-942c-2fc5cbd197f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1297de32-c43b-4300-942c-2fc5cbd197f9" (UID: "1297de32-c43b-4300-942c-2fc5cbd197f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:10:35.522764 master-0 kubenswrapper[18707]: I0320 09:10:35.522697 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92h9p\" (UniqueName: \"kubernetes.io/projected/1297de32-c43b-4300-942c-2fc5cbd197f9-kube-api-access-92h9p\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:35.523085 master-0 kubenswrapper[18707]: I0320 09:10:35.523067 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1297de32-c43b-4300-942c-2fc5cbd197f9-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:35.523175 master-0 kubenswrapper[18707]: I0320 09:10:35.523163 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1297de32-c43b-4300-942c-2fc5cbd197f9-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:36.239688 master-0 kubenswrapper[18707]: I0320 09:10:36.239076 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:10:36.390567 master-0 kubenswrapper[18707]: I0320 09:10:36.390497 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:10:36.408293 master-0 kubenswrapper[18707]: I0320 09:10:36.407162 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:10:36.661079 master-0 kubenswrapper[18707]: I0320 09:10:36.660997 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:10:36.661639 master-0 kubenswrapper[18707]: E0320 09:10:36.661594 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48343162-37bc-41e1-96cb-e5c2b7914c76" containerName="nova-manage" Mar 20 09:10:36.661639 master-0 kubenswrapper[18707]: I0320 09:10:36.661617 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="48343162-37bc-41e1-96cb-e5c2b7914c76" containerName="nova-manage" Mar 20 09:10:36.661727 master-0 kubenswrapper[18707]: E0320 09:10:36.661644 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1297de32-c43b-4300-942c-2fc5cbd197f9" containerName="nova-scheduler-scheduler" Mar 20 09:10:36.661727 master-0 kubenswrapper[18707]: I0320 09:10:36.661652 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1297de32-c43b-4300-942c-2fc5cbd197f9" containerName="nova-scheduler-scheduler" Mar 20 09:10:36.661999 master-0 kubenswrapper[18707]: I0320 09:10:36.661968 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="48343162-37bc-41e1-96cb-e5c2b7914c76" containerName="nova-manage" Mar 20 09:10:36.662051 master-0 kubenswrapper[18707]: I0320 09:10:36.662014 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1297de32-c43b-4300-942c-2fc5cbd197f9" containerName="nova-scheduler-scheduler" Mar 20 09:10:36.663075 master-0 kubenswrapper[18707]: I0320 09:10:36.663035 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:10:36.667947 master-0 kubenswrapper[18707]: I0320 09:10:36.667753 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 09:10:36.723226 master-0 kubenswrapper[18707]: I0320 09:10:36.722370 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:10:36.760725 master-0 kubenswrapper[18707]: I0320 09:10:36.760645 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpctf\" (UniqueName: \"kubernetes.io/projected/5917ff52-41bc-4b19-9727-75016262c3f9-kube-api-access-zpctf\") pod \"nova-scheduler-0\" (UID: \"5917ff52-41bc-4b19-9727-75016262c3f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:10:36.760928 master-0 kubenswrapper[18707]: I0320 09:10:36.760752 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5917ff52-41bc-4b19-9727-75016262c3f9-config-data\") pod \"nova-scheduler-0\" (UID: \"5917ff52-41bc-4b19-9727-75016262c3f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:10:36.760928 master-0 kubenswrapper[18707]: I0320 09:10:36.760852 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5917ff52-41bc-4b19-9727-75016262c3f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5917ff52-41bc-4b19-9727-75016262c3f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:10:36.818552 master-0 kubenswrapper[18707]: I0320 09:10:36.818505 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:10:36.862624 master-0 kubenswrapper[18707]: I0320 09:10:36.862577 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcjdg\" (UniqueName: \"kubernetes.io/projected/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-kube-api-access-lcjdg\") pod \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " Mar 20 09:10:36.862949 master-0 kubenswrapper[18707]: I0320 09:10:36.862931 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-config-data\") pod \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " Mar 20 09:10:36.863324 master-0 kubenswrapper[18707]: I0320 09:10:36.863254 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-nova-metadata-tls-certs\") pod \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " Mar 20 09:10:36.863690 master-0 kubenswrapper[18707]: I0320 09:10:36.863669 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-logs\") pod \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " Mar 20 09:10:36.863992 master-0 kubenswrapper[18707]: I0320 09:10:36.863970 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-combined-ca-bundle\") pod \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\" (UID: \"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a\") " Mar 20 09:10:36.864125 master-0 kubenswrapper[18707]: I0320 09:10:36.864068 18707 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-logs" (OuterVolumeSpecName: "logs") pod "3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" (UID: "3578d407-c1f9-45e5-a8b3-bb6b7ed5071a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:10:36.864863 master-0 kubenswrapper[18707]: I0320 09:10:36.864838 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpctf\" (UniqueName: \"kubernetes.io/projected/5917ff52-41bc-4b19-9727-75016262c3f9-kube-api-access-zpctf\") pod \"nova-scheduler-0\" (UID: \"5917ff52-41bc-4b19-9727-75016262c3f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:10:36.865060 master-0 kubenswrapper[18707]: I0320 09:10:36.865040 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5917ff52-41bc-4b19-9727-75016262c3f9-config-data\") pod \"nova-scheduler-0\" (UID: \"5917ff52-41bc-4b19-9727-75016262c3f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:10:36.865309 master-0 kubenswrapper[18707]: I0320 09:10:36.865288 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5917ff52-41bc-4b19-9727-75016262c3f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5917ff52-41bc-4b19-9727-75016262c3f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:10:36.865780 master-0 kubenswrapper[18707]: I0320 09:10:36.865731 18707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-logs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:36.867458 master-0 kubenswrapper[18707]: I0320 09:10:36.867401 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-kube-api-access-lcjdg" 
(OuterVolumeSpecName: "kube-api-access-lcjdg") pod "3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" (UID: "3578d407-c1f9-45e5-a8b3-bb6b7ed5071a"). InnerVolumeSpecName "kube-api-access-lcjdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:10:36.868872 master-0 kubenswrapper[18707]: I0320 09:10:36.868837 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5917ff52-41bc-4b19-9727-75016262c3f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5917ff52-41bc-4b19-9727-75016262c3f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:10:36.870220 master-0 kubenswrapper[18707]: I0320 09:10:36.870135 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5917ff52-41bc-4b19-9727-75016262c3f9-config-data\") pod \"nova-scheduler-0\" (UID: \"5917ff52-41bc-4b19-9727-75016262c3f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:10:36.897940 master-0 kubenswrapper[18707]: I0320 09:10:36.897858 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" (UID: "3578d407-c1f9-45e5-a8b3-bb6b7ed5071a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:10:36.917828 master-0 kubenswrapper[18707]: I0320 09:10:36.914438 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-config-data" (OuterVolumeSpecName: "config-data") pod "3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" (UID: "3578d407-c1f9-45e5-a8b3-bb6b7ed5071a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:10:36.927785 master-0 kubenswrapper[18707]: I0320 09:10:36.927692 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" (UID: "3578d407-c1f9-45e5-a8b3-bb6b7ed5071a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:10:36.968974 master-0 kubenswrapper[18707]: I0320 09:10:36.967295 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpctf\" (UniqueName: \"kubernetes.io/projected/5917ff52-41bc-4b19-9727-75016262c3f9-kube-api-access-zpctf\") pod \"nova-scheduler-0\" (UID: \"5917ff52-41bc-4b19-9727-75016262c3f9\") " pod="openstack/nova-scheduler-0" Mar 20 09:10:36.973829 master-0 kubenswrapper[18707]: I0320 09:10:36.973765 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:36.973829 master-0 kubenswrapper[18707]: I0320 09:10:36.973823 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcjdg\" (UniqueName: \"kubernetes.io/projected/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-kube-api-access-lcjdg\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:36.974024 master-0 kubenswrapper[18707]: I0320 09:10:36.973844 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:36.974024 master-0 kubenswrapper[18707]: I0320 09:10:36.973857 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:36.991975 master-0 kubenswrapper[18707]: I0320 09:10:36.991911 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 09:10:37.130511 master-0 kubenswrapper[18707]: I0320 09:10:37.130417 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1297de32-c43b-4300-942c-2fc5cbd197f9" path="/var/lib/kubelet/pods/1297de32-c43b-4300-942c-2fc5cbd197f9/volumes" Mar 20 09:10:37.284157 master-0 kubenswrapper[18707]: I0320 09:10:37.284082 18707 generic.go:334] "Generic (PLEG): container finished" podID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerID="63c14ed08e319b62a1467190401ecc88231b554637467824da26ce44b98481b6" exitCode=0 Mar 20 09:10:37.284157 master-0 kubenswrapper[18707]: I0320 09:10:37.284154 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a","Type":"ContainerDied","Data":"63c14ed08e319b62a1467190401ecc88231b554637467824da26ce44b98481b6"} Mar 20 09:10:37.284776 master-0 kubenswrapper[18707]: I0320 09:10:37.284213 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3578d407-c1f9-45e5-a8b3-bb6b7ed5071a","Type":"ContainerDied","Data":"13a16d3634815ccc54488a78dc7d418e72710f1e274875789138ecbeeb4db79e"} Mar 20 09:10:37.284776 master-0 kubenswrapper[18707]: I0320 09:10:37.284238 18707 scope.go:117] "RemoveContainer" containerID="63c14ed08e319b62a1467190401ecc88231b554637467824da26ce44b98481b6" Mar 20 09:10:37.284776 master-0 kubenswrapper[18707]: I0320 09:10:37.284423 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:10:37.361795 master-0 kubenswrapper[18707]: I0320 09:10:37.361461 18707 scope.go:117] "RemoveContainer" containerID="b524c9b66f9e6cf08139a92b4cf4c4961912ac8abc1448b58ebe3efd6a9f8075" Mar 20 09:10:37.388121 master-0 kubenswrapper[18707]: I0320 09:10:37.388070 18707 scope.go:117] "RemoveContainer" containerID="63c14ed08e319b62a1467190401ecc88231b554637467824da26ce44b98481b6" Mar 20 09:10:37.388944 master-0 kubenswrapper[18707]: E0320 09:10:37.388872 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63c14ed08e319b62a1467190401ecc88231b554637467824da26ce44b98481b6\": container with ID starting with 63c14ed08e319b62a1467190401ecc88231b554637467824da26ce44b98481b6 not found: ID does not exist" containerID="63c14ed08e319b62a1467190401ecc88231b554637467824da26ce44b98481b6" Mar 20 09:10:37.389025 master-0 kubenswrapper[18707]: I0320 09:10:37.388955 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63c14ed08e319b62a1467190401ecc88231b554637467824da26ce44b98481b6"} err="failed to get container status \"63c14ed08e319b62a1467190401ecc88231b554637467824da26ce44b98481b6\": rpc error: code = NotFound desc = could not find container \"63c14ed08e319b62a1467190401ecc88231b554637467824da26ce44b98481b6\": container with ID starting with 63c14ed08e319b62a1467190401ecc88231b554637467824da26ce44b98481b6 not found: ID does not exist" Mar 20 09:10:37.389025 master-0 kubenswrapper[18707]: I0320 09:10:37.388999 18707 scope.go:117] "RemoveContainer" containerID="b524c9b66f9e6cf08139a92b4cf4c4961912ac8abc1448b58ebe3efd6a9f8075" Mar 20 09:10:37.389737 master-0 kubenswrapper[18707]: E0320 09:10:37.389697 18707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b524c9b66f9e6cf08139a92b4cf4c4961912ac8abc1448b58ebe3efd6a9f8075\": 
container with ID starting with b524c9b66f9e6cf08139a92b4cf4c4961912ac8abc1448b58ebe3efd6a9f8075 not found: ID does not exist" containerID="b524c9b66f9e6cf08139a92b4cf4c4961912ac8abc1448b58ebe3efd6a9f8075" Mar 20 09:10:37.389814 master-0 kubenswrapper[18707]: I0320 09:10:37.389733 18707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b524c9b66f9e6cf08139a92b4cf4c4961912ac8abc1448b58ebe3efd6a9f8075"} err="failed to get container status \"b524c9b66f9e6cf08139a92b4cf4c4961912ac8abc1448b58ebe3efd6a9f8075\": rpc error: code = NotFound desc = could not find container \"b524c9b66f9e6cf08139a92b4cf4c4961912ac8abc1448b58ebe3efd6a9f8075\": container with ID starting with b524c9b66f9e6cf08139a92b4cf4c4961912ac8abc1448b58ebe3efd6a9f8075 not found: ID does not exist" Mar 20 09:10:37.514293 master-0 kubenswrapper[18707]: I0320 09:10:37.514087 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:10:37.620348 master-0 kubenswrapper[18707]: I0320 09:10:37.620270 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:10:37.755917 master-0 kubenswrapper[18707]: I0320 09:10:37.755809 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:10:37.756729 master-0 kubenswrapper[18707]: E0320 09:10:37.756682 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerName="nova-metadata-log" Mar 20 09:10:37.756729 master-0 kubenswrapper[18707]: I0320 09:10:37.756718 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerName="nova-metadata-log" Mar 20 09:10:37.756866 master-0 kubenswrapper[18707]: E0320 09:10:37.756817 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerName="nova-metadata-metadata" Mar 20 09:10:37.756866 master-0 
kubenswrapper[18707]: I0320 09:10:37.756826 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerName="nova-metadata-metadata" Mar 20 09:10:37.757416 master-0 kubenswrapper[18707]: I0320 09:10:37.757377 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerName="nova-metadata-metadata" Mar 20 09:10:37.757489 master-0 kubenswrapper[18707]: I0320 09:10:37.757419 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" containerName="nova-metadata-log" Mar 20 09:10:37.759867 master-0 kubenswrapper[18707]: I0320 09:10:37.759827 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:10:37.762558 master-0 kubenswrapper[18707]: I0320 09:10:37.762526 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 09:10:37.763741 master-0 kubenswrapper[18707]: I0320 09:10:37.763648 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 09:10:37.781696 master-0 kubenswrapper[18707]: I0320 09:10:37.781610 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afda99a5-2da6-4e38-a7f3-ff85ca74752c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:37.781869 master-0 kubenswrapper[18707]: I0320 09:10:37.781769 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/afda99a5-2da6-4e38-a7f3-ff85ca74752c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 
20 09:10:37.781998 master-0 kubenswrapper[18707]: I0320 09:10:37.781961 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsncq\" (UniqueName: \"kubernetes.io/projected/afda99a5-2da6-4e38-a7f3-ff85ca74752c-kube-api-access-jsncq\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:37.782074 master-0 kubenswrapper[18707]: I0320 09:10:37.782047 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afda99a5-2da6-4e38-a7f3-ff85ca74752c-config-data\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:37.782652 master-0 kubenswrapper[18707]: I0320 09:10:37.782552 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afda99a5-2da6-4e38-a7f3-ff85ca74752c-logs\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:37.832323 master-0 kubenswrapper[18707]: I0320 09:10:37.832251 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 09:10:37.885813 master-0 kubenswrapper[18707]: I0320 09:10:37.885692 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsncq\" (UniqueName: \"kubernetes.io/projected/afda99a5-2da6-4e38-a7f3-ff85ca74752c-kube-api-access-jsncq\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:37.886075 master-0 kubenswrapper[18707]: I0320 09:10:37.885931 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afda99a5-2da6-4e38-a7f3-ff85ca74752c-config-data\") pod 
\"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:37.899145 master-0 kubenswrapper[18707]: I0320 09:10:37.887130 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afda99a5-2da6-4e38-a7f3-ff85ca74752c-logs\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:37.899145 master-0 kubenswrapper[18707]: I0320 09:10:37.887382 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afda99a5-2da6-4e38-a7f3-ff85ca74752c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:37.899145 master-0 kubenswrapper[18707]: I0320 09:10:37.887482 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/afda99a5-2da6-4e38-a7f3-ff85ca74752c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:37.899145 master-0 kubenswrapper[18707]: I0320 09:10:37.888177 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afda99a5-2da6-4e38-a7f3-ff85ca74752c-logs\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:37.899145 master-0 kubenswrapper[18707]: I0320 09:10:37.891139 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afda99a5-2da6-4e38-a7f3-ff85ca74752c-config-data\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:37.899145 master-0 kubenswrapper[18707]: 
I0320 09:10:37.891850 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afda99a5-2da6-4e38-a7f3-ff85ca74752c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:37.899145 master-0 kubenswrapper[18707]: I0320 09:10:37.893952 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/afda99a5-2da6-4e38-a7f3-ff85ca74752c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:38.018718 master-0 kubenswrapper[18707]: I0320 09:10:38.018609 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:10:38.095275 master-0 kubenswrapper[18707]: I0320 09:10:38.093402 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsncq\" (UniqueName: \"kubernetes.io/projected/afda99a5-2da6-4e38-a7f3-ff85ca74752c-kube-api-access-jsncq\") pod \"nova-metadata-0\" (UID: \"afda99a5-2da6-4e38-a7f3-ff85ca74752c\") " pod="openstack/nova-metadata-0" Mar 20 09:10:38.320602 master-0 kubenswrapper[18707]: I0320 09:10:38.320516 18707 generic.go:334] "Generic (PLEG): container finished" podID="837bb015-07a4-40ec-8588-7597a150a99c" containerID="257c45d3bbbe846aac4e51856e841a917ef6940a7502335cdaf002cba0995846" exitCode=0 Mar 20 09:10:38.321127 master-0 kubenswrapper[18707]: I0320 09:10:38.320640 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"837bb015-07a4-40ec-8588-7597a150a99c","Type":"ContainerDied","Data":"257c45d3bbbe846aac4e51856e841a917ef6940a7502335cdaf002cba0995846"} Mar 20 09:10:38.328199 master-0 kubenswrapper[18707]: I0320 09:10:38.328140 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"5917ff52-41bc-4b19-9727-75016262c3f9","Type":"ContainerStarted","Data":"89e9249870714eb82d17a766c90b633a15034094bc5be41a03f71bae073c3294"} Mar 20 09:10:38.328300 master-0 kubenswrapper[18707]: I0320 09:10:38.328285 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5917ff52-41bc-4b19-9727-75016262c3f9","Type":"ContainerStarted","Data":"459d95741b6aefd600a4ba71c93b8dd85f5f3fcfc025cedf0ab3d5e2bd1de0d9"} Mar 20 09:10:38.353845 master-0 kubenswrapper[18707]: I0320 09:10:38.353792 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:10:38.378830 master-0 kubenswrapper[18707]: I0320 09:10:38.378726 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 09:10:38.492966 master-0 kubenswrapper[18707]: I0320 09:10:38.492812 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.492778131 podStartE2EDuration="2.492778131s" podCreationTimestamp="2026-03-20 09:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:10:38.470788 +0000 UTC m=+1783.626968396" watchObservedRunningTime="2026-03-20 09:10:38.492778131 +0000 UTC m=+1783.648958487" Mar 20 09:10:38.528157 master-0 kubenswrapper[18707]: I0320 09:10:38.522273 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837bb015-07a4-40ec-8588-7597a150a99c-logs\") pod \"837bb015-07a4-40ec-8588-7597a150a99c\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " Mar 20 09:10:38.528157 master-0 kubenswrapper[18707]: I0320 09:10:38.522378 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-combined-ca-bundle\") pod \"837bb015-07a4-40ec-8588-7597a150a99c\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " Mar 20 09:10:38.528157 master-0 kubenswrapper[18707]: I0320 09:10:38.522521 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-internal-tls-certs\") pod \"837bb015-07a4-40ec-8588-7597a150a99c\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " Mar 20 09:10:38.528157 master-0 kubenswrapper[18707]: I0320 09:10:38.522685 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqmcs\" (UniqueName: \"kubernetes.io/projected/837bb015-07a4-40ec-8588-7597a150a99c-kube-api-access-lqmcs\") pod \"837bb015-07a4-40ec-8588-7597a150a99c\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " Mar 20 09:10:38.528157 master-0 kubenswrapper[18707]: I0320 09:10:38.522924 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-config-data\") pod \"837bb015-07a4-40ec-8588-7597a150a99c\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " Mar 20 09:10:38.528157 master-0 kubenswrapper[18707]: I0320 09:10:38.522967 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-public-tls-certs\") pod \"837bb015-07a4-40ec-8588-7597a150a99c\" (UID: \"837bb015-07a4-40ec-8588-7597a150a99c\") " Mar 20 09:10:38.528157 master-0 kubenswrapper[18707]: I0320 09:10:38.525426 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837bb015-07a4-40ec-8588-7597a150a99c-logs" (OuterVolumeSpecName: "logs") pod "837bb015-07a4-40ec-8588-7597a150a99c" (UID: 
"837bb015-07a4-40ec-8588-7597a150a99c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:10:38.528157 master-0 kubenswrapper[18707]: I0320 09:10:38.526018 18707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837bb015-07a4-40ec-8588-7597a150a99c-logs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:38.549957 master-0 kubenswrapper[18707]: I0320 09:10:38.549890 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837bb015-07a4-40ec-8588-7597a150a99c-kube-api-access-lqmcs" (OuterVolumeSpecName: "kube-api-access-lqmcs") pod "837bb015-07a4-40ec-8588-7597a150a99c" (UID: "837bb015-07a4-40ec-8588-7597a150a99c"). InnerVolumeSpecName "kube-api-access-lqmcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:10:38.570707 master-0 kubenswrapper[18707]: I0320 09:10:38.570544 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "837bb015-07a4-40ec-8588-7597a150a99c" (UID: "837bb015-07a4-40ec-8588-7597a150a99c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:10:38.592209 master-0 kubenswrapper[18707]: I0320 09:10:38.590652 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-config-data" (OuterVolumeSpecName: "config-data") pod "837bb015-07a4-40ec-8588-7597a150a99c" (UID: "837bb015-07a4-40ec-8588-7597a150a99c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:10:38.607010 master-0 kubenswrapper[18707]: I0320 09:10:38.606457 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "837bb015-07a4-40ec-8588-7597a150a99c" (UID: "837bb015-07a4-40ec-8588-7597a150a99c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:10:38.613969 master-0 kubenswrapper[18707]: I0320 09:10:38.613902 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "837bb015-07a4-40ec-8588-7597a150a99c" (UID: "837bb015-07a4-40ec-8588-7597a150a99c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:10:38.631446 master-0 kubenswrapper[18707]: I0320 09:10:38.630515 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:38.631446 master-0 kubenswrapper[18707]: I0320 09:10:38.630621 18707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:38.631446 master-0 kubenswrapper[18707]: I0320 09:10:38.630638 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqmcs\" (UniqueName: \"kubernetes.io/projected/837bb015-07a4-40ec-8588-7597a150a99c-kube-api-access-lqmcs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:38.631446 master-0 kubenswrapper[18707]: I0320 09:10:38.630654 18707 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-config-data\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:38.631446 master-0 kubenswrapper[18707]: I0320 09:10:38.630666 18707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/837bb015-07a4-40ec-8588-7597a150a99c-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 20 09:10:39.118601 master-0 kubenswrapper[18707]: I0320 09:10:39.118501 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3578d407-c1f9-45e5-a8b3-bb6b7ed5071a" path="/var/lib/kubelet/pods/3578d407-c1f9-45e5-a8b3-bb6b7ed5071a/volumes" Mar 20 09:10:39.347504 master-0 kubenswrapper[18707]: I0320 09:10:39.345917 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:10:39.347504 master-0 kubenswrapper[18707]: I0320 09:10:39.345987 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"837bb015-07a4-40ec-8588-7597a150a99c","Type":"ContainerDied","Data":"f3274786ae8cead644eb8e804dedabad4dce33ac22de428763aa6e0c097a86d3"} Mar 20 09:10:39.347504 master-0 kubenswrapper[18707]: I0320 09:10:39.346068 18707 scope.go:117] "RemoveContainer" containerID="257c45d3bbbe846aac4e51856e841a917ef6940a7502335cdaf002cba0995846" Mar 20 09:10:39.411830 master-0 kubenswrapper[18707]: I0320 09:10:39.408444 18707 scope.go:117] "RemoveContainer" containerID="8c1cf568eb0fe34a3686363391b9338b6464092bbbe03c5eb1a757730002aed3" Mar 20 09:10:39.575099 master-0 kubenswrapper[18707]: I0320 09:10:39.575036 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 09:10:40.365414 master-0 kubenswrapper[18707]: I0320 09:10:40.365218 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"afda99a5-2da6-4e38-a7f3-ff85ca74752c","Type":"ContainerStarted","Data":"5db5b5baaff20a69fca170b2d816a3b2beadcbf01e3519b6650059ddf88c21b6"} Mar 20 09:10:40.365414 master-0 kubenswrapper[18707]: I0320 09:10:40.365286 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afda99a5-2da6-4e38-a7f3-ff85ca74752c","Type":"ContainerStarted","Data":"e2be9f92ff3dc4ffecb6d15de2e6891150a4a19a58f4edad9cfdd94ecfcf5c3f"} Mar 20 09:10:41.327044 master-0 kubenswrapper[18707]: I0320 09:10:41.326997 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:10:41.386358 master-0 kubenswrapper[18707]: I0320 09:10:41.386299 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afda99a5-2da6-4e38-a7f3-ff85ca74752c","Type":"ContainerStarted","Data":"6c9078cc396717146598fb240da9f956b21591d968f13f31d1b4464e0f15559d"} Mar 20 09:10:41.445494 master-0 kubenswrapper[18707]: I0320 09:10:41.445449 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:10:41.624714 master-0 kubenswrapper[18707]: I0320 09:10:41.624538 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.624515844 podStartE2EDuration="4.624515844s" podCreationTimestamp="2026-03-20 09:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:10:41.599887489 +0000 UTC m=+1786.756067855" watchObservedRunningTime="2026-03-20 09:10:41.624515844 +0000 UTC m=+1786.780696200" Mar 20 09:10:41.658501 master-0 kubenswrapper[18707]: I0320 09:10:41.658439 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 09:10:41.659017 master-0 kubenswrapper[18707]: E0320 09:10:41.658991 18707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="837bb015-07a4-40ec-8588-7597a150a99c" containerName="nova-api-log" Mar 20 09:10:41.659017 master-0 kubenswrapper[18707]: I0320 09:10:41.659011 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="837bb015-07a4-40ec-8588-7597a150a99c" containerName="nova-api-log" Mar 20 09:10:41.659145 master-0 kubenswrapper[18707]: E0320 09:10:41.659038 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837bb015-07a4-40ec-8588-7597a150a99c" containerName="nova-api-api" Mar 20 09:10:41.659145 master-0 kubenswrapper[18707]: I0320 09:10:41.659044 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="837bb015-07a4-40ec-8588-7597a150a99c" containerName="nova-api-api" Mar 20 09:10:41.659356 master-0 kubenswrapper[18707]: I0320 09:10:41.659331 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="837bb015-07a4-40ec-8588-7597a150a99c" containerName="nova-api-log" Mar 20 09:10:41.659418 master-0 kubenswrapper[18707]: I0320 09:10:41.659366 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="837bb015-07a4-40ec-8588-7597a150a99c" containerName="nova-api-api" Mar 20 09:10:41.660765 master-0 kubenswrapper[18707]: I0320 09:10:41.660737 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:10:41.664463 master-0 kubenswrapper[18707]: I0320 09:10:41.664420 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 09:10:41.664700 master-0 kubenswrapper[18707]: I0320 09:10:41.664645 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 09:10:41.664753 master-0 kubenswrapper[18707]: I0320 09:10:41.664726 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 09:10:41.738216 master-0 kubenswrapper[18707]: I0320 09:10:41.736941 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/226fb8c1-8582-4aef-91b7-fcdb77486ecd-config-data\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.738216 master-0 kubenswrapper[18707]: I0320 09:10:41.737109 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/226fb8c1-8582-4aef-91b7-fcdb77486ecd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.738216 master-0 kubenswrapper[18707]: I0320 09:10:41.737285 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/226fb8c1-8582-4aef-91b7-fcdb77486ecd-public-tls-certs\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.738216 master-0 kubenswrapper[18707]: I0320 09:10:41.737411 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpw89\" (UniqueName: 
\"kubernetes.io/projected/226fb8c1-8582-4aef-91b7-fcdb77486ecd-kube-api-access-fpw89\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.738216 master-0 kubenswrapper[18707]: I0320 09:10:41.737476 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226fb8c1-8582-4aef-91b7-fcdb77486ecd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.738216 master-0 kubenswrapper[18707]: I0320 09:10:41.737543 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/226fb8c1-8582-4aef-91b7-fcdb77486ecd-logs\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.740062 master-0 kubenswrapper[18707]: I0320 09:10:41.739824 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:10:41.840157 master-0 kubenswrapper[18707]: I0320 09:10:41.840082 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/226fb8c1-8582-4aef-91b7-fcdb77486ecd-config-data\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.840423 master-0 kubenswrapper[18707]: I0320 09:10:41.840210 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/226fb8c1-8582-4aef-91b7-fcdb77486ecd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.840423 master-0 kubenswrapper[18707]: I0320 09:10:41.840272 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/226fb8c1-8582-4aef-91b7-fcdb77486ecd-public-tls-certs\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.840542 master-0 kubenswrapper[18707]: I0320 09:10:41.840483 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpw89\" (UniqueName: \"kubernetes.io/projected/226fb8c1-8582-4aef-91b7-fcdb77486ecd-kube-api-access-fpw89\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.840688 master-0 kubenswrapper[18707]: I0320 09:10:41.840658 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226fb8c1-8582-4aef-91b7-fcdb77486ecd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.840819 master-0 kubenswrapper[18707]: I0320 09:10:41.840784 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/226fb8c1-8582-4aef-91b7-fcdb77486ecd-logs\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.841301 master-0 kubenswrapper[18707]: I0320 09:10:41.841268 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/226fb8c1-8582-4aef-91b7-fcdb77486ecd-logs\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.843770 master-0 kubenswrapper[18707]: I0320 09:10:41.843691 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/226fb8c1-8582-4aef-91b7-fcdb77486ecd-public-tls-certs\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " 
pod="openstack/nova-api-0" Mar 20 09:10:41.844577 master-0 kubenswrapper[18707]: I0320 09:10:41.844537 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/226fb8c1-8582-4aef-91b7-fcdb77486ecd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.844660 master-0 kubenswrapper[18707]: I0320 09:10:41.844617 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226fb8c1-8582-4aef-91b7-fcdb77486ecd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.845881 master-0 kubenswrapper[18707]: I0320 09:10:41.845840 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/226fb8c1-8582-4aef-91b7-fcdb77486ecd-config-data\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:41.992238 master-0 kubenswrapper[18707]: I0320 09:10:41.992073 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 09:10:42.445716 master-0 kubenswrapper[18707]: I0320 09:10:42.445281 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpw89\" (UniqueName: \"kubernetes.io/projected/226fb8c1-8582-4aef-91b7-fcdb77486ecd-kube-api-access-fpw89\") pod \"nova-api-0\" (UID: \"226fb8c1-8582-4aef-91b7-fcdb77486ecd\") " pod="openstack/nova-api-0" Mar 20 09:10:42.588368 master-0 kubenswrapper[18707]: I0320 09:10:42.584954 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 09:10:43.119304 master-0 kubenswrapper[18707]: I0320 09:10:43.118274 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="837bb015-07a4-40ec-8588-7597a150a99c" path="/var/lib/kubelet/pods/837bb015-07a4-40ec-8588-7597a150a99c/volumes" Mar 20 09:10:43.138308 master-0 kubenswrapper[18707]: I0320 09:10:43.138256 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 09:10:43.442627 master-0 kubenswrapper[18707]: I0320 09:10:43.442556 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"226fb8c1-8582-4aef-91b7-fcdb77486ecd","Type":"ContainerStarted","Data":"60848ac33938dc8b3635021ea3c5035b469c16d60c12f3e3ee8de30e68a0811d"} Mar 20 09:10:44.455886 master-0 kubenswrapper[18707]: I0320 09:10:44.455820 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"226fb8c1-8582-4aef-91b7-fcdb77486ecd","Type":"ContainerStarted","Data":"de6c51b6087130bab89b4ce1caa7e55de6cf60ec860e478554a8929ce50ef050"} Mar 20 09:10:44.455886 master-0 kubenswrapper[18707]: I0320 09:10:44.455891 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"226fb8c1-8582-4aef-91b7-fcdb77486ecd","Type":"ContainerStarted","Data":"09d6727ad8cd12737c9ec1a7af26c44b0de21d45ab439f6ba485ccbf5756023a"} Mar 20 09:10:44.603732 master-0 kubenswrapper[18707]: I0320 09:10:44.603543 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.603516764 podStartE2EDuration="3.603516764s" podCreationTimestamp="2026-03-20 09:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:10:44.594598792 +0000 UTC m=+1789.750779158" watchObservedRunningTime="2026-03-20 09:10:44.603516764 +0000 UTC m=+1789.759697110" Mar 20 09:10:46.992829 
master-0 kubenswrapper[18707]: I0320 09:10:46.992760 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 20 09:10:47.023272 master-0 kubenswrapper[18707]: I0320 09:10:47.023208 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 20 09:10:47.522344 master-0 kubenswrapper[18707]: I0320 09:10:47.522294 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 20 09:10:48.379795 master-0 kubenswrapper[18707]: I0320 09:10:48.379544 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 09:10:48.379795 master-0 kubenswrapper[18707]: I0320 09:10:48.379655 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 20 09:10:49.401588 master-0 kubenswrapper[18707]: I0320 09:10:49.401516 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="afda99a5-2da6-4e38-a7f3-ff85ca74752c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.9:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 09:10:49.402462 master-0 kubenswrapper[18707]: I0320 09:10:49.401612 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="afda99a5-2da6-4e38-a7f3-ff85ca74752c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.9:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 09:10:52.587198 master-0 kubenswrapper[18707]: I0320 09:10:52.587081 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 09:10:52.587198 master-0 kubenswrapper[18707]: I0320 09:10:52.587146 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 09:10:53.602547 master-0 kubenswrapper[18707]: I0320 09:10:53.602464 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="226fb8c1-8582-4aef-91b7-fcdb77486ecd" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.10:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 09:10:53.603140 master-0 kubenswrapper[18707]: I0320 09:10:53.602640 18707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="226fb8c1-8582-4aef-91b7-fcdb77486ecd" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.10:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 09:10:56.379982 master-0 kubenswrapper[18707]: I0320 09:10:56.379895 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 09:10:56.380791 master-0 kubenswrapper[18707]: I0320 09:10:56.380327 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 09:10:58.389259 master-0 kubenswrapper[18707]: I0320 09:10:58.388896 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 09:10:58.389259 master-0 kubenswrapper[18707]: I0320 09:10:58.389017 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 09:10:58.395629 master-0 kubenswrapper[18707]: I0320 09:10:58.395586 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 09:10:58.400492 master-0 kubenswrapper[18707]: I0320 09:10:58.400230 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 09:11:00.586288 master-0 kubenswrapper[18707]: I0320 09:11:00.586208 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 09:11:00.586288 master-0 kubenswrapper[18707]: I0320 09:11:00.586333 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 09:11:01.677102 master-0 kubenswrapper[18707]: I0320 09:11:01.677027 18707 scope.go:117] "RemoveContainer" containerID="a780de2adb2d97b70bb6d278df777093845f8f03cf60be9083754d2854bebe5b"
Mar 20 09:11:01.723199 master-0 kubenswrapper[18707]: I0320 09:11:01.723130 18707 scope.go:117] "RemoveContainer" containerID="6ad23b2a7e70eee9f718200f4669404d729789cbf65e935f164c88a96c881295"
Mar 20 09:11:01.836257 master-0 kubenswrapper[18707]: I0320 09:11:01.833583 18707 scope.go:117] "RemoveContainer" containerID="b1c27633889ab8ca2c9f4bf4862b62f21fdef8e75ce59e6db5bb48fa212a7c6e"
Mar 20 09:11:01.861641 master-0 kubenswrapper[18707]: I0320 09:11:01.861572 18707 scope.go:117] "RemoveContainer" containerID="20e1fb47458174b393c284c7a68a9ab748298f53e07a47be1792318c06503872"
Mar 20 09:11:02.592764 master-0 kubenswrapper[18707]: I0320 09:11:02.592696 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 09:11:02.600942 master-0 kubenswrapper[18707]: I0320 09:11:02.600867 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 09:11:02.602876 master-0 kubenswrapper[18707]: I0320 09:11:02.602831 18707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 09:11:02.682275 master-0 kubenswrapper[18707]: I0320 09:11:02.682212 18707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 09:12:02.046976 master-0 kubenswrapper[18707]: I0320 09:12:02.046852 18707 scope.go:117] "RemoveContainer" containerID="9db4bb7612f42841bc0ef5f709b1e5c9ca169b392a89638c8e9ae654d3b7eeb3"
Mar 20 09:12:02.077928 master-0 kubenswrapper[18707]: I0320 09:12:02.077851 18707 scope.go:117] "RemoveContainer" containerID="75255fec69697873217f7a9f3186e9a333342e288a617b7f33cf0db5508b6b0a"
Mar 20 09:12:02.106896 master-0 kubenswrapper[18707]: I0320 09:12:02.106843 18707 scope.go:117] "RemoveContainer" containerID="2683bb71581f4bac53a3e5cfa71a8a6fc713dacb8ebfb502a6d2471a9b2714f1"
Mar 20 09:12:02.148598 master-0 kubenswrapper[18707]: I0320 09:12:02.148561 18707 scope.go:117] "RemoveContainer" containerID="01276e1df3863b98d6c4a0c4fd58f71f07c78c8d74fb4c327558a43a8ec346e8"
Mar 20 09:13:02.271112 master-0 kubenswrapper[18707]: I0320 09:13:02.271031 18707 scope.go:117] "RemoveContainer" containerID="46f83be4dcad0c9ec2646ca57056a179e12d5be75f45fb0f9d9f3db320d4583c"
Mar 20 09:13:02.309519 master-0 kubenswrapper[18707]: I0320 09:13:02.309422 18707 scope.go:117] "RemoveContainer" containerID="61ce944c1e7a2281812d8c4fcb8f666a853916bb2320df168db63548e092bf3a"
Mar 20 09:13:02.349602 master-0 kubenswrapper[18707]: I0320 09:13:02.349547 18707 scope.go:117] "RemoveContainer" containerID="92aef4a1e7ac42f70a425304674348620192c74ae72ba60e80e61452607b81ec"
Mar 20 09:15:13.068110 master-0 kubenswrapper[18707]: I0320 09:15:13.068045 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-cslxs"]
Mar 20 09:15:13.081321 master-0 kubenswrapper[18707]: I0320 09:15:13.081261 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-ffa9-account-create-update-22wx5"]
Mar 20 09:15:13.117115 master-0 kubenswrapper[18707]: I0320 09:15:13.116942 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-cslxs"]
Mar 20 09:15:13.117358 master-0 kubenswrapper[18707]: I0320 09:15:13.117154 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-ffa9-account-create-update-22wx5"]
Mar 20 09:15:14.045691 master-0 kubenswrapper[18707]: I0320 09:15:14.045619 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-dqlc9"]
Mar 20 09:15:14.075646 master-0 kubenswrapper[18707]: I0320 09:15:14.075579 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-dqlc9"]
Mar 20 09:15:14.095370 master-0 kubenswrapper[18707]: I0320 09:15:14.094715 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ec41-account-create-update-wlbg9"]
Mar 20 09:15:14.107648 master-0 kubenswrapper[18707]: I0320 09:15:14.107577 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ec41-account-create-update-wlbg9"]
Mar 20 09:15:15.141800 master-0 kubenswrapper[18707]: I0320 09:15:15.141640 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a482121-8e91-4338-94d1-216c258dda1f" path="/var/lib/kubelet/pods/2a482121-8e91-4338-94d1-216c258dda1f/volumes"
Mar 20 09:15:15.148535 master-0 kubenswrapper[18707]: I0320 09:15:15.148477 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51e675f7-a40f-4fc7-8e3f-227d95698d5d" path="/var/lib/kubelet/pods/51e675f7-a40f-4fc7-8e3f-227d95698d5d/volumes"
Mar 20 09:15:15.159825 master-0 kubenswrapper[18707]: I0320 09:15:15.159755 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1badf9b-3bad-4267-aecb-cc8bf311cf07" path="/var/lib/kubelet/pods/c1badf9b-3bad-4267-aecb-cc8bf311cf07/volumes"
Mar 20 09:15:15.162717 master-0 kubenswrapper[18707]: I0320 09:15:15.162674 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d63fe82e-6808-4fe3-a55c-681f01ea78da" path="/var/lib/kubelet/pods/d63fe82e-6808-4fe3-a55c-681f01ea78da/volumes"
Mar 20 09:15:16.087841 master-0 kubenswrapper[18707]: I0320 09:15:16.087762 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-l87pv"]
Mar 20 09:15:16.109603 master-0 kubenswrapper[18707]: I0320 09:15:16.109493 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-fb00-account-create-update-9vcp7"]
Mar 20 09:15:16.120531 master-0 kubenswrapper[18707]: I0320 09:15:16.120440 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-l87pv"]
Mar 20 09:15:16.133368 master-0 kubenswrapper[18707]: I0320 09:15:16.133302 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-fb00-account-create-update-9vcp7"]
Mar 20 09:15:17.117305 master-0 kubenswrapper[18707]: I0320 09:15:17.117167 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cc50577-daf4-4a55-b1aa-2ae0b52e04d3" path="/var/lib/kubelet/pods/1cc50577-daf4-4a55-b1aa-2ae0b52e04d3/volumes"
Mar 20 09:15:17.119083 master-0 kubenswrapper[18707]: I0320 09:15:17.118788 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a" path="/var/lib/kubelet/pods/bd5e99e8-7c13-4881-9edb-fc3ed2ab0f0a/volumes"
Mar 20 09:15:45.076046 master-0 kubenswrapper[18707]: I0320 09:15:45.075969 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ndh4m"]
Mar 20 09:15:45.093299 master-0 kubenswrapper[18707]: I0320 09:15:45.093220 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ndh4m"]
Mar 20 09:15:45.114966 master-0 kubenswrapper[18707]: I0320 09:15:45.114898 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8751adb-f918-4ff0-b087-5ced3219f41a" path="/var/lib/kubelet/pods/c8751adb-f918-4ff0-b087-5ced3219f41a/volumes"
Mar 20 09:15:47.084562 master-0 kubenswrapper[18707]: I0320 09:15:47.084483 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-662t6"]
Mar 20 09:15:47.110871 master-0 kubenswrapper[18707]: I0320 09:15:47.110822 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-662t6"]
Mar 20 09:15:49.132439 master-0 kubenswrapper[18707]: I0320 09:15:49.131812 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd5f8067-9874-4ff3-a9fb-a3251cc3622d" path="/var/lib/kubelet/pods/dd5f8067-9874-4ff3-a9fb-a3251cc3622d/volumes"
Mar 20 09:15:56.049219 master-0 kubenswrapper[18707]: I0320 09:15:56.046253 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3e52-account-create-update-qlzvf"]
Mar 20 09:15:56.062211 master-0 kubenswrapper[18707]: I0320 09:15:56.062032 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3e52-account-create-update-qlzvf"]
Mar 20 09:15:57.091587 master-0 kubenswrapper[18707]: I0320 09:15:57.089646 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-bbvbz"]
Mar 20 09:15:57.112164 master-0 kubenswrapper[18707]: I0320 09:15:57.112097 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a238c22f-d559-4c69-aeb0-c16f67befc9e" path="/var/lib/kubelet/pods/a238c22f-d559-4c69-aeb0-c16f67befc9e/volumes"
Mar 20 09:15:57.112905 master-0 kubenswrapper[18707]: I0320 09:15:57.112860 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3574-account-create-update-mw5s6"]
Mar 20 09:15:57.113926 master-0 kubenswrapper[18707]: I0320 09:15:57.113902 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-vtvmv"]
Mar 20 09:15:57.123854 master-0 kubenswrapper[18707]: I0320 09:15:57.123790 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-bbvbz"]
Mar 20 09:15:57.133428 master-0 kubenswrapper[18707]: I0320 09:15:57.133377 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3574-account-create-update-mw5s6"]
Mar 20 09:15:57.142498 master-0 kubenswrapper[18707]: I0320 09:15:57.142416 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-vtvmv"]
Mar 20 09:15:59.109412 master-0 kubenswrapper[18707]: I0320 09:15:59.109337 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd5b27d-6c79-43fb-aa80-d104a32c95c1" path="/var/lib/kubelet/pods/3fd5b27d-6c79-43fb-aa80-d104a32c95c1/volumes"
Mar 20 09:15:59.110974 master-0 kubenswrapper[18707]: I0320 09:15:59.110891 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52457bb1-e081-4589-bbe2-3aaadfb92b31" path="/var/lib/kubelet/pods/52457bb1-e081-4589-bbe2-3aaadfb92b31/volumes"
Mar 20 09:15:59.112590 master-0 kubenswrapper[18707]: I0320 09:15:59.112526 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c87d5ca7-5cc3-4027-8725-0c42afdef9e9" path="/var/lib/kubelet/pods/c87d5ca7-5cc3-4027-8725-0c42afdef9e9/volumes"
Mar 20 09:16:02.550014 master-0 kubenswrapper[18707]: I0320 09:16:02.549103 18707 scope.go:117] "RemoveContainer" containerID="ab43a5bdeb2dcfd47f72404e9a05ff319ff68d1b2595cae8c0932abb9637f51c"
Mar 20 09:16:02.573610 master-0 kubenswrapper[18707]: I0320 09:16:02.573556 18707 scope.go:117] "RemoveContainer" containerID="4d9f7b2cd1e6116f3e6dd74bac6715ca5e47aa39a73a16abdd32a9c48b8d1101"
Mar 20 09:16:02.595446 master-0 kubenswrapper[18707]: I0320 09:16:02.595406 18707 scope.go:117] "RemoveContainer" containerID="7bdb6c9c4298089409f05a4eee45f5ad939101002f891a0863f338d29754115c"
Mar 20 09:16:02.628668 master-0 kubenswrapper[18707]: I0320 09:16:02.628561 18707 scope.go:117] "RemoveContainer" containerID="4d88c8c581a9c364031cb4d4336c34f4cc762f582c79aac4aa41cc2b4dca5dee"
Mar 20 09:16:02.655560 master-0 kubenswrapper[18707]: I0320 09:16:02.654594 18707 scope.go:117] "RemoveContainer" containerID="53c19949948cfde5dd778064ab56e6214cf9de8fac4728709b7e8b6eee84d38a"
Mar 20 09:16:02.686567 master-0 kubenswrapper[18707]: I0320 09:16:02.686512 18707 scope.go:117] "RemoveContainer" containerID="bd79e7edae84856717132d6459c141670b40f5a54976050bc3ff49b70a15251c"
Mar 20 09:16:02.715154 master-0 kubenswrapper[18707]: I0320 09:16:02.715083 18707 scope.go:117] "RemoveContainer" containerID="36a774bdc7bdad7f2095a1a3885ebe4a35dffffc0d6b0d33093bde6063a71735"
Mar 20 09:16:02.743326 master-0 kubenswrapper[18707]: I0320 09:16:02.743283 18707 scope.go:117] "RemoveContainer" containerID="ca2b9e4f1c759dd4223cd6cee17729e7fad04b7f2a0a483eab4b178b5ea8e69e"
Mar 20 09:16:02.766711 master-0 kubenswrapper[18707]: I0320 09:16:02.766665 18707 scope.go:117] "RemoveContainer" containerID="43eee35e759a3c13cec9ebfdf2378e0939f277f66cac06ecb8cc9f317e767305"
Mar 20 09:16:02.791817 master-0 kubenswrapper[18707]: I0320 09:16:02.791735 18707 scope.go:117] "RemoveContainer" containerID="7f665346c34b1ed859296c42869b694dd76f834a391e6df93c790ecda567f987"
Mar 20 09:16:02.812993 master-0 kubenswrapper[18707]: I0320 09:16:02.812913 18707 scope.go:117] "RemoveContainer" containerID="d99e9e87acc39b2aae4283d979f6e27f6734207a08061f99f0210ba24143eb5f"
Mar 20 09:16:02.836609 master-0 kubenswrapper[18707]: I0320 09:16:02.836508 18707 scope.go:117] "RemoveContainer" containerID="bb9e82a7de0e5a54084f101aa506a942c723b618c5aa6fddf0bf6079de9023d7"
Mar 20 09:16:02.867214 master-0 kubenswrapper[18707]: I0320 09:16:02.867166 18707 scope.go:117] "RemoveContainer" containerID="cd55f7a32d0ec9054689727e6da502d46386a9de188dea7b2368cf48e3cd6ac4"
Mar 20 09:16:06.058327 master-0 kubenswrapper[18707]: I0320 09:16:06.058253 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jg5hc"]
Mar 20 09:16:06.072294 master-0 kubenswrapper[18707]: I0320 09:16:06.072232 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jg5hc"]
Mar 20 09:16:07.110685 master-0 kubenswrapper[18707]: I0320 09:16:07.110626 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23011402-5090-4dcc-a36e-f9dcb3db8946" path="/var/lib/kubelet/pods/23011402-5090-4dcc-a36e-f9dcb3db8946/volumes"
Mar 20 09:16:40.056544 master-0 kubenswrapper[18707]: I0320 09:16:40.056458 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-8jqss"]
Mar 20 09:16:40.068360 master-0 kubenswrapper[18707]: I0320 09:16:40.067447 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-8jqss"]
Mar 20 09:16:41.110462 master-0 kubenswrapper[18707]: I0320 09:16:41.110367 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01502368-38fe-40cc-9325-a7b83996fea1" path="/var/lib/kubelet/pods/01502368-38fe-40cc-9325-a7b83996fea1/volumes"
Mar 20 09:16:51.838284 master-0 kubenswrapper[18707]: I0320 09:16:51.837511 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-x7bjf"]
Mar 20 09:16:52.128854 master-0 kubenswrapper[18707]: I0320 09:16:52.128675 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-q6j55"]
Mar 20 09:16:52.420135 master-0 kubenswrapper[18707]: I0320 09:16:52.419945 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-x7bjf"]
Mar 20 09:16:52.438464 master-0 kubenswrapper[18707]: I0320 09:16:52.435375 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c920a-db-sync-5sx9b"]
Mar 20 09:16:52.447251 master-0 kubenswrapper[18707]: I0320 09:16:52.447158 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-q6j55"]
Mar 20 09:16:52.475482 master-0 kubenswrapper[18707]: I0320 09:16:52.475416 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c920a-db-sync-5sx9b"]
Mar 20 09:16:53.112480 master-0 kubenswrapper[18707]: I0320 09:16:53.112356 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a" path="/var/lib/kubelet/pods/7f0eaaa4-197c-49d2-a7a7-1d8c2edee22a/volumes"
Mar 20 09:16:53.117711 master-0 kubenswrapper[18707]: I0320 09:16:53.117663 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d739ce18-cf6c-4eeb-a609-7ab1acac00d2" path="/var/lib/kubelet/pods/d739ce18-cf6c-4eeb-a609-7ab1acac00d2/volumes"
Mar 20 09:16:53.118986 master-0 kubenswrapper[18707]: I0320 09:16:53.118940 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6" path="/var/lib/kubelet/pods/eb57b0ac-62b8-4f9e-bb35-4b2097bafcc6/volumes"
Mar 20 09:17:03.195136 master-0 kubenswrapper[18707]: I0320 09:17:03.195069 18707 scope.go:117] "RemoveContainer" containerID="8abfaa08ff527dda1a412f16cc815c17755a014e017c686cc1bf1fc69b773326"
Mar 20 09:17:03.217534 master-0 kubenswrapper[18707]: I0320 09:17:03.217503 18707 scope.go:117] "RemoveContainer" containerID="caf385a86308743d6ea997839039856b2382a85322e22d2510752a140219e721"
Mar 20 09:17:03.318310 master-0 kubenswrapper[18707]: I0320 09:17:03.318259 18707 scope.go:117] "RemoveContainer" containerID="f8f53fce1724ee6b0341e8dfcfa88d2d2109f71ae5fa72bfb98e12e6f054c67a"
Mar 20 09:17:03.404924 master-0 kubenswrapper[18707]: I0320 09:17:03.404872 18707 scope.go:117] "RemoveContainer" containerID="a07bd35b6065a17b0625937319043507f44d30ff119dcba2e53672991382ccf2"
Mar 20 09:17:03.462441 master-0 kubenswrapper[18707]: I0320 09:17:03.462392 18707 scope.go:117] "RemoveContainer" containerID="9c1abde5f519d612e85bab52eac717cd2c5a4a1ca596cc149088edc0fbbbd35d"
Mar 20 09:17:15.182793 master-0 kubenswrapper[18707]: I0320 09:17:15.182711 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/edpm-b-provisionserver-checksum-discovery-mf2jz"]
Mar 20 09:17:15.318881 master-0 kubenswrapper[18707]: I0320 09:17:15.318809 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/edpm-b-provisionserver-checksum-discovery-mf2jz"]
Mar 20 09:17:16.045208 master-0 kubenswrapper[18707]: I0320 09:17:16.044292 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/edpm-a-provisionserver-checksum-discovery-49xmz"]
Mar 20 09:17:16.061217 master-0 kubenswrapper[18707]: I0320 09:17:16.057350 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/edpm-a-provisionserver-checksum-discovery-49xmz"]
Mar 20 09:17:17.110888 master-0 kubenswrapper[18707]: I0320 09:17:17.110810 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d44d4b7-ce01-4aa4-9155-8338ad17b404" path="/var/lib/kubelet/pods/4d44d4b7-ce01-4aa4-9155-8338ad17b404/volumes"
Mar 20 09:17:17.112966 master-0 kubenswrapper[18707]: I0320 09:17:17.112928 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bae03209-559e-4828-bcb6-70eb73ff61dc" path="/var/lib/kubelet/pods/bae03209-559e-4828-bcb6-70eb73ff61dc/volumes"
Mar 20 09:17:48.204702 master-0 kubenswrapper[18707]: I0320 09:17:48.204633 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"]
Mar 20 09:17:48.213387 master-0 kubenswrapper[18707]: I0320 09:17:48.213331 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.218217 master-0 kubenswrapper[18707]: I0320 09:17:48.216754 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"]
Mar 20 09:17:48.219673 master-0 kubenswrapper[18707]: I0320 09:17:48.219395 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:48.220450 master-0 kubenswrapper[18707]: I0320 09:17:48.220418 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 09:17:48.220960 master-0 kubenswrapper[18707]: I0320 09:17:48.220935 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a"
Mar 20 09:17:48.222144 master-0 kubenswrapper[18707]: I0320 09:17:48.222120 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 09:17:48.223232 master-0 kubenswrapper[18707]: I0320 09:17:48.223210 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b"
Mar 20 09:17:48.229176 master-0 kubenswrapper[18707]: I0320 09:17:48.229114 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"]
Mar 20 09:17:48.269394 master-0 kubenswrapper[18707]: I0320 09:17:48.269349 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"]
Mar 20 09:17:48.410410 master-0 kubenswrapper[18707]: I0320 09:17:48.410304 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-inventory\") pod \"bootstrap-dataplane-step-1-edpm-b-k2lgn\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:48.410665 master-0 kubenswrapper[18707]: I0320 09:17:48.410442 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-ssh-key-edpm-b\") pod \"bootstrap-dataplane-step-1-edpm-b-k2lgn\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:48.410665 master-0 kubenswrapper[18707]: I0320 09:17:48.410496 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plq2v\" (UniqueName: \"kubernetes.io/projected/482c1c34-d034-484b-9777-e44d27b06f2a-kube-api-access-plq2v\") pod \"bootstrap-dataplane-step-1-edpm-b-k2lgn\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:48.410665 master-0 kubenswrapper[18707]: I0320 09:17:48.410527 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-bootstrap-combined-ca-bundle\") pod \"bootstrap-dataplane-step-1-edpm-b-k2lgn\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:48.410864 master-0 kubenswrapper[18707]: I0320 09:17:48.410754 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-ssh-key-edpm-a\") pod \"bootstrap-dataplane-step-1-edpm-a-54cs7\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.410901 master-0 kubenswrapper[18707]: I0320 09:17:48.410890 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-inventory\") pod \"bootstrap-dataplane-step-1-edpm-a-54cs7\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.411021 master-0 kubenswrapper[18707]: I0320 09:17:48.410967 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-bootstrap-combined-ca-bundle\") pod \"bootstrap-dataplane-step-1-edpm-a-54cs7\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.411081 master-0 kubenswrapper[18707]: I0320 09:17:48.411046 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7dx5\" (UniqueName: \"kubernetes.io/projected/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-kube-api-access-m7dx5\") pod \"bootstrap-dataplane-step-1-edpm-a-54cs7\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.513408 master-0 kubenswrapper[18707]: I0320 09:17:48.513284 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-inventory\") pod \"bootstrap-dataplane-step-1-edpm-b-k2lgn\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:48.513709 master-0 kubenswrapper[18707]: I0320 09:17:48.513694 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-ssh-key-edpm-b\") pod \"bootstrap-dataplane-step-1-edpm-b-k2lgn\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:48.513852 master-0 kubenswrapper[18707]: I0320 09:17:48.513834 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plq2v\" (UniqueName: \"kubernetes.io/projected/482c1c34-d034-484b-9777-e44d27b06f2a-kube-api-access-plq2v\") pod \"bootstrap-dataplane-step-1-edpm-b-k2lgn\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:48.513957 master-0 kubenswrapper[18707]: I0320 09:17:48.513943 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-bootstrap-combined-ca-bundle\") pod \"bootstrap-dataplane-step-1-edpm-b-k2lgn\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:48.514073 master-0 kubenswrapper[18707]: I0320 09:17:48.514061 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-ssh-key-edpm-a\") pod \"bootstrap-dataplane-step-1-edpm-a-54cs7\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.514175 master-0 kubenswrapper[18707]: I0320 09:17:48.514162 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-inventory\") pod \"bootstrap-dataplane-step-1-edpm-a-54cs7\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.514287 master-0 kubenswrapper[18707]: I0320 09:17:48.514272 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-bootstrap-combined-ca-bundle\") pod \"bootstrap-dataplane-step-1-edpm-a-54cs7\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.514445 master-0 kubenswrapper[18707]: I0320 09:17:48.514427 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7dx5\" (UniqueName: \"kubernetes.io/projected/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-kube-api-access-m7dx5\") pod \"bootstrap-dataplane-step-1-edpm-a-54cs7\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.519882 master-0 kubenswrapper[18707]: I0320 09:17:48.519791 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-ssh-key-edpm-a\") pod \"bootstrap-dataplane-step-1-edpm-a-54cs7\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.523136 master-0 kubenswrapper[18707]: I0320 09:17:48.520320 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-inventory\") pod \"bootstrap-dataplane-step-1-edpm-b-k2lgn\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:48.523136 master-0 kubenswrapper[18707]: I0320 09:17:48.520440 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-bootstrap-combined-ca-bundle\") pod \"bootstrap-dataplane-step-1-edpm-b-k2lgn\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:48.523136 master-0 kubenswrapper[18707]: I0320 09:17:48.520515 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-ssh-key-edpm-b\") pod \"bootstrap-dataplane-step-1-edpm-b-k2lgn\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:48.524543 master-0 kubenswrapper[18707]: I0320 09:17:48.524489 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-inventory\") pod \"bootstrap-dataplane-step-1-edpm-a-54cs7\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.524841 master-0 kubenswrapper[18707]: I0320 09:17:48.524804 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-bootstrap-combined-ca-bundle\") pod \"bootstrap-dataplane-step-1-edpm-a-54cs7\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.530255 master-0 kubenswrapper[18707]: I0320 09:17:48.529943 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plq2v\" (UniqueName: \"kubernetes.io/projected/482c1c34-d034-484b-9777-e44d27b06f2a-kube-api-access-plq2v\") pod \"bootstrap-dataplane-step-1-edpm-b-k2lgn\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:48.533575 master-0 kubenswrapper[18707]: I0320 09:17:48.533538 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7dx5\" (UniqueName: \"kubernetes.io/projected/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-kube-api-access-m7dx5\") pod \"bootstrap-dataplane-step-1-edpm-a-54cs7\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.549735 master-0 kubenswrapper[18707]: I0320 09:17:48.549678 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"
Mar 20 09:17:48.567600 master-0 kubenswrapper[18707]: I0320 09:17:48.567522 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:17:49.291159 master-0 kubenswrapper[18707]: I0320 09:17:49.291101 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-dataplane-step-1-edpm-a-54cs7"]
Mar 20 09:17:49.295684 master-0 kubenswrapper[18707]: I0320 09:17:49.295618 18707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 09:17:49.368463 master-0 kubenswrapper[18707]: W0320 09:17:49.368403 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod482c1c34_d034_484b_9777_e44d27b06f2a.slice/crio-e5d7a65469282ccb45752d2c86c51997b8bb81cc5d1e185bdb39c45b71c012df WatchSource:0}: Error finding container e5d7a65469282ccb45752d2c86c51997b8bb81cc5d1e185bdb39c45b71c012df: Status 404 returned error can't find the container with id e5d7a65469282ccb45752d2c86c51997b8bb81cc5d1e185bdb39c45b71c012df
Mar 20 09:17:49.368663 master-0 kubenswrapper[18707]: I0320 09:17:49.368506 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"]
Mar 20 09:17:49.614604 master-0 kubenswrapper[18707]: I0320 09:17:49.614453 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7" event={"ID":"b3d27304-e0eb-4819-81f6-43d30cfd1c8b","Type":"ContainerStarted","Data":"60e17b9636361e3b3cfc2475fe868e6d7f23863dbea2d48b387619e4c794edcb"}
Mar 20 09:17:49.615824 master-0 kubenswrapper[18707]: I0320 09:17:49.615746 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn" event={"ID":"482c1c34-d034-484b-9777-e44d27b06f2a","Type":"ContainerStarted","Data":"e5d7a65469282ccb45752d2c86c51997b8bb81cc5d1e185bdb39c45b71c012df"}
Mar 20 09:18:00.581291 master-0 kubenswrapper[18707]: I0320 09:18:00.581237 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 09:18:01.755105 master-0 kubenswrapper[18707]: I0320 09:18:01.754991 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7" event={"ID":"b3d27304-e0eb-4819-81f6-43d30cfd1c8b","Type":"ContainerStarted","Data":"1b997db012872986fca20822e447d3181f2506696d40c301e27431e5ad70ef3c"}
Mar 20 09:18:01.759497 master-0 kubenswrapper[18707]: I0320 09:18:01.759435 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn" event={"ID":"482c1c34-d034-484b-9777-e44d27b06f2a","Type":"ContainerStarted","Data":"3a344d4232c0ace55fa821cb80faef4825f035faa9921b9c2123570779b22ce2"}
Mar 20 09:18:01.805369 master-0 kubenswrapper[18707]: I0320 09:18:01.799610 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7" podStartSLOduration=2.513570439 podStartE2EDuration="13.799581496s" podCreationTimestamp="2026-03-20 09:17:48 +0000 UTC" firstStartedPulling="2026-03-20 09:17:49.291896433 +0000 UTC m=+2214.448076789" lastFinishedPulling="2026-03-20 09:18:00.57790749 +0000 UTC m=+2225.734087846" observedRunningTime="2026-03-20 09:18:01.789889082 +0000 UTC m=+2226.946069448" watchObservedRunningTime="2026-03-20 09:18:01.799581496 +0000 UTC m=+2226.955761872"
Mar 20 09:18:01.851453 master-0 kubenswrapper[18707]: I0320 09:18:01.851320 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn" podStartSLOduration=2.454631914 podStartE2EDuration="13.851292657s" podCreationTimestamp="2026-03-20 09:17:48 +0000 UTC" firstStartedPulling="2026-03-20 09:17:49.370984309 +0000 UTC m=+2214.527164665" lastFinishedPulling="2026-03-20 09:18:00.767645052 +0000 UTC m=+2225.923825408" observedRunningTime="2026-03-20 09:18:01.848092347 +0000 UTC m=+2227.004272713" watchObservedRunningTime="2026-03-20 09:18:01.851292657 +0000 UTC m=+2227.007473023"
Mar 20 09:18:03.597639 master-0 kubenswrapper[18707]: I0320 09:18:03.597566 18707 scope.go:117] "RemoveContainer" containerID="ace777ac4645bf819154a3e7c1bf57c4aea08bbc52fca5b1847b2780c95798bc"
Mar 20 09:18:03.646341 master-0 kubenswrapper[18707]: I0320 09:18:03.646267 18707 scope.go:117] "RemoveContainer" containerID="ec656880600d950a9ab66c0cf523cb4ea3121c5a64decd1830a980cbbedb372e"
Mar 20 09:18:03.675129 master-0 kubenswrapper[18707]: I0320 09:18:03.675066 18707 scope.go:117] "RemoveContainer" containerID="7e7cea23c7318292592a7d106763c1c985071971334264d512fa9e9df258faa2"
Mar 20 09:18:03.709277 master-0 kubenswrapper[18707]: I0320 09:18:03.705960 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-11a2-account-create-update-gkhvg"]
Mar 20 09:18:03.719914 master-0 kubenswrapper[18707]: I0320 09:18:03.719882 18707 scope.go:117] "RemoveContainer" containerID="38f7b6ca8042ae518fe70e8a22137198f9493b653bb668a681f315aeade3b616"
Mar 20 09:18:03.733693 master-0 kubenswrapper[18707]: I0320 09:18:03.733603 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-11a2-account-create-update-gkhvg"]
Mar 20 09:18:04.179811 master-0 kubenswrapper[18707]: I0320 09:18:04.179735 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-af59-account-create-update-tk2tl"]
Mar 20 09:18:04.259560 master-0 kubenswrapper[18707]: I0320 09:18:04.249833 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-fh7jr"]
Mar 20 09:18:04.265606 master-0 kubenswrapper[18707]: I0320 09:18:04.265567 18707 kubelet.go:2437] "SyncLoop DELETE" source="api"
pods=["openstack/nova-api-db-create-hdkgz"] Mar 20 09:18:04.277954 master-0 kubenswrapper[18707]: I0320 09:18:04.277910 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-th2rs"] Mar 20 09:18:04.298262 master-0 kubenswrapper[18707]: I0320 09:18:04.293115 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3995-account-create-update-7pfxc"] Mar 20 09:18:04.305056 master-0 kubenswrapper[18707]: I0320 09:18:04.304969 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-af59-account-create-update-tk2tl"] Mar 20 09:18:04.396485 master-0 kubenswrapper[18707]: I0320 09:18:04.396420 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-fh7jr"] Mar 20 09:18:04.416689 master-0 kubenswrapper[18707]: I0320 09:18:04.416613 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-th2rs"] Mar 20 09:18:04.430286 master-0 kubenswrapper[18707]: I0320 09:18:04.430127 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3995-account-create-update-7pfxc"] Mar 20 09:18:04.442637 master-0 kubenswrapper[18707]: I0320 09:18:04.442565 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hdkgz"] Mar 20 09:18:05.120882 master-0 kubenswrapper[18707]: I0320 09:18:05.120807 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00e361f2-f12a-4354-b907-8575930aee6b" path="/var/lib/kubelet/pods/00e361f2-f12a-4354-b907-8575930aee6b/volumes" Mar 20 09:18:05.122058 master-0 kubenswrapper[18707]: I0320 09:18:05.121605 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a210b09-ea3d-4841-9cb3-f86a7f93985e" path="/var/lib/kubelet/pods/2a210b09-ea3d-4841-9cb3-f86a7f93985e/volumes" Mar 20 09:18:05.122464 master-0 kubenswrapper[18707]: I0320 09:18:05.122422 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="322c240c-fbd4-40ea-80bf-cc8bf2611394" path="/var/lib/kubelet/pods/322c240c-fbd4-40ea-80bf-cc8bf2611394/volumes" Mar 20 09:18:05.124151 master-0 kubenswrapper[18707]: I0320 09:18:05.124102 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a88fa32-c356-4dfe-abb9-d342391d52de" path="/var/lib/kubelet/pods/7a88fa32-c356-4dfe-abb9-d342391d52de/volumes" Mar 20 09:18:05.125406 master-0 kubenswrapper[18707]: I0320 09:18:05.125356 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c91fb20b-d4f7-4c69-b78c-458e9122718d" path="/var/lib/kubelet/pods/c91fb20b-d4f7-4c69-b78c-458e9122718d/volumes" Mar 20 09:18:05.126096 master-0 kubenswrapper[18707]: I0320 09:18:05.126053 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2bb4fbf-4157-4051-bfbb-0b0a011fbc49" path="/var/lib/kubelet/pods/e2bb4fbf-4157-4051-bfbb-0b0a011fbc49/volumes" Mar 20 09:18:43.062911 master-0 kubenswrapper[18707]: I0320 09:18:43.062839 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zzsdf"] Mar 20 09:18:43.081027 master-0 kubenswrapper[18707]: I0320 09:18:43.080960 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zzsdf"] Mar 20 09:18:43.133108 master-0 kubenswrapper[18707]: I0320 09:18:43.133055 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84cb3618-9229-4d84-9041-e632f4e9709e" path="/var/lib/kubelet/pods/84cb3618-9229-4d84-9041-e632f4e9709e/volumes" Mar 20 09:19:03.828928 master-0 kubenswrapper[18707]: I0320 09:19:03.828780 18707 scope.go:117] "RemoveContainer" containerID="75b6c5368bf2ea838b4e8fc4f30d310cf7658e15c2f592fe7eb4ed10519d5bde" Mar 20 09:19:03.853805 master-0 kubenswrapper[18707]: I0320 09:19:03.853676 18707 scope.go:117] "RemoveContainer" containerID="3781b977803f4f55d01defab452e08559aa9c24cf54e36e6cda6472c873010f9" Mar 20 09:19:03.885665 master-0 kubenswrapper[18707]: I0320 
09:19:03.885571 18707 scope.go:117] "RemoveContainer" containerID="0b0a32a280f6f9c928c440482a6960695f1264a73f0430e6bc1d433ea50c9e0d" Mar 20 09:19:03.907205 master-0 kubenswrapper[18707]: I0320 09:19:03.907134 18707 scope.go:117] "RemoveContainer" containerID="b1b3eb57ad082df997cd23e2063a58bf55379fb798326e0a02544c62fe451c2f" Mar 20 09:19:03.935694 master-0 kubenswrapper[18707]: I0320 09:19:03.935660 18707 scope.go:117] "RemoveContainer" containerID="3292113ebdf4562c8aa9de4a76cec8de9c8fa34dcde6904df11f1fe55a43489d" Mar 20 09:19:03.962835 master-0 kubenswrapper[18707]: I0320 09:19:03.962665 18707 scope.go:117] "RemoveContainer" containerID="196ab9ce3249d543464a89f1e5cd6ee5417006111bbdfa2c75f3f300afd473f8" Mar 20 09:19:04.005028 master-0 kubenswrapper[18707]: I0320 09:19:04.004957 18707 scope.go:117] "RemoveContainer" containerID="efadccc7335ef5746b901dfd94c7b880f33f291b80645babf0af8bc0f20da7a6" Mar 20 09:19:22.522112 master-0 kubenswrapper[18707]: I0320 09:19:22.522043 18707 trace.go:236] Trace[683228082]: "Calculate volume metrics of glance for pod openstack/glance-27086-default-internal-api-0" (20-Mar-2026 09:19:21.341) (total time: 1176ms): Mar 20 09:19:22.522112 master-0 kubenswrapper[18707]: Trace[683228082]: [1.176572912s] [1.176572912s] END Mar 20 09:19:31.967211 master-0 kubenswrapper[18707]: I0320 09:19:31.966241 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcpbl"] Mar 20 09:19:32.188740 master-0 kubenswrapper[18707]: I0320 09:19:32.188640 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-czjjp"] Mar 20 09:19:32.213991 master-0 kubenswrapper[18707]: I0320 09:19:32.210085 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-czjjp"] Mar 20 09:19:32.225687 master-0 kubenswrapper[18707]: I0320 09:19:32.225562 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bcpbl"] Mar 20 
09:19:33.112744 master-0 kubenswrapper[18707]: I0320 09:19:33.112671 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f93103b-ad0e-4911-a6a3-003be4b823f4" path="/var/lib/kubelet/pods/1f93103b-ad0e-4911-a6a3-003be4b823f4/volumes" Mar 20 09:19:33.113567 master-0 kubenswrapper[18707]: I0320 09:19:33.113517 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f8abfc-e320-410d-bb6b-b5055c9fc454" path="/var/lib/kubelet/pods/c5f8abfc-e320-410d-bb6b-b5055c9fc454/volumes" Mar 20 09:19:53.153846 master-0 kubenswrapper[18707]: I0320 09:19:53.153440 18707 generic.go:334] "Generic (PLEG): container finished" podID="b3d27304-e0eb-4819-81f6-43d30cfd1c8b" containerID="1b997db012872986fca20822e447d3181f2506696d40c301e27431e5ad70ef3c" exitCode=0 Mar 20 09:19:53.153846 master-0 kubenswrapper[18707]: I0320 09:19:53.153548 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7" event={"ID":"b3d27304-e0eb-4819-81f6-43d30cfd1c8b","Type":"ContainerDied","Data":"1b997db012872986fca20822e447d3181f2506696d40c301e27431e5ad70ef3c"} Mar 20 09:19:54.899272 master-0 kubenswrapper[18707]: I0320 09:19:54.886808 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7" Mar 20 09:19:55.025587 master-0 kubenswrapper[18707]: I0320 09:19:55.025479 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-inventory\") pod \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " Mar 20 09:19:55.025860 master-0 kubenswrapper[18707]: I0320 09:19:55.025662 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7dx5\" (UniqueName: \"kubernetes.io/projected/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-kube-api-access-m7dx5\") pod \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " Mar 20 09:19:55.025974 master-0 kubenswrapper[18707]: I0320 09:19:55.025930 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-bootstrap-combined-ca-bundle\") pod \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " Mar 20 09:19:55.026322 master-0 kubenswrapper[18707]: I0320 09:19:55.026296 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-ssh-key-edpm-a\") pod \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\" (UID: \"b3d27304-e0eb-4819-81f6-43d30cfd1c8b\") " Mar 20 09:19:55.032290 master-0 kubenswrapper[18707]: I0320 09:19:55.031238 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-kube-api-access-m7dx5" (OuterVolumeSpecName: "kube-api-access-m7dx5") pod "b3d27304-e0eb-4819-81f6-43d30cfd1c8b" (UID: "b3d27304-e0eb-4819-81f6-43d30cfd1c8b"). InnerVolumeSpecName "kube-api-access-m7dx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:19:55.032290 master-0 kubenswrapper[18707]: I0320 09:19:55.031304 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b3d27304-e0eb-4819-81f6-43d30cfd1c8b" (UID: "b3d27304-e0eb-4819-81f6-43d30cfd1c8b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:55.058481 master-0 kubenswrapper[18707]: I0320 09:19:55.058294 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "b3d27304-e0eb-4819-81f6-43d30cfd1c8b" (UID: "b3d27304-e0eb-4819-81f6-43d30cfd1c8b"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:55.066537 master-0 kubenswrapper[18707]: I0320 09:19:55.066455 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-inventory" (OuterVolumeSpecName: "inventory") pod "b3d27304-e0eb-4819-81f6-43d30cfd1c8b" (UID: "b3d27304-e0eb-4819-81f6-43d30cfd1c8b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:19:55.129682 master-0 kubenswrapper[18707]: I0320 09:19:55.129595 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7dx5\" (UniqueName: \"kubernetes.io/projected/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-kube-api-access-m7dx5\") on node \"master-0\" DevicePath \"\"" Mar 20 09:19:55.129682 master-0 kubenswrapper[18707]: I0320 09:19:55.129633 18707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-bootstrap-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:19:55.129682 master-0 kubenswrapper[18707]: I0320 09:19:55.129646 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:19:55.129682 master-0 kubenswrapper[18707]: I0320 09:19:55.129659 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3d27304-e0eb-4819-81f6-43d30cfd1c8b-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:19:55.180095 master-0 kubenswrapper[18707]: I0320 09:19:55.180031 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7" event={"ID":"b3d27304-e0eb-4819-81f6-43d30cfd1c8b","Type":"ContainerDied","Data":"60e17b9636361e3b3cfc2475fe868e6d7f23863dbea2d48b387619e4c794edcb"} Mar 20 09:19:55.180095 master-0 kubenswrapper[18707]: I0320 09:19:55.180085 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60e17b9636361e3b3cfc2475fe868e6d7f23863dbea2d48b387619e4c794edcb" Mar 20 09:19:55.180357 master-0 kubenswrapper[18707]: I0320 09:19:55.180123 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-dataplane-step-1-edpm-a-54cs7" Mar 20 09:19:55.601572 master-0 kubenswrapper[18707]: I0320 09:19:55.601485 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-dataplane-step-1-edpm-a-4hslh"] Mar 20 09:19:55.602307 master-0 kubenswrapper[18707]: E0320 09:19:55.602092 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d27304-e0eb-4819-81f6-43d30cfd1c8b" containerName="bootstrap-dataplane-step-1-edpm-a" Mar 20 09:19:55.602307 master-0 kubenswrapper[18707]: I0320 09:19:55.602111 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d27304-e0eb-4819-81f6-43d30cfd1c8b" containerName="bootstrap-dataplane-step-1-edpm-a" Mar 20 09:19:55.602496 master-0 kubenswrapper[18707]: I0320 09:19:55.602472 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d27304-e0eb-4819-81f6-43d30cfd1c8b" containerName="bootstrap-dataplane-step-1-edpm-a" Mar 20 09:19:55.612602 master-0 kubenswrapper[18707]: I0320 09:19:55.611034 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" Mar 20 09:19:55.624027 master-0 kubenswrapper[18707]: I0320 09:19:55.620166 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a" Mar 20 09:19:55.624027 master-0 kubenswrapper[18707]: I0320 09:19:55.621283 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-dataplane-step-1-edpm-a-4hslh"] Mar 20 09:19:55.746411 master-0 kubenswrapper[18707]: I0320 09:19:55.746222 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl4xl\" (UniqueName: \"kubernetes.io/projected/5f0f884f-ea18-4cfb-8867-625393c5b847-kube-api-access-nl4xl\") pod \"configure-network-dataplane-step-1-edpm-a-4hslh\" (UID: \"5f0f884f-ea18-4cfb-8867-625393c5b847\") " pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" Mar 20 09:19:55.746411 master-0 kubenswrapper[18707]: I0320 09:19:55.746347 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/5f0f884f-ea18-4cfb-8867-625393c5b847-ssh-key-edpm-a\") pod \"configure-network-dataplane-step-1-edpm-a-4hslh\" (UID: \"5f0f884f-ea18-4cfb-8867-625393c5b847\") " pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" Mar 20 09:19:55.746885 master-0 kubenswrapper[18707]: I0320 09:19:55.746515 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f0f884f-ea18-4cfb-8867-625393c5b847-inventory\") pod \"configure-network-dataplane-step-1-edpm-a-4hslh\" (UID: \"5f0f884f-ea18-4cfb-8867-625393c5b847\") " pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" Mar 20 09:19:55.848527 master-0 kubenswrapper[18707]: I0320 09:19:55.848441 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-nl4xl\" (UniqueName: \"kubernetes.io/projected/5f0f884f-ea18-4cfb-8867-625393c5b847-kube-api-access-nl4xl\") pod \"configure-network-dataplane-step-1-edpm-a-4hslh\" (UID: \"5f0f884f-ea18-4cfb-8867-625393c5b847\") " pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" Mar 20 09:19:55.848770 master-0 kubenswrapper[18707]: I0320 09:19:55.848547 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/5f0f884f-ea18-4cfb-8867-625393c5b847-ssh-key-edpm-a\") pod \"configure-network-dataplane-step-1-edpm-a-4hslh\" (UID: \"5f0f884f-ea18-4cfb-8867-625393c5b847\") " pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" Mar 20 09:19:55.848770 master-0 kubenswrapper[18707]: I0320 09:19:55.848706 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f0f884f-ea18-4cfb-8867-625393c5b847-inventory\") pod \"configure-network-dataplane-step-1-edpm-a-4hslh\" (UID: \"5f0f884f-ea18-4cfb-8867-625393c5b847\") " pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" Mar 20 09:19:55.855379 master-0 kubenswrapper[18707]: I0320 09:19:55.853201 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/5f0f884f-ea18-4cfb-8867-625393c5b847-ssh-key-edpm-a\") pod \"configure-network-dataplane-step-1-edpm-a-4hslh\" (UID: \"5f0f884f-ea18-4cfb-8867-625393c5b847\") " pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" Mar 20 09:19:55.855379 master-0 kubenswrapper[18707]: I0320 09:19:55.853904 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f0f884f-ea18-4cfb-8867-625393c5b847-inventory\") pod \"configure-network-dataplane-step-1-edpm-a-4hslh\" (UID: \"5f0f884f-ea18-4cfb-8867-625393c5b847\") " 
pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" Mar 20 09:19:55.867283 master-0 kubenswrapper[18707]: I0320 09:19:55.867129 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl4xl\" (UniqueName: \"kubernetes.io/projected/5f0f884f-ea18-4cfb-8867-625393c5b847-kube-api-access-nl4xl\") pod \"configure-network-dataplane-step-1-edpm-a-4hslh\" (UID: \"5f0f884f-ea18-4cfb-8867-625393c5b847\") " pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" Mar 20 09:19:55.946514 master-0 kubenswrapper[18707]: I0320 09:19:55.946420 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" Mar 20 09:19:56.952527 master-0 kubenswrapper[18707]: I0320 09:19:56.952409 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-dataplane-step-1-edpm-a-4hslh"] Mar 20 09:19:57.206098 master-0 kubenswrapper[18707]: I0320 09:19:57.205906 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" event={"ID":"5f0f884f-ea18-4cfb-8867-625393c5b847","Type":"ContainerStarted","Data":"b1e18d24de295f9a3375f174a8a4a8923fa2071bb7a1890f29de0d762fa0ca42"} Mar 20 09:19:58.226347 master-0 kubenswrapper[18707]: I0320 09:19:58.226174 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" event={"ID":"5f0f884f-ea18-4cfb-8867-625393c5b847","Type":"ContainerStarted","Data":"657f1ec0070e428e694e802ce06672def7e7e468117d2af0c442556d696de999"} Mar 20 09:20:01.264100 master-0 kubenswrapper[18707]: I0320 09:20:01.263961 18707 generic.go:334] "Generic (PLEG): container finished" podID="482c1c34-d034-484b-9777-e44d27b06f2a" containerID="3a344d4232c0ace55fa821cb80faef4825f035faa9921b9c2123570779b22ce2" exitCode=0 Mar 20 09:20:01.264100 master-0 kubenswrapper[18707]: I0320 09:20:01.264006 18707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn" event={"ID":"482c1c34-d034-484b-9777-e44d27b06f2a","Type":"ContainerDied","Data":"3a344d4232c0ace55fa821cb80faef4825f035faa9921b9c2123570779b22ce2"} Mar 20 09:20:01.292219 master-0 kubenswrapper[18707]: I0320 09:20:01.292066 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" podStartSLOduration=5.762307003 podStartE2EDuration="6.292042205s" podCreationTimestamp="2026-03-20 09:19:55 +0000 UTC" firstStartedPulling="2026-03-20 09:19:56.959826637 +0000 UTC m=+2342.116007003" lastFinishedPulling="2026-03-20 09:19:57.489561849 +0000 UTC m=+2342.645742205" observedRunningTime="2026-03-20 09:19:58.262579026 +0000 UTC m=+2343.418759392" watchObservedRunningTime="2026-03-20 09:20:01.292042205 +0000 UTC m=+2346.448222561" Mar 20 09:20:02.793996 master-0 kubenswrapper[18707]: I0320 09:20:02.793908 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn" Mar 20 09:20:02.853889 master-0 kubenswrapper[18707]: I0320 09:20:02.853799 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plq2v\" (UniqueName: \"kubernetes.io/projected/482c1c34-d034-484b-9777-e44d27b06f2a-kube-api-access-plq2v\") pod \"482c1c34-d034-484b-9777-e44d27b06f2a\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " Mar 20 09:20:02.854099 master-0 kubenswrapper[18707]: I0320 09:20:02.854005 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-inventory\") pod \"482c1c34-d034-484b-9777-e44d27b06f2a\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " Mar 20 09:20:02.854099 master-0 kubenswrapper[18707]: I0320 09:20:02.854045 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-ssh-key-edpm-b\") pod \"482c1c34-d034-484b-9777-e44d27b06f2a\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " Mar 20 09:20:02.854099 master-0 kubenswrapper[18707]: I0320 09:20:02.854075 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-bootstrap-combined-ca-bundle\") pod \"482c1c34-d034-484b-9777-e44d27b06f2a\" (UID: \"482c1c34-d034-484b-9777-e44d27b06f2a\") " Mar 20 09:20:02.868369 master-0 kubenswrapper[18707]: I0320 09:20:02.866420 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/482c1c34-d034-484b-9777-e44d27b06f2a-kube-api-access-plq2v" (OuterVolumeSpecName: "kube-api-access-plq2v") pod "482c1c34-d034-484b-9777-e44d27b06f2a" (UID: "482c1c34-d034-484b-9777-e44d27b06f2a"). InnerVolumeSpecName "kube-api-access-plq2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:20:02.879462 master-0 kubenswrapper[18707]: I0320 09:20:02.878038 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "482c1c34-d034-484b-9777-e44d27b06f2a" (UID: "482c1c34-d034-484b-9777-e44d27b06f2a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:02.919531 master-0 kubenswrapper[18707]: I0320 09:20:02.917918 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "482c1c34-d034-484b-9777-e44d27b06f2a" (UID: "482c1c34-d034-484b-9777-e44d27b06f2a"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:02.937358 master-0 kubenswrapper[18707]: I0320 09:20:02.933775 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-inventory" (OuterVolumeSpecName: "inventory") pod "482c1c34-d034-484b-9777-e44d27b06f2a" (UID: "482c1c34-d034-484b-9777-e44d27b06f2a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:20:02.960238 master-0 kubenswrapper[18707]: I0320 09:20:02.960135 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plq2v\" (UniqueName: \"kubernetes.io/projected/482c1c34-d034-484b-9777-e44d27b06f2a-kube-api-access-plq2v\") on node \"master-0\" DevicePath \"\"" Mar 20 09:20:02.960238 master-0 kubenswrapper[18707]: I0320 09:20:02.960200 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:20:02.960238 master-0 kubenswrapper[18707]: I0320 09:20:02.960222 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:20:02.960238 master-0 kubenswrapper[18707]: I0320 09:20:02.960238 18707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482c1c34-d034-484b-9777-e44d27b06f2a-bootstrap-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:20:03.287205 master-0 kubenswrapper[18707]: I0320 09:20:03.287027 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn" event={"ID":"482c1c34-d034-484b-9777-e44d27b06f2a","Type":"ContainerDied","Data":"e5d7a65469282ccb45752d2c86c51997b8bb81cc5d1e185bdb39c45b71c012df"} Mar 20 09:20:03.287205 master-0 kubenswrapper[18707]: I0320 09:20:03.287084 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5d7a65469282ccb45752d2c86c51997b8bb81cc5d1e185bdb39c45b71c012df" Mar 20 09:20:03.287205 master-0 kubenswrapper[18707]: I0320 09:20:03.287142 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-dataplane-step-1-edpm-b-k2lgn"
Mar 20 09:20:03.368069 master-0 kubenswrapper[18707]: I0320 09:20:03.368010 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"]
Mar 20 09:20:03.368588 master-0 kubenswrapper[18707]: E0320 09:20:03.368564 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482c1c34-d034-484b-9777-e44d27b06f2a" containerName="bootstrap-dataplane-step-1-edpm-b"
Mar 20 09:20:03.368588 master-0 kubenswrapper[18707]: I0320 09:20:03.368584 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="482c1c34-d034-484b-9777-e44d27b06f2a" containerName="bootstrap-dataplane-step-1-edpm-b"
Mar 20 09:20:03.368865 master-0 kubenswrapper[18707]: I0320 09:20:03.368842 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="482c1c34-d034-484b-9777-e44d27b06f2a" containerName="bootstrap-dataplane-step-1-edpm-b"
Mar 20 09:20:03.369729 master-0 kubenswrapper[18707]: I0320 09:20:03.369703 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"
Mar 20 09:20:03.374634 master-0 kubenswrapper[18707]: I0320 09:20:03.374577 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b"
Mar 20 09:20:03.384397 master-0 kubenswrapper[18707]: I0320 09:20:03.384336 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"]
Mar 20 09:20:03.497003 master-0 kubenswrapper[18707]: I0320 09:20:03.495517 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5kq2\" (UniqueName: \"kubernetes.io/projected/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-kube-api-access-t5kq2\") pod \"configure-network-dataplane-step-1-edpm-b-5xnfd\" (UID: \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\") " pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"
Mar 20 09:20:03.497003 master-0 kubenswrapper[18707]: I0320 09:20:03.495689 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-ssh-key-edpm-b\") pod \"configure-network-dataplane-step-1-edpm-b-5xnfd\" (UID: \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\") " pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"
Mar 20 09:20:03.497003 master-0 kubenswrapper[18707]: I0320 09:20:03.495843 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-inventory\") pod \"configure-network-dataplane-step-1-edpm-b-5xnfd\" (UID: \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\") " pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"
Mar 20 09:20:03.599577 master-0 kubenswrapper[18707]: I0320 09:20:03.599473 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5kq2\" (UniqueName: \"kubernetes.io/projected/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-kube-api-access-t5kq2\") pod \"configure-network-dataplane-step-1-edpm-b-5xnfd\" (UID: \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\") " pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"
Mar 20 09:20:03.599850 master-0 kubenswrapper[18707]: I0320 09:20:03.599677 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-ssh-key-edpm-b\") pod \"configure-network-dataplane-step-1-edpm-b-5xnfd\" (UID: \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\") " pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"
Mar 20 09:20:03.599850 master-0 kubenswrapper[18707]: I0320 09:20:03.599815 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-inventory\") pod \"configure-network-dataplane-step-1-edpm-b-5xnfd\" (UID: \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\") " pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"
Mar 20 09:20:03.603360 master-0 kubenswrapper[18707]: I0320 09:20:03.603300 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-ssh-key-edpm-b\") pod \"configure-network-dataplane-step-1-edpm-b-5xnfd\" (UID: \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\") " pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"
Mar 20 09:20:03.604656 master-0 kubenswrapper[18707]: I0320 09:20:03.604612 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-inventory\") pod \"configure-network-dataplane-step-1-edpm-b-5xnfd\" (UID: \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\") " pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"
Mar 20 09:20:03.617833 master-0 kubenswrapper[18707]: I0320 09:20:03.617768 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5kq2\" (UniqueName: \"kubernetes.io/projected/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-kube-api-access-t5kq2\") pod \"configure-network-dataplane-step-1-edpm-b-5xnfd\" (UID: \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\") " pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"
Mar 20 09:20:03.685682 master-0 kubenswrapper[18707]: I0320 09:20:03.685610 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"
Mar 20 09:20:04.170089 master-0 kubenswrapper[18707]: I0320 09:20:04.170007 18707 scope.go:117] "RemoveContainer" containerID="fe7fd02d21ebaa61cbde631fde06c10b7a747b5bcbd963dbb611920c73fc8549"
Mar 20 09:20:04.215439 master-0 kubenswrapper[18707]: I0320 09:20:04.215379 18707 scope.go:117] "RemoveContainer" containerID="b49b9299c3c3b8fbbf401298ef6ad785f7952b5e47ff00f1fd406349be7a18b2"
Mar 20 09:20:04.275346 master-0 kubenswrapper[18707]: I0320 09:20:04.275270 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"]
Mar 20 09:20:05.331645 master-0 kubenswrapper[18707]: I0320 09:20:05.331573 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd" event={"ID":"fd304e6f-7b09-48da-8b83-7ba610a5d9d0","Type":"ContainerStarted","Data":"b1cd43b004606ad8b83a2d9d77c83d39de25ad3b4bd0b59ea048fa1dc41789dd"}
Mar 20 09:20:06.357139 master-0 kubenswrapper[18707]: I0320 09:20:06.357036 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd" event={"ID":"fd304e6f-7b09-48da-8b83-7ba610a5d9d0","Type":"ContainerStarted","Data":"2d5b356eb4617a26f61f58d8827117eb408c06c8514797f8d04229ce2239dd1f"}
Mar 20 09:20:06.395055 master-0 kubenswrapper[18707]: I0320 09:20:06.394912 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd" podStartSLOduration=2.177118145 podStartE2EDuration="3.394878522s" podCreationTimestamp="2026-03-20 09:20:03 +0000 UTC" firstStartedPulling="2026-03-20 09:20:04.31005342 +0000 UTC m=+2349.466233776" lastFinishedPulling="2026-03-20 09:20:05.527813807 +0000 UTC m=+2350.683994153" observedRunningTime="2026-03-20 09:20:06.380949079 +0000 UTC m=+2351.537129435" watchObservedRunningTime="2026-03-20 09:20:06.394878522 +0000 UTC m=+2351.551058908"
Mar 20 09:20:31.065599 master-0 kubenswrapper[18707]: I0320 09:20:31.065530 18707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-4h8z9"]
Mar 20 09:20:31.083048 master-0 kubenswrapper[18707]: I0320 09:20:31.082992 18707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-4h8z9"]
Mar 20 09:20:31.112291 master-0 kubenswrapper[18707]: I0320 09:20:31.112177 18707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48343162-37bc-41e1-96cb-e5c2b7914c76" path="/var/lib/kubelet/pods/48343162-37bc-41e1-96cb-e5c2b7914c76/volumes"
Mar 20 09:20:38.765263 master-0 kubenswrapper[18707]: I0320 09:20:38.765163 18707 generic.go:334] "Generic (PLEG): container finished" podID="5f0f884f-ea18-4cfb-8867-625393c5b847" containerID="657f1ec0070e428e694e802ce06672def7e7e468117d2af0c442556d696de999" exitCode=0
Mar 20 09:20:38.765263 master-0 kubenswrapper[18707]: I0320 09:20:38.765238 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" event={"ID":"5f0f884f-ea18-4cfb-8867-625393c5b847","Type":"ContainerDied","Data":"657f1ec0070e428e694e802ce06672def7e7e468117d2af0c442556d696de999"}
Mar 20 09:20:40.275891 master-0 kubenswrapper[18707]: I0320 09:20:40.275789 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh"
Mar 20 09:20:40.332785 master-0 kubenswrapper[18707]: I0320 09:20:40.332689 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/5f0f884f-ea18-4cfb-8867-625393c5b847-ssh-key-edpm-a\") pod \"5f0f884f-ea18-4cfb-8867-625393c5b847\" (UID: \"5f0f884f-ea18-4cfb-8867-625393c5b847\") "
Mar 20 09:20:40.333025 master-0 kubenswrapper[18707]: I0320 09:20:40.332821 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f0f884f-ea18-4cfb-8867-625393c5b847-inventory\") pod \"5f0f884f-ea18-4cfb-8867-625393c5b847\" (UID: \"5f0f884f-ea18-4cfb-8867-625393c5b847\") "
Mar 20 09:20:40.333025 master-0 kubenswrapper[18707]: I0320 09:20:40.332969 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl4xl\" (UniqueName: \"kubernetes.io/projected/5f0f884f-ea18-4cfb-8867-625393c5b847-kube-api-access-nl4xl\") pod \"5f0f884f-ea18-4cfb-8867-625393c5b847\" (UID: \"5f0f884f-ea18-4cfb-8867-625393c5b847\") "
Mar 20 09:20:40.337291 master-0 kubenswrapper[18707]: I0320 09:20:40.337238 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f0f884f-ea18-4cfb-8867-625393c5b847-kube-api-access-nl4xl" (OuterVolumeSpecName: "kube-api-access-nl4xl") pod "5f0f884f-ea18-4cfb-8867-625393c5b847" (UID: "5f0f884f-ea18-4cfb-8867-625393c5b847"). InnerVolumeSpecName "kube-api-access-nl4xl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:20:40.360978 master-0 kubenswrapper[18707]: I0320 09:20:40.360898 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f0f884f-ea18-4cfb-8867-625393c5b847-inventory" (OuterVolumeSpecName: "inventory") pod "5f0f884f-ea18-4cfb-8867-625393c5b847" (UID: "5f0f884f-ea18-4cfb-8867-625393c5b847"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:20:40.371904 master-0 kubenswrapper[18707]: I0320 09:20:40.371780 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f0f884f-ea18-4cfb-8867-625393c5b847-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "5f0f884f-ea18-4cfb-8867-625393c5b847" (UID: "5f0f884f-ea18-4cfb-8867-625393c5b847"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:20:40.437416 master-0 kubenswrapper[18707]: I0320 09:20:40.437318 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl4xl\" (UniqueName: \"kubernetes.io/projected/5f0f884f-ea18-4cfb-8867-625393c5b847-kube-api-access-nl4xl\") on node \"master-0\" DevicePath \"\""
Mar 20 09:20:40.437416 master-0 kubenswrapper[18707]: I0320 09:20:40.437406 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/5f0f884f-ea18-4cfb-8867-625393c5b847-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\""
Mar 20 09:20:40.437416 master-0 kubenswrapper[18707]: I0320 09:20:40.437426 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f0f884f-ea18-4cfb-8867-625393c5b847-inventory\") on node \"master-0\" DevicePath \"\""
Mar 20 09:20:40.788169 master-0 kubenswrapper[18707]: I0320 09:20:40.788034 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh" event={"ID":"5f0f884f-ea18-4cfb-8867-625393c5b847","Type":"ContainerDied","Data":"b1e18d24de295f9a3375f174a8a4a8923fa2071bb7a1890f29de0d762fa0ca42"}
Mar 20 09:20:40.788449 master-0 kubenswrapper[18707]: I0320 09:20:40.788428 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1e18d24de295f9a3375f174a8a4a8923fa2071bb7a1890f29de0d762fa0ca42"
Mar 20 09:20:40.788554 master-0 kubenswrapper[18707]: I0320 09:20:40.788162 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-dataplane-step-1-edpm-a-4hslh"
Mar 20 09:20:40.902018 master-0 kubenswrapper[18707]: I0320 09:20:40.901959 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-dataplane-step-1-edpm-a-b7h26"]
Mar 20 09:20:40.902593 master-0 kubenswrapper[18707]: E0320 09:20:40.902560 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0f884f-ea18-4cfb-8867-625393c5b847" containerName="configure-network-dataplane-step-1-edpm-a"
Mar 20 09:20:40.902593 master-0 kubenswrapper[18707]: I0320 09:20:40.902585 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0f884f-ea18-4cfb-8867-625393c5b847" containerName="configure-network-dataplane-step-1-edpm-a"
Mar 20 09:20:40.902960 master-0 kubenswrapper[18707]: I0320 09:20:40.902926 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0f884f-ea18-4cfb-8867-625393c5b847" containerName="configure-network-dataplane-step-1-edpm-a"
Mar 20 09:20:40.903933 master-0 kubenswrapper[18707]: I0320 09:20:40.903898 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26"
Mar 20 09:20:40.913982 master-0 kubenswrapper[18707]: I0320 09:20:40.913790 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-dataplane-step-1-edpm-a-b7h26"]
Mar 20 09:20:40.914221 master-0 kubenswrapper[18707]: I0320 09:20:40.914119 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a"
Mar 20 09:20:40.948499 master-0 kubenswrapper[18707]: I0320 09:20:40.948454 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01a3e39d-9403-419b-9937-6345579aadb3-inventory\") pod \"validate-network-dataplane-step-1-edpm-a-b7h26\" (UID: \"01a3e39d-9403-419b-9937-6345579aadb3\") " pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26"
Mar 20 09:20:40.948734 master-0 kubenswrapper[18707]: I0320 09:20:40.948565 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/01a3e39d-9403-419b-9937-6345579aadb3-ssh-key-edpm-a\") pod \"validate-network-dataplane-step-1-edpm-a-b7h26\" (UID: \"01a3e39d-9403-419b-9937-6345579aadb3\") " pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26"
Mar 20 09:20:40.948734 master-0 kubenswrapper[18707]: I0320 09:20:40.948616 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q56ms\" (UniqueName: \"kubernetes.io/projected/01a3e39d-9403-419b-9937-6345579aadb3-kube-api-access-q56ms\") pod \"validate-network-dataplane-step-1-edpm-a-b7h26\" (UID: \"01a3e39d-9403-419b-9937-6345579aadb3\") " pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26"
Mar 20 09:20:41.051090 master-0 kubenswrapper[18707]: I0320 09:20:41.050995 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/01a3e39d-9403-419b-9937-6345579aadb3-ssh-key-edpm-a\") pod \"validate-network-dataplane-step-1-edpm-a-b7h26\" (UID: \"01a3e39d-9403-419b-9937-6345579aadb3\") " pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26"
Mar 20 09:20:41.051422 master-0 kubenswrapper[18707]: I0320 09:20:41.051133 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q56ms\" (UniqueName: \"kubernetes.io/projected/01a3e39d-9403-419b-9937-6345579aadb3-kube-api-access-q56ms\") pod \"validate-network-dataplane-step-1-edpm-a-b7h26\" (UID: \"01a3e39d-9403-419b-9937-6345579aadb3\") " pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26"
Mar 20 09:20:41.051422 master-0 kubenswrapper[18707]: I0320 09:20:41.051278 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01a3e39d-9403-419b-9937-6345579aadb3-inventory\") pod \"validate-network-dataplane-step-1-edpm-a-b7h26\" (UID: \"01a3e39d-9403-419b-9937-6345579aadb3\") " pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26"
Mar 20 09:20:41.055694 master-0 kubenswrapper[18707]: I0320 09:20:41.055647 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/01a3e39d-9403-419b-9937-6345579aadb3-ssh-key-edpm-a\") pod \"validate-network-dataplane-step-1-edpm-a-b7h26\" (UID: \"01a3e39d-9403-419b-9937-6345579aadb3\") " pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26"
Mar 20 09:20:41.056396 master-0 kubenswrapper[18707]: I0320 09:20:41.056336 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01a3e39d-9403-419b-9937-6345579aadb3-inventory\") pod \"validate-network-dataplane-step-1-edpm-a-b7h26\" (UID: \"01a3e39d-9403-419b-9937-6345579aadb3\") " pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26"
Mar 20 09:20:41.077405 master-0 kubenswrapper[18707]: I0320 09:20:41.077334 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q56ms\" (UniqueName: \"kubernetes.io/projected/01a3e39d-9403-419b-9937-6345579aadb3-kube-api-access-q56ms\") pod \"validate-network-dataplane-step-1-edpm-a-b7h26\" (UID: \"01a3e39d-9403-419b-9937-6345579aadb3\") " pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26"
Mar 20 09:20:41.260892 master-0 kubenswrapper[18707]: I0320 09:20:41.260792 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26"
Mar 20 09:20:41.827772 master-0 kubenswrapper[18707]: I0320 09:20:41.825441 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-dataplane-step-1-edpm-a-b7h26"]
Mar 20 09:20:42.825372 master-0 kubenswrapper[18707]: I0320 09:20:42.825224 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26" event={"ID":"01a3e39d-9403-419b-9937-6345579aadb3","Type":"ContainerStarted","Data":"33f6063f492454b2752501b7a17e00b3c4c2ebe020aeaae6ce120c4f94dadcb7"}
Mar 20 09:20:42.825372 master-0 kubenswrapper[18707]: I0320 09:20:42.825291 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26" event={"ID":"01a3e39d-9403-419b-9937-6345579aadb3","Type":"ContainerStarted","Data":"e893566899e15afe23b249650dfabc9b47e4cc49b75bc271c8905dda60d7cd1c"}
Mar 20 09:20:42.853208 master-0 kubenswrapper[18707]: I0320 09:20:42.853106 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26" podStartSLOduration=2.398073658 podStartE2EDuration="2.853085327s" podCreationTimestamp="2026-03-20 09:20:40 +0000 UTC" firstStartedPulling="2026-03-20 09:20:41.825448225 +0000 UTC m=+2386.981628591" lastFinishedPulling="2026-03-20 09:20:42.280459914 +0000 UTC m=+2387.436640260" observedRunningTime="2026-03-20 09:20:42.845988266 +0000 UTC m=+2388.002168622" watchObservedRunningTime="2026-03-20 09:20:42.853085327 +0000 UTC m=+2388.009265683"
Mar 20 09:20:46.874063 master-0 kubenswrapper[18707]: I0320 09:20:46.873999 18707 generic.go:334] "Generic (PLEG): container finished" podID="fd304e6f-7b09-48da-8b83-7ba610a5d9d0" containerID="2d5b356eb4617a26f61f58d8827117eb408c06c8514797f8d04229ce2239dd1f" exitCode=0
Mar 20 09:20:46.874063 master-0 kubenswrapper[18707]: I0320 09:20:46.874065 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd" event={"ID":"fd304e6f-7b09-48da-8b83-7ba610a5d9d0","Type":"ContainerDied","Data":"2d5b356eb4617a26f61f58d8827117eb408c06c8514797f8d04229ce2239dd1f"}
Mar 20 09:20:48.457800 master-0 kubenswrapper[18707]: I0320 09:20:48.457138 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"
Mar 20 09:20:48.471103 master-0 kubenswrapper[18707]: I0320 09:20:48.471037 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-ssh-key-edpm-b\") pod \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\" (UID: \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\") "
Mar 20 09:20:48.471338 master-0 kubenswrapper[18707]: I0320 09:20:48.471239 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-inventory\") pod \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\" (UID: \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\") "
Mar 20 09:20:48.471842 master-0 kubenswrapper[18707]: I0320 09:20:48.471402 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5kq2\" (UniqueName: \"kubernetes.io/projected/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-kube-api-access-t5kq2\") pod \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\" (UID: \"fd304e6f-7b09-48da-8b83-7ba610a5d9d0\") "
Mar 20 09:20:48.474779 master-0 kubenswrapper[18707]: I0320 09:20:48.474727 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-kube-api-access-t5kq2" (OuterVolumeSpecName: "kube-api-access-t5kq2") pod "fd304e6f-7b09-48da-8b83-7ba610a5d9d0" (UID: "fd304e6f-7b09-48da-8b83-7ba610a5d9d0"). InnerVolumeSpecName "kube-api-access-t5kq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:20:48.507149 master-0 kubenswrapper[18707]: I0320 09:20:48.507083 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-inventory" (OuterVolumeSpecName: "inventory") pod "fd304e6f-7b09-48da-8b83-7ba610a5d9d0" (UID: "fd304e6f-7b09-48da-8b83-7ba610a5d9d0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:20:48.519671 master-0 kubenswrapper[18707]: I0320 09:20:48.519598 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "fd304e6f-7b09-48da-8b83-7ba610a5d9d0" (UID: "fd304e6f-7b09-48da-8b83-7ba610a5d9d0"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:20:48.575208 master-0 kubenswrapper[18707]: I0320 09:20:48.575138 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-inventory\") on node \"master-0\" DevicePath \"\""
Mar 20 09:20:48.575208 master-0 kubenswrapper[18707]: I0320 09:20:48.575202 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5kq2\" (UniqueName: \"kubernetes.io/projected/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-kube-api-access-t5kq2\") on node \"master-0\" DevicePath \"\""
Mar 20 09:20:48.575208 master-0 kubenswrapper[18707]: I0320 09:20:48.575219 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/fd304e6f-7b09-48da-8b83-7ba610a5d9d0-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\""
Mar 20 09:20:48.900209 master-0 kubenswrapper[18707]: I0320 09:20:48.896840 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd" event={"ID":"fd304e6f-7b09-48da-8b83-7ba610a5d9d0","Type":"ContainerDied","Data":"b1cd43b004606ad8b83a2d9d77c83d39de25ad3b4bd0b59ea048fa1dc41789dd"}
Mar 20 09:20:48.900209 master-0 kubenswrapper[18707]: I0320 09:20:48.896900 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1cd43b004606ad8b83a2d9d77c83d39de25ad3b4bd0b59ea048fa1dc41789dd"
Mar 20 09:20:48.900209 master-0 kubenswrapper[18707]: I0320 09:20:48.896869 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-dataplane-step-1-edpm-b-5xnfd"
Mar 20 09:20:49.187941 master-0 kubenswrapper[18707]: I0320 09:20:49.187807 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"]
Mar 20 09:20:49.188395 master-0 kubenswrapper[18707]: E0320 09:20:49.188356 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd304e6f-7b09-48da-8b83-7ba610a5d9d0" containerName="configure-network-dataplane-step-1-edpm-b"
Mar 20 09:20:49.188395 master-0 kubenswrapper[18707]: I0320 09:20:49.188383 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd304e6f-7b09-48da-8b83-7ba610a5d9d0" containerName="configure-network-dataplane-step-1-edpm-b"
Mar 20 09:20:49.188711 master-0 kubenswrapper[18707]: I0320 09:20:49.188677 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd304e6f-7b09-48da-8b83-7ba610a5d9d0" containerName="configure-network-dataplane-step-1-edpm-b"
Mar 20 09:20:49.189841 master-0 kubenswrapper[18707]: I0320 09:20:49.189710 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"
Mar 20 09:20:49.192419 master-0 kubenswrapper[18707]: I0320 09:20:49.192358 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b"
Mar 20 09:20:49.198935 master-0 kubenswrapper[18707]: I0320 09:20:49.198847 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"]
Mar 20 09:20:49.295650 master-0 kubenswrapper[18707]: I0320 09:20:49.295573 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w6xb\" (UniqueName: \"kubernetes.io/projected/35577f53-56c5-41cc-82f6-5651a10eff72-kube-api-access-7w6xb\") pod \"validate-network-dataplane-step-1-edpm-b-x6qvv\" (UID: \"35577f53-56c5-41cc-82f6-5651a10eff72\") " pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"
Mar 20 09:20:49.296057 master-0 kubenswrapper[18707]: I0320 09:20:49.296037 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35577f53-56c5-41cc-82f6-5651a10eff72-inventory\") pod \"validate-network-dataplane-step-1-edpm-b-x6qvv\" (UID: \"35577f53-56c5-41cc-82f6-5651a10eff72\") " pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"
Mar 20 09:20:49.296319 master-0 kubenswrapper[18707]: I0320 09:20:49.296292 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/35577f53-56c5-41cc-82f6-5651a10eff72-ssh-key-edpm-b\") pod \"validate-network-dataplane-step-1-edpm-b-x6qvv\" (UID: \"35577f53-56c5-41cc-82f6-5651a10eff72\") " pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"
Mar 20 09:20:49.409212 master-0 kubenswrapper[18707]: I0320 09:20:49.409123 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w6xb\" (UniqueName: \"kubernetes.io/projected/35577f53-56c5-41cc-82f6-5651a10eff72-kube-api-access-7w6xb\") pod \"validate-network-dataplane-step-1-edpm-b-x6qvv\" (UID: \"35577f53-56c5-41cc-82f6-5651a10eff72\") " pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"
Mar 20 09:20:49.409457 master-0 kubenswrapper[18707]: I0320 09:20:49.409308 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35577f53-56c5-41cc-82f6-5651a10eff72-inventory\") pod \"validate-network-dataplane-step-1-edpm-b-x6qvv\" (UID: \"35577f53-56c5-41cc-82f6-5651a10eff72\") " pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"
Mar 20 09:20:49.409457 master-0 kubenswrapper[18707]: I0320 09:20:49.409398 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/35577f53-56c5-41cc-82f6-5651a10eff72-ssh-key-edpm-b\") pod \"validate-network-dataplane-step-1-edpm-b-x6qvv\" (UID: \"35577f53-56c5-41cc-82f6-5651a10eff72\") " pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"
Mar 20 09:20:49.413264 master-0 kubenswrapper[18707]: I0320 09:20:49.412858 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/35577f53-56c5-41cc-82f6-5651a10eff72-ssh-key-edpm-b\") pod \"validate-network-dataplane-step-1-edpm-b-x6qvv\" (UID: \"35577f53-56c5-41cc-82f6-5651a10eff72\") " pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"
Mar 20 09:20:49.413596 master-0 kubenswrapper[18707]: I0320 09:20:49.413539 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35577f53-56c5-41cc-82f6-5651a10eff72-inventory\") pod \"validate-network-dataplane-step-1-edpm-b-x6qvv\" (UID: \"35577f53-56c5-41cc-82f6-5651a10eff72\") " pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"
Mar 20 09:20:49.426495 master-0 kubenswrapper[18707]: I0320 09:20:49.426443 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w6xb\" (UniqueName: \"kubernetes.io/projected/35577f53-56c5-41cc-82f6-5651a10eff72-kube-api-access-7w6xb\") pod \"validate-network-dataplane-step-1-edpm-b-x6qvv\" (UID: \"35577f53-56c5-41cc-82f6-5651a10eff72\") " pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"
Mar 20 09:20:49.510789 master-0 kubenswrapper[18707]: I0320 09:20:49.510604 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"
Mar 20 09:20:50.063975 master-0 kubenswrapper[18707]: W0320 09:20:50.063911 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35577f53_56c5_41cc_82f6_5651a10eff72.slice/crio-5cfca968867185f177102d2662bab1b85e6cc2a5c8c30fc99fb079fddc01774c WatchSource:0}: Error finding container 5cfca968867185f177102d2662bab1b85e6cc2a5c8c30fc99fb079fddc01774c: Status 404 returned error can't find the container with id 5cfca968867185f177102d2662bab1b85e6cc2a5c8c30fc99fb079fddc01774c
Mar 20 09:20:50.068470 master-0 kubenswrapper[18707]: I0320 09:20:50.068421 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-dataplane-step-1-edpm-b-x6qvv"]
Mar 20 09:20:50.925378 master-0 kubenswrapper[18707]: I0320 09:20:50.925194 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv" event={"ID":"35577f53-56c5-41cc-82f6-5651a10eff72","Type":"ContainerStarted","Data":"ca58c8793310c81051531f4c1a97510844dce90f78ab2bcd7a13db0e6cfad990"}
Mar 20 09:20:50.925378 master-0 kubenswrapper[18707]: I0320 09:20:50.925251 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv" event={"ID":"35577f53-56c5-41cc-82f6-5651a10eff72","Type":"ContainerStarted","Data":"5cfca968867185f177102d2662bab1b85e6cc2a5c8c30fc99fb079fddc01774c"}
Mar 20 09:20:50.966829 master-0 kubenswrapper[18707]: I0320 09:20:50.966701 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv" podStartSLOduration=1.467450634 podStartE2EDuration="1.966676104s" podCreationTimestamp="2026-03-20 09:20:49 +0000 UTC" firstStartedPulling="2026-03-20 09:20:50.066788711 +0000 UTC m=+2395.222969067" lastFinishedPulling="2026-03-20 09:20:50.566014161 +0000 UTC m=+2395.722194537" observedRunningTime="2026-03-20 09:20:50.944528988 +0000 UTC m=+2396.100709364" watchObservedRunningTime="2026-03-20 09:20:50.966676104 +0000 UTC m=+2396.122856470"
Mar 20 09:20:55.984665 master-0 kubenswrapper[18707]: I0320 09:20:55.984590 18707 generic.go:334] "Generic (PLEG): container finished" podID="01a3e39d-9403-419b-9937-6345579aadb3" containerID="33f6063f492454b2752501b7a17e00b3c4c2ebe020aeaae6ce120c4f94dadcb7" exitCode=0
Mar 20 09:20:55.984665 master-0 kubenswrapper[18707]: I0320 09:20:55.984657 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26" event={"ID":"01a3e39d-9403-419b-9937-6345579aadb3","Type":"ContainerDied","Data":"33f6063f492454b2752501b7a17e00b3c4c2ebe020aeaae6ce120c4f94dadcb7"}
Mar 20 09:20:57.777257 master-0 kubenswrapper[18707]: I0320 09:20:57.777117 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26"
Mar 20 09:20:57.942475 master-0 kubenswrapper[18707]: I0320 09:20:57.942385 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q56ms\" (UniqueName: \"kubernetes.io/projected/01a3e39d-9403-419b-9937-6345579aadb3-kube-api-access-q56ms\") pod \"01a3e39d-9403-419b-9937-6345579aadb3\" (UID: \"01a3e39d-9403-419b-9937-6345579aadb3\") "
Mar 20 09:20:57.942827 master-0 kubenswrapper[18707]: I0320 09:20:57.942798 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01a3e39d-9403-419b-9937-6345579aadb3-inventory\") pod \"01a3e39d-9403-419b-9937-6345579aadb3\" (UID: \"01a3e39d-9403-419b-9937-6345579aadb3\") "
Mar 20 09:20:57.943322 master-0 kubenswrapper[18707]: I0320 09:20:57.943294 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/01a3e39d-9403-419b-9937-6345579aadb3-ssh-key-edpm-a\") pod \"01a3e39d-9403-419b-9937-6345579aadb3\" (UID: \"01a3e39d-9403-419b-9937-6345579aadb3\") "
Mar 20 09:20:57.947558 master-0 kubenswrapper[18707]: I0320 09:20:57.947492 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a3e39d-9403-419b-9937-6345579aadb3-kube-api-access-q56ms" (OuterVolumeSpecName: "kube-api-access-q56ms") pod "01a3e39d-9403-419b-9937-6345579aadb3" (UID: "01a3e39d-9403-419b-9937-6345579aadb3"). InnerVolumeSpecName "kube-api-access-q56ms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:20:57.979610 master-0 kubenswrapper[18707]: I0320 09:20:57.979518 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a3e39d-9403-419b-9937-6345579aadb3-inventory" (OuterVolumeSpecName: "inventory") pod "01a3e39d-9403-419b-9937-6345579aadb3" (UID: "01a3e39d-9403-419b-9937-6345579aadb3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:20:57.985078 master-0 kubenswrapper[18707]: I0320 09:20:57.985032 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a3e39d-9403-419b-9937-6345579aadb3-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "01a3e39d-9403-419b-9937-6345579aadb3" (UID: "01a3e39d-9403-419b-9937-6345579aadb3"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:20:58.013277 master-0 kubenswrapper[18707]: I0320 09:20:58.013224 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26" event={"ID":"01a3e39d-9403-419b-9937-6345579aadb3","Type":"ContainerDied","Data":"e893566899e15afe23b249650dfabc9b47e4cc49b75bc271c8905dda60d7cd1c"}
Mar 20 09:20:58.013402 master-0 kubenswrapper[18707]: I0320 09:20:58.013289 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e893566899e15afe23b249650dfabc9b47e4cc49b75bc271c8905dda60d7cd1c"
Mar 20 09:20:58.013402 master-0 kubenswrapper[18707]: I0320 09:20:58.013365 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-dataplane-step-1-edpm-a-b7h26"
Mar 20 09:20:58.047874 master-0 kubenswrapper[18707]: I0320 09:20:58.047744 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q56ms\" (UniqueName: \"kubernetes.io/projected/01a3e39d-9403-419b-9937-6345579aadb3-kube-api-access-q56ms\") on node \"master-0\" DevicePath \"\""
Mar 20 09:20:58.047874 master-0 kubenswrapper[18707]: I0320 09:20:58.047877 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01a3e39d-9403-419b-9937-6345579aadb3-inventory\") on node \"master-0\" DevicePath \"\""
Mar 20 09:20:58.048296 master-0 kubenswrapper[18707]: I0320 09:20:58.047891 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/01a3e39d-9403-419b-9937-6345579aadb3-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\""
Mar 20 09:20:58.487173 master-0 kubenswrapper[18707]: I0320 09:20:58.487101 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-dataplane-step-1-edpm-a-4zvvk"]
Mar 20 09:20:58.487798 master-0 kubenswrapper[18707]: E0320 09:20:58.487762 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a3e39d-9403-419b-9937-6345579aadb3" containerName="validate-network-dataplane-step-1-edpm-a"
Mar 20 09:20:58.487798 master-0 kubenswrapper[18707]: I0320 09:20:58.487792 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a3e39d-9403-419b-9937-6345579aadb3" containerName="validate-network-dataplane-step-1-edpm-a"
Mar 20 09:20:58.488210 master-0 kubenswrapper[18707]: I0320 09:20:58.488161 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a3e39d-9403-419b-9937-6345579aadb3" containerName="validate-network-dataplane-step-1-edpm-a"
Mar 20 09:20:58.489260 master-0 kubenswrapper[18707]: I0320 09:20:58.489198 18707 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" Mar 20 09:20:58.511583 master-0 kubenswrapper[18707]: I0320 09:20:58.495725 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a" Mar 20 09:20:58.544025 master-0 kubenswrapper[18707]: I0320 09:20:58.543965 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-dataplane-step-1-edpm-a-4zvvk"] Mar 20 09:20:58.569725 master-0 kubenswrapper[18707]: I0320 09:20:58.569641 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ce2b5c0-654c-4d31-8a43-991c60897f3f-inventory\") pod \"install-os-dataplane-step-1-edpm-a-4zvvk\" (UID: \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\") " pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" Mar 20 09:20:58.569725 master-0 kubenswrapper[18707]: I0320 09:20:58.569704 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb694\" (UniqueName: \"kubernetes.io/projected/0ce2b5c0-654c-4d31-8a43-991c60897f3f-kube-api-access-mb694\") pod \"install-os-dataplane-step-1-edpm-a-4zvvk\" (UID: \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\") " pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" Mar 20 09:20:58.570001 master-0 kubenswrapper[18707]: I0320 09:20:58.569803 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/0ce2b5c0-654c-4d31-8a43-991c60897f3f-ssh-key-edpm-a\") pod \"install-os-dataplane-step-1-edpm-a-4zvvk\" (UID: \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\") " pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" Mar 20 09:20:58.670800 master-0 kubenswrapper[18707]: I0320 09:20:58.670718 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0ce2b5c0-654c-4d31-8a43-991c60897f3f-inventory\") pod \"install-os-dataplane-step-1-edpm-a-4zvvk\" (UID: \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\") " pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" Mar 20 09:20:58.670800 master-0 kubenswrapper[18707]: I0320 09:20:58.670783 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb694\" (UniqueName: \"kubernetes.io/projected/0ce2b5c0-654c-4d31-8a43-991c60897f3f-kube-api-access-mb694\") pod \"install-os-dataplane-step-1-edpm-a-4zvvk\" (UID: \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\") " pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" Mar 20 09:20:58.671053 master-0 kubenswrapper[18707]: I0320 09:20:58.670857 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/0ce2b5c0-654c-4d31-8a43-991c60897f3f-ssh-key-edpm-a\") pod \"install-os-dataplane-step-1-edpm-a-4zvvk\" (UID: \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\") " pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" Mar 20 09:20:58.675116 master-0 kubenswrapper[18707]: I0320 09:20:58.675072 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ce2b5c0-654c-4d31-8a43-991c60897f3f-inventory\") pod \"install-os-dataplane-step-1-edpm-a-4zvvk\" (UID: \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\") " pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" Mar 20 09:20:58.681682 master-0 kubenswrapper[18707]: I0320 09:20:58.681633 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/0ce2b5c0-654c-4d31-8a43-991c60897f3f-ssh-key-edpm-a\") pod \"install-os-dataplane-step-1-edpm-a-4zvvk\" (UID: \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\") " pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" Mar 20 09:20:58.719013 master-0 kubenswrapper[18707]: I0320 09:20:58.718903 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb694\" (UniqueName: \"kubernetes.io/projected/0ce2b5c0-654c-4d31-8a43-991c60897f3f-kube-api-access-mb694\") pod \"install-os-dataplane-step-1-edpm-a-4zvvk\" (UID: \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\") " pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" Mar 20 09:20:58.809274 master-0 kubenswrapper[18707]: I0320 09:20:58.809206 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" Mar 20 09:20:59.372084 master-0 kubenswrapper[18707]: I0320 09:20:59.372019 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-dataplane-step-1-edpm-a-4zvvk"] Mar 20 09:21:00.045926 master-0 kubenswrapper[18707]: I0320 09:21:00.045851 18707 generic.go:334] "Generic (PLEG): container finished" podID="35577f53-56c5-41cc-82f6-5651a10eff72" containerID="ca58c8793310c81051531f4c1a97510844dce90f78ab2bcd7a13db0e6cfad990" exitCode=0 Mar 20 09:21:00.046828 master-0 kubenswrapper[18707]: I0320 09:21:00.045922 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv" event={"ID":"35577f53-56c5-41cc-82f6-5651a10eff72","Type":"ContainerDied","Data":"ca58c8793310c81051531f4c1a97510844dce90f78ab2bcd7a13db0e6cfad990"} Mar 20 09:21:00.048377 master-0 kubenswrapper[18707]: I0320 09:21:00.048170 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" event={"ID":"0ce2b5c0-654c-4d31-8a43-991c60897f3f","Type":"ContainerStarted","Data":"4469d1363cbfdaab70f3d1ca3d90b1d28e65a75d10c86ce6173aee82217ec0ac"} Mar 20 09:21:01.060445 master-0 kubenswrapper[18707]: I0320 09:21:01.060363 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" 
event={"ID":"0ce2b5c0-654c-4d31-8a43-991c60897f3f","Type":"ContainerStarted","Data":"b009f06e37c72ffb498e44dfdacee0c30cf1ed5c34eedba8e92e90fb060ae4fe"} Mar 20 09:21:01.128063 master-0 kubenswrapper[18707]: I0320 09:21:01.127948 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" podStartSLOduration=2.671680977 podStartE2EDuration="3.127924832s" podCreationTimestamp="2026-03-20 09:20:58 +0000 UTC" firstStartedPulling="2026-03-20 09:20:59.385279202 +0000 UTC m=+2404.541459558" lastFinishedPulling="2026-03-20 09:20:59.841523057 +0000 UTC m=+2404.997703413" observedRunningTime="2026-03-20 09:21:01.108291097 +0000 UTC m=+2406.264471473" watchObservedRunningTime="2026-03-20 09:21:01.127924832 +0000 UTC m=+2406.284105188" Mar 20 09:21:01.976637 master-0 kubenswrapper[18707]: I0320 09:21:01.976578 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv" Mar 20 09:21:02.059244 master-0 kubenswrapper[18707]: I0320 09:21:02.056788 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35577f53-56c5-41cc-82f6-5651a10eff72-inventory\") pod \"35577f53-56c5-41cc-82f6-5651a10eff72\" (UID: \"35577f53-56c5-41cc-82f6-5651a10eff72\") " Mar 20 09:21:02.059244 master-0 kubenswrapper[18707]: I0320 09:21:02.056844 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/35577f53-56c5-41cc-82f6-5651a10eff72-ssh-key-edpm-b\") pod \"35577f53-56c5-41cc-82f6-5651a10eff72\" (UID: \"35577f53-56c5-41cc-82f6-5651a10eff72\") " Mar 20 09:21:02.059244 master-0 kubenswrapper[18707]: I0320 09:21:02.056944 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w6xb\" (UniqueName: 
\"kubernetes.io/projected/35577f53-56c5-41cc-82f6-5651a10eff72-kube-api-access-7w6xb\") pod \"35577f53-56c5-41cc-82f6-5651a10eff72\" (UID: \"35577f53-56c5-41cc-82f6-5651a10eff72\") " Mar 20 09:21:02.064668 master-0 kubenswrapper[18707]: I0320 09:21:02.060638 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35577f53-56c5-41cc-82f6-5651a10eff72-kube-api-access-7w6xb" (OuterVolumeSpecName: "kube-api-access-7w6xb") pod "35577f53-56c5-41cc-82f6-5651a10eff72" (UID: "35577f53-56c5-41cc-82f6-5651a10eff72"). InnerVolumeSpecName "kube-api-access-7w6xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:21:02.085369 master-0 kubenswrapper[18707]: I0320 09:21:02.082678 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv" event={"ID":"35577f53-56c5-41cc-82f6-5651a10eff72","Type":"ContainerDied","Data":"5cfca968867185f177102d2662bab1b85e6cc2a5c8c30fc99fb079fddc01774c"} Mar 20 09:21:02.085369 master-0 kubenswrapper[18707]: I0320 09:21:02.082743 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cfca968867185f177102d2662bab1b85e6cc2a5c8c30fc99fb079fddc01774c" Mar 20 09:21:02.085369 master-0 kubenswrapper[18707]: I0320 09:21:02.082706 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-dataplane-step-1-edpm-b-x6qvv" Mar 20 09:21:02.100256 master-0 kubenswrapper[18707]: I0320 09:21:02.100103 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35577f53-56c5-41cc-82f6-5651a10eff72-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "35577f53-56c5-41cc-82f6-5651a10eff72" (UID: "35577f53-56c5-41cc-82f6-5651a10eff72"). InnerVolumeSpecName "ssh-key-edpm-b". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:02.112361 master-0 kubenswrapper[18707]: I0320 09:21:02.112288 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35577f53-56c5-41cc-82f6-5651a10eff72-inventory" (OuterVolumeSpecName: "inventory") pod "35577f53-56c5-41cc-82f6-5651a10eff72" (UID: "35577f53-56c5-41cc-82f6-5651a10eff72"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:02.160439 master-0 kubenswrapper[18707]: I0320 09:21:02.160293 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/35577f53-56c5-41cc-82f6-5651a10eff72-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:21:02.160439 master-0 kubenswrapper[18707]: I0320 09:21:02.160359 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w6xb\" (UniqueName: \"kubernetes.io/projected/35577f53-56c5-41cc-82f6-5651a10eff72-kube-api-access-7w6xb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:21:02.160648 master-0 kubenswrapper[18707]: I0320 09:21:02.160459 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35577f53-56c5-41cc-82f6-5651a10eff72-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:21:02.551772 master-0 kubenswrapper[18707]: I0320 09:21:02.551710 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-dataplane-step-1-edpm-b-6gmw5"] Mar 20 09:21:02.552258 master-0 kubenswrapper[18707]: E0320 09:21:02.552236 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35577f53-56c5-41cc-82f6-5651a10eff72" containerName="validate-network-dataplane-step-1-edpm-b" Mar 20 09:21:02.552258 master-0 kubenswrapper[18707]: I0320 09:21:02.552255 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35577f53-56c5-41cc-82f6-5651a10eff72" 
containerName="validate-network-dataplane-step-1-edpm-b" Mar 20 09:21:02.552595 master-0 kubenswrapper[18707]: I0320 09:21:02.552573 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="35577f53-56c5-41cc-82f6-5651a10eff72" containerName="validate-network-dataplane-step-1-edpm-b" Mar 20 09:21:02.553396 master-0 kubenswrapper[18707]: I0320 09:21:02.553372 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" Mar 20 09:21:02.557519 master-0 kubenswrapper[18707]: I0320 09:21:02.557463 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b" Mar 20 09:21:02.573316 master-0 kubenswrapper[18707]: I0320 09:21:02.572240 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4gvf\" (UniqueName: \"kubernetes.io/projected/893073df-8e96-4106-ac53-246e4487ce6c-kube-api-access-k4gvf\") pod \"install-os-dataplane-step-1-edpm-b-6gmw5\" (UID: \"893073df-8e96-4106-ac53-246e4487ce6c\") " pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" Mar 20 09:21:02.573316 master-0 kubenswrapper[18707]: I0320 09:21:02.572824 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/893073df-8e96-4106-ac53-246e4487ce6c-inventory\") pod \"install-os-dataplane-step-1-edpm-b-6gmw5\" (UID: \"893073df-8e96-4106-ac53-246e4487ce6c\") " pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" Mar 20 09:21:02.573316 master-0 kubenswrapper[18707]: I0320 09:21:02.572941 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/893073df-8e96-4106-ac53-246e4487ce6c-ssh-key-edpm-b\") pod \"install-os-dataplane-step-1-edpm-b-6gmw5\" (UID: \"893073df-8e96-4106-ac53-246e4487ce6c\") " 
pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" Mar 20 09:21:02.675549 master-0 kubenswrapper[18707]: I0320 09:21:02.675453 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/893073df-8e96-4106-ac53-246e4487ce6c-inventory\") pod \"install-os-dataplane-step-1-edpm-b-6gmw5\" (UID: \"893073df-8e96-4106-ac53-246e4487ce6c\") " pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" Mar 20 09:21:02.676322 master-0 kubenswrapper[18707]: I0320 09:21:02.675646 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/893073df-8e96-4106-ac53-246e4487ce6c-ssh-key-edpm-b\") pod \"install-os-dataplane-step-1-edpm-b-6gmw5\" (UID: \"893073df-8e96-4106-ac53-246e4487ce6c\") " pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" Mar 20 09:21:02.677330 master-0 kubenswrapper[18707]: I0320 09:21:02.676894 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4gvf\" (UniqueName: \"kubernetes.io/projected/893073df-8e96-4106-ac53-246e4487ce6c-kube-api-access-k4gvf\") pod \"install-os-dataplane-step-1-edpm-b-6gmw5\" (UID: \"893073df-8e96-4106-ac53-246e4487ce6c\") " pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" Mar 20 09:21:02.678971 master-0 kubenswrapper[18707]: I0320 09:21:02.678920 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/893073df-8e96-4106-ac53-246e4487ce6c-inventory\") pod \"install-os-dataplane-step-1-edpm-b-6gmw5\" (UID: \"893073df-8e96-4106-ac53-246e4487ce6c\") " pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" Mar 20 09:21:02.681903 master-0 kubenswrapper[18707]: I0320 09:21:02.681845 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: 
\"kubernetes.io/secret/893073df-8e96-4106-ac53-246e4487ce6c-ssh-key-edpm-b\") pod \"install-os-dataplane-step-1-edpm-b-6gmw5\" (UID: \"893073df-8e96-4106-ac53-246e4487ce6c\") " pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" Mar 20 09:21:02.726613 master-0 kubenswrapper[18707]: I0320 09:21:02.726554 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-dataplane-step-1-edpm-b-6gmw5"] Mar 20 09:21:02.750925 master-0 kubenswrapper[18707]: I0320 09:21:02.749726 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4gvf\" (UniqueName: \"kubernetes.io/projected/893073df-8e96-4106-ac53-246e4487ce6c-kube-api-access-k4gvf\") pod \"install-os-dataplane-step-1-edpm-b-6gmw5\" (UID: \"893073df-8e96-4106-ac53-246e4487ce6c\") " pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" Mar 20 09:21:02.877758 master-0 kubenswrapper[18707]: I0320 09:21:02.877614 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" Mar 20 09:21:03.512919 master-0 kubenswrapper[18707]: I0320 09:21:03.512875 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-dataplane-step-1-edpm-b-6gmw5"] Mar 20 09:21:04.122394 master-0 kubenswrapper[18707]: I0320 09:21:04.122336 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" event={"ID":"893073df-8e96-4106-ac53-246e4487ce6c","Type":"ContainerStarted","Data":"c3958671f2fb20ac98e5b4cc5614b220bebea2bd77b1d7473bdd70a43ef28426"} Mar 20 09:21:04.342782 master-0 kubenswrapper[18707]: I0320 09:21:04.342654 18707 scope.go:117] "RemoveContainer" containerID="8469ccfcd66b36fd78da4cb8fb0d53915013a1b482985cc685a4b5d6c7455171" Mar 20 09:21:05.137405 master-0 kubenswrapper[18707]: I0320 09:21:05.137341 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" 
event={"ID":"893073df-8e96-4106-ac53-246e4487ce6c","Type":"ContainerStarted","Data":"de5b8263ee9fe4628a95da1977d23c15059c96a5d444c5b611c705fce8912f39"} Mar 20 09:21:05.169300 master-0 kubenswrapper[18707]: I0320 09:21:05.169203 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" podStartSLOduration=2.593796336 podStartE2EDuration="3.169163227s" podCreationTimestamp="2026-03-20 09:21:02 +0000 UTC" firstStartedPulling="2026-03-20 09:21:03.515679207 +0000 UTC m=+2408.671859563" lastFinishedPulling="2026-03-20 09:21:04.091046098 +0000 UTC m=+2409.247226454" observedRunningTime="2026-03-20 09:21:05.159985808 +0000 UTC m=+2410.316166194" watchObservedRunningTime="2026-03-20 09:21:05.169163227 +0000 UTC m=+2410.325343593" Mar 20 09:21:31.491916 master-0 kubenswrapper[18707]: I0320 09:21:31.491843 18707 generic.go:334] "Generic (PLEG): container finished" podID="0ce2b5c0-654c-4d31-8a43-991c60897f3f" containerID="b009f06e37c72ffb498e44dfdacee0c30cf1ed5c34eedba8e92e90fb060ae4fe" exitCode=0 Mar 20 09:21:31.492674 master-0 kubenswrapper[18707]: I0320 09:21:31.492075 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" event={"ID":"0ce2b5c0-654c-4d31-8a43-991c60897f3f","Type":"ContainerDied","Data":"b009f06e37c72ffb498e44dfdacee0c30cf1ed5c34eedba8e92e90fb060ae4fe"} Mar 20 09:21:33.007162 master-0 kubenswrapper[18707]: I0320 09:21:33.007110 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" Mar 20 09:21:33.092873 master-0 kubenswrapper[18707]: I0320 09:21:33.092800 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/0ce2b5c0-654c-4d31-8a43-991c60897f3f-ssh-key-edpm-a\") pod \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\" (UID: \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\") " Mar 20 09:21:33.093245 master-0 kubenswrapper[18707]: I0320 09:21:33.092956 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ce2b5c0-654c-4d31-8a43-991c60897f3f-inventory\") pod \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\" (UID: \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\") " Mar 20 09:21:33.093245 master-0 kubenswrapper[18707]: I0320 09:21:33.092991 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb694\" (UniqueName: \"kubernetes.io/projected/0ce2b5c0-654c-4d31-8a43-991c60897f3f-kube-api-access-mb694\") pod \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\" (UID: \"0ce2b5c0-654c-4d31-8a43-991c60897f3f\") " Mar 20 09:21:33.117007 master-0 kubenswrapper[18707]: I0320 09:21:33.096024 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce2b5c0-654c-4d31-8a43-991c60897f3f-kube-api-access-mb694" (OuterVolumeSpecName: "kube-api-access-mb694") pod "0ce2b5c0-654c-4d31-8a43-991c60897f3f" (UID: "0ce2b5c0-654c-4d31-8a43-991c60897f3f"). InnerVolumeSpecName "kube-api-access-mb694". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:21:33.122214 master-0 kubenswrapper[18707]: I0320 09:21:33.121522 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce2b5c0-654c-4d31-8a43-991c60897f3f-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "0ce2b5c0-654c-4d31-8a43-991c60897f3f" (UID: "0ce2b5c0-654c-4d31-8a43-991c60897f3f"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:33.131064 master-0 kubenswrapper[18707]: I0320 09:21:33.130639 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce2b5c0-654c-4d31-8a43-991c60897f3f-inventory" (OuterVolumeSpecName: "inventory") pod "0ce2b5c0-654c-4d31-8a43-991c60897f3f" (UID: "0ce2b5c0-654c-4d31-8a43-991c60897f3f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:33.196629 master-0 kubenswrapper[18707]: I0320 09:21:33.196559 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/0ce2b5c0-654c-4d31-8a43-991c60897f3f-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:21:33.196629 master-0 kubenswrapper[18707]: I0320 09:21:33.196624 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ce2b5c0-654c-4d31-8a43-991c60897f3f-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:21:33.196761 master-0 kubenswrapper[18707]: I0320 09:21:33.196639 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb694\" (UniqueName: \"kubernetes.io/projected/0ce2b5c0-654c-4d31-8a43-991c60897f3f-kube-api-access-mb694\") on node \"master-0\" DevicePath \"\"" Mar 20 09:21:33.526884 master-0 kubenswrapper[18707]: I0320 09:21:33.526733 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" 
event={"ID":"0ce2b5c0-654c-4d31-8a43-991c60897f3f","Type":"ContainerDied","Data":"4469d1363cbfdaab70f3d1ca3d90b1d28e65a75d10c86ce6173aee82217ec0ac"} Mar 20 09:21:33.526884 master-0 kubenswrapper[18707]: I0320 09:21:33.526788 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4469d1363cbfdaab70f3d1ca3d90b1d28e65a75d10c86ce6173aee82217ec0ac" Mar 20 09:21:33.526884 master-0 kubenswrapper[18707]: I0320 09:21:33.526844 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-dataplane-step-1-edpm-a-4zvvk" Mar 20 09:21:33.619226 master-0 kubenswrapper[18707]: I0320 09:21:33.618366 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-dataplane-step-1-edpm-a-sgpm2"] Mar 20 09:21:33.619226 master-0 kubenswrapper[18707]: E0320 09:21:33.618894 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce2b5c0-654c-4d31-8a43-991c60897f3f" containerName="install-os-dataplane-step-1-edpm-a" Mar 20 09:21:33.619226 master-0 kubenswrapper[18707]: I0320 09:21:33.618909 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce2b5c0-654c-4d31-8a43-991c60897f3f" containerName="install-os-dataplane-step-1-edpm-a" Mar 20 09:21:33.619538 master-0 kubenswrapper[18707]: I0320 09:21:33.619242 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce2b5c0-654c-4d31-8a43-991c60897f3f" containerName="install-os-dataplane-step-1-edpm-a" Mar 20 09:21:33.623212 master-0 kubenswrapper[18707]: I0320 09:21:33.620056 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" Mar 20 09:21:33.631216 master-0 kubenswrapper[18707]: I0320 09:21:33.625172 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a" Mar 20 09:21:33.664333 master-0 kubenswrapper[18707]: I0320 09:21:33.662591 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-dataplane-step-1-edpm-a-sgpm2"] Mar 20 09:21:33.709878 master-0 kubenswrapper[18707]: I0320 09:21:33.709802 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/7a3517ff-d93c-4eb3-8400-7d6232546814-ssh-key-edpm-a\") pod \"configure-os-dataplane-step-1-edpm-a-sgpm2\" (UID: \"7a3517ff-d93c-4eb3-8400-7d6232546814\") " pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" Mar 20 09:21:33.710151 master-0 kubenswrapper[18707]: I0320 09:21:33.709902 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a3517ff-d93c-4eb3-8400-7d6232546814-inventory\") pod \"configure-os-dataplane-step-1-edpm-a-sgpm2\" (UID: \"7a3517ff-d93c-4eb3-8400-7d6232546814\") " pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" Mar 20 09:21:33.710151 master-0 kubenswrapper[18707]: I0320 09:21:33.710051 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm5kx\" (UniqueName: \"kubernetes.io/projected/7a3517ff-d93c-4eb3-8400-7d6232546814-kube-api-access-zm5kx\") pod \"configure-os-dataplane-step-1-edpm-a-sgpm2\" (UID: \"7a3517ff-d93c-4eb3-8400-7d6232546814\") " pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" Mar 20 09:21:33.813131 master-0 kubenswrapper[18707]: I0320 09:21:33.813065 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: 
\"kubernetes.io/secret/7a3517ff-d93c-4eb3-8400-7d6232546814-ssh-key-edpm-a\") pod \"configure-os-dataplane-step-1-edpm-a-sgpm2\" (UID: \"7a3517ff-d93c-4eb3-8400-7d6232546814\") " pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" Mar 20 09:21:33.813131 master-0 kubenswrapper[18707]: I0320 09:21:33.813135 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a3517ff-d93c-4eb3-8400-7d6232546814-inventory\") pod \"configure-os-dataplane-step-1-edpm-a-sgpm2\" (UID: \"7a3517ff-d93c-4eb3-8400-7d6232546814\") " pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" Mar 20 09:21:33.813451 master-0 kubenswrapper[18707]: I0320 09:21:33.813379 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm5kx\" (UniqueName: \"kubernetes.io/projected/7a3517ff-d93c-4eb3-8400-7d6232546814-kube-api-access-zm5kx\") pod \"configure-os-dataplane-step-1-edpm-a-sgpm2\" (UID: \"7a3517ff-d93c-4eb3-8400-7d6232546814\") " pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" Mar 20 09:21:33.817081 master-0 kubenswrapper[18707]: I0320 09:21:33.816330 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a3517ff-d93c-4eb3-8400-7d6232546814-inventory\") pod \"configure-os-dataplane-step-1-edpm-a-sgpm2\" (UID: \"7a3517ff-d93c-4eb3-8400-7d6232546814\") " pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" Mar 20 09:21:33.824758 master-0 kubenswrapper[18707]: I0320 09:21:33.824723 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/7a3517ff-d93c-4eb3-8400-7d6232546814-ssh-key-edpm-a\") pod \"configure-os-dataplane-step-1-edpm-a-sgpm2\" (UID: \"7a3517ff-d93c-4eb3-8400-7d6232546814\") " pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" Mar 20 09:21:33.836228 master-0 kubenswrapper[18707]: I0320 
09:21:33.836163 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm5kx\" (UniqueName: \"kubernetes.io/projected/7a3517ff-d93c-4eb3-8400-7d6232546814-kube-api-access-zm5kx\") pod \"configure-os-dataplane-step-1-edpm-a-sgpm2\" (UID: \"7a3517ff-d93c-4eb3-8400-7d6232546814\") " pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" Mar 20 09:21:33.961610 master-0 kubenswrapper[18707]: I0320 09:21:33.960830 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" Mar 20 09:21:34.566381 master-0 kubenswrapper[18707]: I0320 09:21:34.560109 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-dataplane-step-1-edpm-a-sgpm2"] Mar 20 09:21:34.566381 master-0 kubenswrapper[18707]: W0320 09:21:34.560468 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a3517ff_d93c_4eb3_8400_7d6232546814.slice/crio-ad67907e7e938a7cb431181a00d2a74b576308a9258eca1d995c587a477f8402 WatchSource:0}: Error finding container ad67907e7e938a7cb431181a00d2a74b576308a9258eca1d995c587a477f8402: Status 404 returned error can't find the container with id ad67907e7e938a7cb431181a00d2a74b576308a9258eca1d995c587a477f8402 Mar 20 09:21:35.552765 master-0 kubenswrapper[18707]: I0320 09:21:35.552439 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" event={"ID":"7a3517ff-d93c-4eb3-8400-7d6232546814","Type":"ContainerStarted","Data":"9e739557801372d51e0599c1791e9d201c92ce4eecd1e77e5e08bafdf85cc0d1"} Mar 20 09:21:35.552765 master-0 kubenswrapper[18707]: I0320 09:21:35.552493 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" 
event={"ID":"7a3517ff-d93c-4eb3-8400-7d6232546814","Type":"ContainerStarted","Data":"ad67907e7e938a7cb431181a00d2a74b576308a9258eca1d995c587a477f8402"} Mar 20 09:21:35.585036 master-0 kubenswrapper[18707]: I0320 09:21:35.584934 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" podStartSLOduration=2.024702578 podStartE2EDuration="2.584909571s" podCreationTimestamp="2026-03-20 09:21:33 +0000 UTC" firstStartedPulling="2026-03-20 09:21:34.563210195 +0000 UTC m=+2439.719390551" lastFinishedPulling="2026-03-20 09:21:35.123417188 +0000 UTC m=+2440.279597544" observedRunningTime="2026-03-20 09:21:35.574126806 +0000 UTC m=+2440.730307172" watchObservedRunningTime="2026-03-20 09:21:35.584909571 +0000 UTC m=+2440.741089927" Mar 20 09:21:36.566773 master-0 kubenswrapper[18707]: I0320 09:21:36.566676 18707 generic.go:334] "Generic (PLEG): container finished" podID="893073df-8e96-4106-ac53-246e4487ce6c" containerID="de5b8263ee9fe4628a95da1977d23c15059c96a5d444c5b611c705fce8912f39" exitCode=0 Mar 20 09:21:36.567782 master-0 kubenswrapper[18707]: I0320 09:21:36.567372 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" event={"ID":"893073df-8e96-4106-ac53-246e4487ce6c","Type":"ContainerDied","Data":"de5b8263ee9fe4628a95da1977d23c15059c96a5d444c5b611c705fce8912f39"} Mar 20 09:21:38.111131 master-0 kubenswrapper[18707]: I0320 09:21:38.111075 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" Mar 20 09:21:38.138311 master-0 kubenswrapper[18707]: I0320 09:21:38.138237 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/893073df-8e96-4106-ac53-246e4487ce6c-inventory\") pod \"893073df-8e96-4106-ac53-246e4487ce6c\" (UID: \"893073df-8e96-4106-ac53-246e4487ce6c\") " Mar 20 09:21:38.138536 master-0 kubenswrapper[18707]: I0320 09:21:38.138348 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4gvf\" (UniqueName: \"kubernetes.io/projected/893073df-8e96-4106-ac53-246e4487ce6c-kube-api-access-k4gvf\") pod \"893073df-8e96-4106-ac53-246e4487ce6c\" (UID: \"893073df-8e96-4106-ac53-246e4487ce6c\") " Mar 20 09:21:38.138854 master-0 kubenswrapper[18707]: I0320 09:21:38.138816 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/893073df-8e96-4106-ac53-246e4487ce6c-ssh-key-edpm-b\") pod \"893073df-8e96-4106-ac53-246e4487ce6c\" (UID: \"893073df-8e96-4106-ac53-246e4487ce6c\") " Mar 20 09:21:38.141684 master-0 kubenswrapper[18707]: I0320 09:21:38.141639 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/893073df-8e96-4106-ac53-246e4487ce6c-kube-api-access-k4gvf" (OuterVolumeSpecName: "kube-api-access-k4gvf") pod "893073df-8e96-4106-ac53-246e4487ce6c" (UID: "893073df-8e96-4106-ac53-246e4487ce6c"). InnerVolumeSpecName "kube-api-access-k4gvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:21:38.166414 master-0 kubenswrapper[18707]: I0320 09:21:38.166346 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/893073df-8e96-4106-ac53-246e4487ce6c-inventory" (OuterVolumeSpecName: "inventory") pod "893073df-8e96-4106-ac53-246e4487ce6c" (UID: "893073df-8e96-4106-ac53-246e4487ce6c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:38.172303 master-0 kubenswrapper[18707]: I0320 09:21:38.172262 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/893073df-8e96-4106-ac53-246e4487ce6c-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "893073df-8e96-4106-ac53-246e4487ce6c" (UID: "893073df-8e96-4106-ac53-246e4487ce6c"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:21:38.243950 master-0 kubenswrapper[18707]: I0320 09:21:38.243823 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/893073df-8e96-4106-ac53-246e4487ce6c-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:21:38.243950 master-0 kubenswrapper[18707]: I0320 09:21:38.243908 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/893073df-8e96-4106-ac53-246e4487ce6c-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:21:38.243950 master-0 kubenswrapper[18707]: I0320 09:21:38.243932 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4gvf\" (UniqueName: \"kubernetes.io/projected/893073df-8e96-4106-ac53-246e4487ce6c-kube-api-access-k4gvf\") on node \"master-0\" DevicePath \"\"" Mar 20 09:21:38.587895 master-0 kubenswrapper[18707]: I0320 09:21:38.587836 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" 
event={"ID":"893073df-8e96-4106-ac53-246e4487ce6c","Type":"ContainerDied","Data":"c3958671f2fb20ac98e5b4cc5614b220bebea2bd77b1d7473bdd70a43ef28426"} Mar 20 09:21:38.587895 master-0 kubenswrapper[18707]: I0320 09:21:38.587888 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3958671f2fb20ac98e5b4cc5614b220bebea2bd77b1d7473bdd70a43ef28426" Mar 20 09:21:38.588207 master-0 kubenswrapper[18707]: I0320 09:21:38.587931 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-dataplane-step-1-edpm-b-6gmw5" Mar 20 09:21:38.813789 master-0 kubenswrapper[18707]: I0320 09:21:38.813713 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-dataplane-step-1-edpm-b-jdmpl"] Mar 20 09:21:38.814428 master-0 kubenswrapper[18707]: E0320 09:21:38.814397 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="893073df-8e96-4106-ac53-246e4487ce6c" containerName="install-os-dataplane-step-1-edpm-b" Mar 20 09:21:38.814428 master-0 kubenswrapper[18707]: I0320 09:21:38.814422 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="893073df-8e96-4106-ac53-246e4487ce6c" containerName="install-os-dataplane-step-1-edpm-b" Mar 20 09:21:38.814804 master-0 kubenswrapper[18707]: I0320 09:21:38.814774 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="893073df-8e96-4106-ac53-246e4487ce6c" containerName="install-os-dataplane-step-1-edpm-b" Mar 20 09:21:38.815930 master-0 kubenswrapper[18707]: I0320 09:21:38.815896 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" Mar 20 09:21:38.819347 master-0 kubenswrapper[18707]: I0320 09:21:38.819094 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b" Mar 20 09:21:38.828941 master-0 kubenswrapper[18707]: I0320 09:21:38.828886 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-dataplane-step-1-edpm-b-jdmpl"] Mar 20 09:21:38.861449 master-0 kubenswrapper[18707]: I0320 09:21:38.861382 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/727d6a1e-0435-4642-bf04-f035c35bb0ae-inventory\") pod \"configure-os-dataplane-step-1-edpm-b-jdmpl\" (UID: \"727d6a1e-0435-4642-bf04-f035c35bb0ae\") " pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" Mar 20 09:21:38.861689 master-0 kubenswrapper[18707]: I0320 09:21:38.861515 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/727d6a1e-0435-4642-bf04-f035c35bb0ae-ssh-key-edpm-b\") pod \"configure-os-dataplane-step-1-edpm-b-jdmpl\" (UID: \"727d6a1e-0435-4642-bf04-f035c35bb0ae\") " pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" Mar 20 09:21:38.861965 master-0 kubenswrapper[18707]: I0320 09:21:38.861929 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtx2h\" (UniqueName: \"kubernetes.io/projected/727d6a1e-0435-4642-bf04-f035c35bb0ae-kube-api-access-mtx2h\") pod \"configure-os-dataplane-step-1-edpm-b-jdmpl\" (UID: \"727d6a1e-0435-4642-bf04-f035c35bb0ae\") " pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" Mar 20 09:21:38.964915 master-0 kubenswrapper[18707]: I0320 09:21:38.964848 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/727d6a1e-0435-4642-bf04-f035c35bb0ae-inventory\") pod \"configure-os-dataplane-step-1-edpm-b-jdmpl\" (UID: \"727d6a1e-0435-4642-bf04-f035c35bb0ae\") " pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" Mar 20 09:21:38.965318 master-0 kubenswrapper[18707]: I0320 09:21:38.965276 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/727d6a1e-0435-4642-bf04-f035c35bb0ae-ssh-key-edpm-b\") pod \"configure-os-dataplane-step-1-edpm-b-jdmpl\" (UID: \"727d6a1e-0435-4642-bf04-f035c35bb0ae\") " pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" Mar 20 09:21:38.965756 master-0 kubenswrapper[18707]: I0320 09:21:38.965737 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtx2h\" (UniqueName: \"kubernetes.io/projected/727d6a1e-0435-4642-bf04-f035c35bb0ae-kube-api-access-mtx2h\") pod \"configure-os-dataplane-step-1-edpm-b-jdmpl\" (UID: \"727d6a1e-0435-4642-bf04-f035c35bb0ae\") " pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" Mar 20 09:21:38.969039 master-0 kubenswrapper[18707]: I0320 09:21:38.969000 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/727d6a1e-0435-4642-bf04-f035c35bb0ae-ssh-key-edpm-b\") pod \"configure-os-dataplane-step-1-edpm-b-jdmpl\" (UID: \"727d6a1e-0435-4642-bf04-f035c35bb0ae\") " pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" Mar 20 09:21:38.970747 master-0 kubenswrapper[18707]: I0320 09:21:38.970710 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/727d6a1e-0435-4642-bf04-f035c35bb0ae-inventory\") pod \"configure-os-dataplane-step-1-edpm-b-jdmpl\" (UID: \"727d6a1e-0435-4642-bf04-f035c35bb0ae\") " pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" Mar 20 09:21:38.982863 master-0 kubenswrapper[18707]: I0320 
09:21:38.982792 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtx2h\" (UniqueName: \"kubernetes.io/projected/727d6a1e-0435-4642-bf04-f035c35bb0ae-kube-api-access-mtx2h\") pod \"configure-os-dataplane-step-1-edpm-b-jdmpl\" (UID: \"727d6a1e-0435-4642-bf04-f035c35bb0ae\") " pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" Mar 20 09:21:39.152251 master-0 kubenswrapper[18707]: I0320 09:21:39.152043 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" Mar 20 09:21:39.806031 master-0 kubenswrapper[18707]: W0320 09:21:39.805956 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod727d6a1e_0435_4642_bf04_f035c35bb0ae.slice/crio-bd9315d7883c067e8c8b94b7e8d5ce0782ff2b70f8ac3b9c2dd5915ca4622fed WatchSource:0}: Error finding container bd9315d7883c067e8c8b94b7e8d5ce0782ff2b70f8ac3b9c2dd5915ca4622fed: Status 404 returned error can't find the container with id bd9315d7883c067e8c8b94b7e8d5ce0782ff2b70f8ac3b9c2dd5915ca4622fed Mar 20 09:21:39.819740 master-0 kubenswrapper[18707]: I0320 09:21:39.819669 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-dataplane-step-1-edpm-b-jdmpl"] Mar 20 09:21:40.624701 master-0 kubenswrapper[18707]: I0320 09:21:40.624542 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" event={"ID":"727d6a1e-0435-4642-bf04-f035c35bb0ae","Type":"ContainerStarted","Data":"bd9315d7883c067e8c8b94b7e8d5ce0782ff2b70f8ac3b9c2dd5915ca4622fed"} Mar 20 09:21:41.641200 master-0 kubenswrapper[18707]: I0320 09:21:41.641122 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" 
event={"ID":"727d6a1e-0435-4642-bf04-f035c35bb0ae","Type":"ContainerStarted","Data":"e994927c129aa4feb2d2ca8743722f67a693c80d4e28425887b8247957a41821"} Mar 20 09:21:41.876794 master-0 kubenswrapper[18707]: I0320 09:21:41.876673 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" podStartSLOduration=2.953523919 podStartE2EDuration="3.876650888s" podCreationTimestamp="2026-03-20 09:21:38 +0000 UTC" firstStartedPulling="2026-03-20 09:21:39.808717114 +0000 UTC m=+2444.964897480" lastFinishedPulling="2026-03-20 09:21:40.731844093 +0000 UTC m=+2445.888024449" observedRunningTime="2026-03-20 09:21:41.86681768 +0000 UTC m=+2447.022998056" watchObservedRunningTime="2026-03-20 09:21:41.876650888 +0000 UTC m=+2447.032831244" Mar 20 09:22:18.118162 master-0 kubenswrapper[18707]: I0320 09:22:18.118083 18707 generic.go:334] "Generic (PLEG): container finished" podID="7a3517ff-d93c-4eb3-8400-7d6232546814" containerID="9e739557801372d51e0599c1791e9d201c92ce4eecd1e77e5e08bafdf85cc0d1" exitCode=0 Mar 20 09:22:18.118162 master-0 kubenswrapper[18707]: I0320 09:22:18.118154 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" event={"ID":"7a3517ff-d93c-4eb3-8400-7d6232546814","Type":"ContainerDied","Data":"9e739557801372d51e0599c1791e9d201c92ce4eecd1e77e5e08bafdf85cc0d1"} Mar 20 09:22:19.625202 master-0 kubenswrapper[18707]: I0320 09:22:19.625149 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" Mar 20 09:22:19.759762 master-0 kubenswrapper[18707]: I0320 09:22:19.759685 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/7a3517ff-d93c-4eb3-8400-7d6232546814-ssh-key-edpm-a\") pod \"7a3517ff-d93c-4eb3-8400-7d6232546814\" (UID: \"7a3517ff-d93c-4eb3-8400-7d6232546814\") " Mar 20 09:22:19.760480 master-0 kubenswrapper[18707]: I0320 09:22:19.760419 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm5kx\" (UniqueName: \"kubernetes.io/projected/7a3517ff-d93c-4eb3-8400-7d6232546814-kube-api-access-zm5kx\") pod \"7a3517ff-d93c-4eb3-8400-7d6232546814\" (UID: \"7a3517ff-d93c-4eb3-8400-7d6232546814\") " Mar 20 09:22:19.760719 master-0 kubenswrapper[18707]: I0320 09:22:19.760556 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a3517ff-d93c-4eb3-8400-7d6232546814-inventory\") pod \"7a3517ff-d93c-4eb3-8400-7d6232546814\" (UID: \"7a3517ff-d93c-4eb3-8400-7d6232546814\") " Mar 20 09:22:19.763704 master-0 kubenswrapper[18707]: I0320 09:22:19.763622 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3517ff-d93c-4eb3-8400-7d6232546814-kube-api-access-zm5kx" (OuterVolumeSpecName: "kube-api-access-zm5kx") pod "7a3517ff-d93c-4eb3-8400-7d6232546814" (UID: "7a3517ff-d93c-4eb3-8400-7d6232546814"). InnerVolumeSpecName "kube-api-access-zm5kx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:22:19.789575 master-0 kubenswrapper[18707]: I0320 09:22:19.787651 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3517ff-d93c-4eb3-8400-7d6232546814-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "7a3517ff-d93c-4eb3-8400-7d6232546814" (UID: "7a3517ff-d93c-4eb3-8400-7d6232546814"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:22:19.802448 master-0 kubenswrapper[18707]: I0320 09:22:19.802398 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3517ff-d93c-4eb3-8400-7d6232546814-inventory" (OuterVolumeSpecName: "inventory") pod "7a3517ff-d93c-4eb3-8400-7d6232546814" (UID: "7a3517ff-d93c-4eb3-8400-7d6232546814"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:22:19.864531 master-0 kubenswrapper[18707]: I0320 09:22:19.864434 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm5kx\" (UniqueName: \"kubernetes.io/projected/7a3517ff-d93c-4eb3-8400-7d6232546814-kube-api-access-zm5kx\") on node \"master-0\" DevicePath \"\"" Mar 20 09:22:19.864531 master-0 kubenswrapper[18707]: I0320 09:22:19.864509 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a3517ff-d93c-4eb3-8400-7d6232546814-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:22:19.864531 master-0 kubenswrapper[18707]: I0320 09:22:19.864528 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/7a3517ff-d93c-4eb3-8400-7d6232546814-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:22:20.141252 master-0 kubenswrapper[18707]: I0320 09:22:20.141152 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" 
event={"ID":"7a3517ff-d93c-4eb3-8400-7d6232546814","Type":"ContainerDied","Data":"ad67907e7e938a7cb431181a00d2a74b576308a9258eca1d995c587a477f8402"} Mar 20 09:22:20.141252 master-0 kubenswrapper[18707]: I0320 09:22:20.141180 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-dataplane-step-1-edpm-a-sgpm2" Mar 20 09:22:20.141252 master-0 kubenswrapper[18707]: I0320 09:22:20.141252 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad67907e7e938a7cb431181a00d2a74b576308a9258eca1d995c587a477f8402" Mar 20 09:22:20.260130 master-0 kubenswrapper[18707]: I0320 09:22:20.260060 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-dataplane-step-1-kt96m"] Mar 20 09:22:20.260753 master-0 kubenswrapper[18707]: E0320 09:22:20.260717 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3517ff-d93c-4eb3-8400-7d6232546814" containerName="configure-os-dataplane-step-1-edpm-a" Mar 20 09:22:20.260753 master-0 kubenswrapper[18707]: I0320 09:22:20.260748 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3517ff-d93c-4eb3-8400-7d6232546814" containerName="configure-os-dataplane-step-1-edpm-a" Mar 20 09:22:20.261057 master-0 kubenswrapper[18707]: I0320 09:22:20.261035 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a3517ff-d93c-4eb3-8400-7d6232546814" containerName="configure-os-dataplane-step-1-edpm-a" Mar 20 09:22:20.266033 master-0 kubenswrapper[18707]: I0320 09:22:20.262458 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.268997 master-0 kubenswrapper[18707]: I0320 09:22:20.267440 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a" Mar 20 09:22:20.310071 master-0 kubenswrapper[18707]: I0320 09:22:20.309893 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-dataplane-step-1-kt96m"] Mar 20 09:22:20.378822 master-0 kubenswrapper[18707]: I0320 09:22:20.378698 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-inventory-1\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.379032 master-0 kubenswrapper[18707]: I0320 09:22:20.378848 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpczb\" (UniqueName: \"kubernetes.io/projected/79fdbf0e-dda0-4314-8a37-7f78451f99d6-kube-api-access-xpczb\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.379032 master-0 kubenswrapper[18707]: I0320 09:22:20.378936 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-ssh-key-edpm-b\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.379110 master-0 kubenswrapper[18707]: I0320 09:22:20.379070 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: 
\"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-ssh-key-edpm-a\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.379333 master-0 kubenswrapper[18707]: I0320 09:22:20.379267 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-inventory-0\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.482713 master-0 kubenswrapper[18707]: I0320 09:22:20.482019 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-inventory-1\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.482713 master-0 kubenswrapper[18707]: I0320 09:22:20.482408 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpczb\" (UniqueName: \"kubernetes.io/projected/79fdbf0e-dda0-4314-8a37-7f78451f99d6-kube-api-access-xpczb\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.482713 master-0 kubenswrapper[18707]: I0320 09:22:20.482489 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-ssh-key-edpm-b\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.482713 master-0 kubenswrapper[18707]: I0320 
09:22:20.482625 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-ssh-key-edpm-a\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.483951 master-0 kubenswrapper[18707]: I0320 09:22:20.483404 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-inventory-0\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.486465 master-0 kubenswrapper[18707]: I0320 09:22:20.486415 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-ssh-key-edpm-a\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.489361 master-0 kubenswrapper[18707]: I0320 09:22:20.488733 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-ssh-key-edpm-b\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.489361 master-0 kubenswrapper[18707]: I0320 09:22:20.489078 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-inventory-1\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " 
pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.493745 master-0 kubenswrapper[18707]: I0320 09:22:20.493710 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-inventory-0\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.503609 master-0 kubenswrapper[18707]: I0320 09:22:20.503555 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpczb\" (UniqueName: \"kubernetes.io/projected/79fdbf0e-dda0-4314-8a37-7f78451f99d6-kube-api-access-xpczb\") pod \"ssh-known-hosts-dataplane-step-1-kt96m\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") " pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:20.633820 master-0 kubenswrapper[18707]: I0320 09:22:20.633756 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" Mar 20 09:22:21.182256 master-0 kubenswrapper[18707]: I0320 09:22:21.182179 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-dataplane-step-1-kt96m"] Mar 20 09:22:21.203588 master-0 kubenswrapper[18707]: W0320 09:22:21.203493 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79fdbf0e_dda0_4314_8a37_7f78451f99d6.slice/crio-ba7be223f47aae2db83cae0298c5dfaea31f7740c2174644083895abc723929a WatchSource:0}: Error finding container ba7be223f47aae2db83cae0298c5dfaea31f7740c2174644083895abc723929a: Status 404 returned error can't find the container with id ba7be223f47aae2db83cae0298c5dfaea31f7740c2174644083895abc723929a Mar 20 09:22:22.173617 master-0 kubenswrapper[18707]: I0320 09:22:22.173552 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" event={"ID":"79fdbf0e-dda0-4314-8a37-7f78451f99d6","Type":"ContainerStarted","Data":"c202f36bbbf21cb07467a7ac9385b47852d509f6fead6090922e149cf97fb1ae"} Mar 20 09:22:22.173617 master-0 kubenswrapper[18707]: I0320 09:22:22.173612 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" event={"ID":"79fdbf0e-dda0-4314-8a37-7f78451f99d6","Type":"ContainerStarted","Data":"ba7be223f47aae2db83cae0298c5dfaea31f7740c2174644083895abc723929a"} Mar 20 09:22:23.186937 master-0 kubenswrapper[18707]: I0320 09:22:23.186783 18707 generic.go:334] "Generic (PLEG): container finished" podID="727d6a1e-0435-4642-bf04-f035c35bb0ae" containerID="e994927c129aa4feb2d2ca8743722f67a693c80d4e28425887b8247957a41821" exitCode=0 Mar 20 09:22:23.186937 master-0 kubenswrapper[18707]: I0320 09:22:23.186869 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" 
event={"ID":"727d6a1e-0435-4642-bf04-f035c35bb0ae","Type":"ContainerDied","Data":"e994927c129aa4feb2d2ca8743722f67a693c80d4e28425887b8247957a41821"} Mar 20 09:22:23.216239 master-0 kubenswrapper[18707]: I0320 09:22:23.216125 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" podStartSLOduration=2.563989037 podStartE2EDuration="3.216095277s" podCreationTimestamp="2026-03-20 09:22:20 +0000 UTC" firstStartedPulling="2026-03-20 09:22:21.206660366 +0000 UTC m=+2486.362840732" lastFinishedPulling="2026-03-20 09:22:21.858766576 +0000 UTC m=+2487.014946972" observedRunningTime="2026-03-20 09:22:22.198053765 +0000 UTC m=+2487.354234131" watchObservedRunningTime="2026-03-20 09:22:23.216095277 +0000 UTC m=+2488.372275643" Mar 20 09:22:24.732089 master-0 kubenswrapper[18707]: I0320 09:22:24.732030 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" Mar 20 09:22:24.850838 master-0 kubenswrapper[18707]: I0320 09:22:24.850798 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtx2h\" (UniqueName: \"kubernetes.io/projected/727d6a1e-0435-4642-bf04-f035c35bb0ae-kube-api-access-mtx2h\") pod \"727d6a1e-0435-4642-bf04-f035c35bb0ae\" (UID: \"727d6a1e-0435-4642-bf04-f035c35bb0ae\") " Mar 20 09:22:24.851201 master-0 kubenswrapper[18707]: I0320 09:22:24.851163 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/727d6a1e-0435-4642-bf04-f035c35bb0ae-inventory\") pod \"727d6a1e-0435-4642-bf04-f035c35bb0ae\" (UID: \"727d6a1e-0435-4642-bf04-f035c35bb0ae\") " Mar 20 09:22:24.851468 master-0 kubenswrapper[18707]: I0320 09:22:24.851454 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: 
\"kubernetes.io/secret/727d6a1e-0435-4642-bf04-f035c35bb0ae-ssh-key-edpm-b\") pod \"727d6a1e-0435-4642-bf04-f035c35bb0ae\" (UID: \"727d6a1e-0435-4642-bf04-f035c35bb0ae\") "
Mar 20 09:22:24.878784 master-0 kubenswrapper[18707]: I0320 09:22:24.878633 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/727d6a1e-0435-4642-bf04-f035c35bb0ae-kube-api-access-mtx2h" (OuterVolumeSpecName: "kube-api-access-mtx2h") pod "727d6a1e-0435-4642-bf04-f035c35bb0ae" (UID: "727d6a1e-0435-4642-bf04-f035c35bb0ae"). InnerVolumeSpecName "kube-api-access-mtx2h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:22:24.882520 master-0 kubenswrapper[18707]: I0320 09:22:24.882439 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727d6a1e-0435-4642-bf04-f035c35bb0ae-inventory" (OuterVolumeSpecName: "inventory") pod "727d6a1e-0435-4642-bf04-f035c35bb0ae" (UID: "727d6a1e-0435-4642-bf04-f035c35bb0ae"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:22:24.886120 master-0 kubenswrapper[18707]: I0320 09:22:24.886069 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/727d6a1e-0435-4642-bf04-f035c35bb0ae-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "727d6a1e-0435-4642-bf04-f035c35bb0ae" (UID: "727d6a1e-0435-4642-bf04-f035c35bb0ae"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:22:24.954825 master-0 kubenswrapper[18707]: I0320 09:22:24.954765 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtx2h\" (UniqueName: \"kubernetes.io/projected/727d6a1e-0435-4642-bf04-f035c35bb0ae-kube-api-access-mtx2h\") on node \"master-0\" DevicePath \"\""
Mar 20 09:22:24.954825 master-0 kubenswrapper[18707]: I0320 09:22:24.954811 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/727d6a1e-0435-4642-bf04-f035c35bb0ae-inventory\") on node \"master-0\" DevicePath \"\""
Mar 20 09:22:24.954825 master-0 kubenswrapper[18707]: I0320 09:22:24.954823 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/727d6a1e-0435-4642-bf04-f035c35bb0ae-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\""
Mar 20 09:22:25.210898 master-0 kubenswrapper[18707]: I0320 09:22:25.210690 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl" event={"ID":"727d6a1e-0435-4642-bf04-f035c35bb0ae","Type":"ContainerDied","Data":"bd9315d7883c067e8c8b94b7e8d5ce0782ff2b70f8ac3b9c2dd5915ca4622fed"}
Mar 20 09:22:25.211151 master-0 kubenswrapper[18707]: I0320 09:22:25.211134 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd9315d7883c067e8c8b94b7e8d5ce0782ff2b70f8ac3b9c2dd5915ca4622fed"
Mar 20 09:22:25.211244 master-0 kubenswrapper[18707]: I0320 09:22:25.210813 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-dataplane-step-1-edpm-b-jdmpl"
Mar 20 09:22:25.301801 master-0 kubenswrapper[18707]: I0320 09:22:25.301722 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-dataplane-step-1-edpm-b-4xvvh"]
Mar 20 09:22:25.302516 master-0 kubenswrapper[18707]: E0320 09:22:25.302475 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="727d6a1e-0435-4642-bf04-f035c35bb0ae" containerName="configure-os-dataplane-step-1-edpm-b"
Mar 20 09:22:25.302516 master-0 kubenswrapper[18707]: I0320 09:22:25.302507 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="727d6a1e-0435-4642-bf04-f035c35bb0ae" containerName="configure-os-dataplane-step-1-edpm-b"
Mar 20 09:22:25.302918 master-0 kubenswrapper[18707]: I0320 09:22:25.302883 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="727d6a1e-0435-4642-bf04-f035c35bb0ae" containerName="configure-os-dataplane-step-1-edpm-b"
Mar 20 09:22:25.304108 master-0 kubenswrapper[18707]: I0320 09:22:25.304068 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh"
Mar 20 09:22:25.323211 master-0 kubenswrapper[18707]: I0320 09:22:25.320434 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-dataplane-step-1-edpm-b-4xvvh"]
Mar 20 09:22:25.470383 master-0 kubenswrapper[18707]: I0320 09:22:25.470240 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c98429a-3a91-4e17-9831-3d368b1083b4-inventory\") pod \"run-os-dataplane-step-1-edpm-b-4xvvh\" (UID: \"7c98429a-3a91-4e17-9831-3d368b1083b4\") " pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh"
Mar 20 09:22:25.470891 master-0 kubenswrapper[18707]: I0320 09:22:25.470843 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/7c98429a-3a91-4e17-9831-3d368b1083b4-ssh-key-edpm-b\") pod \"run-os-dataplane-step-1-edpm-b-4xvvh\" (UID: \"7c98429a-3a91-4e17-9831-3d368b1083b4\") " pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh"
Mar 20 09:22:25.471090 master-0 kubenswrapper[18707]: I0320 09:22:25.471066 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qxhk\" (UniqueName: \"kubernetes.io/projected/7c98429a-3a91-4e17-9831-3d368b1083b4-kube-api-access-2qxhk\") pod \"run-os-dataplane-step-1-edpm-b-4xvvh\" (UID: \"7c98429a-3a91-4e17-9831-3d368b1083b4\") " pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh"
Mar 20 09:22:25.573683 master-0 kubenswrapper[18707]: I0320 09:22:25.573610 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/7c98429a-3a91-4e17-9831-3d368b1083b4-ssh-key-edpm-b\") pod \"run-os-dataplane-step-1-edpm-b-4xvvh\" (UID: \"7c98429a-3a91-4e17-9831-3d368b1083b4\") " pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh"
Mar 20 09:22:25.573949 master-0 kubenswrapper[18707]: I0320 09:22:25.573825 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qxhk\" (UniqueName: \"kubernetes.io/projected/7c98429a-3a91-4e17-9831-3d368b1083b4-kube-api-access-2qxhk\") pod \"run-os-dataplane-step-1-edpm-b-4xvvh\" (UID: \"7c98429a-3a91-4e17-9831-3d368b1083b4\") " pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh"
Mar 20 09:22:25.573949 master-0 kubenswrapper[18707]: I0320 09:22:25.573894 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c98429a-3a91-4e17-9831-3d368b1083b4-inventory\") pod \"run-os-dataplane-step-1-edpm-b-4xvvh\" (UID: \"7c98429a-3a91-4e17-9831-3d368b1083b4\") " pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh"
Mar 20 09:22:25.578243 master-0 kubenswrapper[18707]: I0320 09:22:25.577877 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/7c98429a-3a91-4e17-9831-3d368b1083b4-ssh-key-edpm-b\") pod \"run-os-dataplane-step-1-edpm-b-4xvvh\" (UID: \"7c98429a-3a91-4e17-9831-3d368b1083b4\") " pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh"
Mar 20 09:22:25.578504 master-0 kubenswrapper[18707]: I0320 09:22:25.578263 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c98429a-3a91-4e17-9831-3d368b1083b4-inventory\") pod \"run-os-dataplane-step-1-edpm-b-4xvvh\" (UID: \"7c98429a-3a91-4e17-9831-3d368b1083b4\") " pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh"
Mar 20 09:22:25.604036 master-0 kubenswrapper[18707]: I0320 09:22:25.603964 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qxhk\" (UniqueName: \"kubernetes.io/projected/7c98429a-3a91-4e17-9831-3d368b1083b4-kube-api-access-2qxhk\") pod \"run-os-dataplane-step-1-edpm-b-4xvvh\" (UID: \"7c98429a-3a91-4e17-9831-3d368b1083b4\") " pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh"
Mar 20 09:22:25.640940 master-0 kubenswrapper[18707]: I0320 09:22:25.640845 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh"
Mar 20 09:22:26.255066 master-0 kubenswrapper[18707]: W0320 09:22:26.254353 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c98429a_3a91_4e17_9831_3d368b1083b4.slice/crio-a884e436f7cd09c814fce176fb4273f517b6850dd1752ff1e1f71912029ac8b3 WatchSource:0}: Error finding container a884e436f7cd09c814fce176fb4273f517b6850dd1752ff1e1f71912029ac8b3: Status 404 returned error can't find the container with id a884e436f7cd09c814fce176fb4273f517b6850dd1752ff1e1f71912029ac8b3
Mar 20 09:22:26.257280 master-0 kubenswrapper[18707]: I0320 09:22:26.256791 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-dataplane-step-1-edpm-b-4xvvh"]
Mar 20 09:22:27.239519 master-0 kubenswrapper[18707]: I0320 09:22:27.239451 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh" event={"ID":"7c98429a-3a91-4e17-9831-3d368b1083b4","Type":"ContainerStarted","Data":"cc3617a35edf106baf303912e541c5069ee24efc491f4f72857ee2c645f1b079"}
Mar 20 09:22:27.239519 master-0 kubenswrapper[18707]: I0320 09:22:27.239514 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh" event={"ID":"7c98429a-3a91-4e17-9831-3d368b1083b4","Type":"ContainerStarted","Data":"a884e436f7cd09c814fce176fb4273f517b6850dd1752ff1e1f71912029ac8b3"}
Mar 20 09:22:27.259635 master-0 kubenswrapper[18707]: I0320 09:22:27.259551 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh" podStartSLOduration=1.858344975 podStartE2EDuration="2.259516682s" podCreationTimestamp="2026-03-20 09:22:25 +0000 UTC" firstStartedPulling="2026-03-20 09:22:26.258591635 +0000 UTC m=+2491.414771991" lastFinishedPulling="2026-03-20 09:22:26.659763342 +0000 UTC m=+2491.815943698" observedRunningTime="2026-03-20 09:22:27.258913725 +0000 UTC m=+2492.415094091" watchObservedRunningTime="2026-03-20 09:22:27.259516682 +0000 UTC m=+2492.415697038"
Mar 20 09:22:29.262038 master-0 kubenswrapper[18707]: I0320 09:22:29.261976 18707 generic.go:334] "Generic (PLEG): container finished" podID="79fdbf0e-dda0-4314-8a37-7f78451f99d6" containerID="c202f36bbbf21cb07467a7ac9385b47852d509f6fead6090922e149cf97fb1ae" exitCode=0
Mar 20 09:22:29.262829 master-0 kubenswrapper[18707]: I0320 09:22:29.262077 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" event={"ID":"79fdbf0e-dda0-4314-8a37-7f78451f99d6","Type":"ContainerDied","Data":"c202f36bbbf21cb07467a7ac9385b47852d509f6fead6090922e149cf97fb1ae"}
Mar 20 09:22:30.807638 master-0 kubenswrapper[18707]: I0320 09:22:30.807585 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m"
Mar 20 09:22:30.935781 master-0 kubenswrapper[18707]: I0320 09:22:30.935571 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-inventory-1\") pod \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") "
Mar 20 09:22:30.935781 master-0 kubenswrapper[18707]: I0320 09:22:30.935622 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpczb\" (UniqueName: \"kubernetes.io/projected/79fdbf0e-dda0-4314-8a37-7f78451f99d6-kube-api-access-xpczb\") pod \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") "
Mar 20 09:22:30.935781 master-0 kubenswrapper[18707]: I0320 09:22:30.935686 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-ssh-key-edpm-a\") pod \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") "
Mar 20 09:22:30.935781 master-0 kubenswrapper[18707]: I0320 09:22:30.935743 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-ssh-key-edpm-b\") pod \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") "
Mar 20 09:22:30.936039 master-0 kubenswrapper[18707]: I0320 09:22:30.935819 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-inventory-0\") pod \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\" (UID: \"79fdbf0e-dda0-4314-8a37-7f78451f99d6\") "
Mar 20 09:22:30.942021 master-0 kubenswrapper[18707]: I0320 09:22:30.940743 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79fdbf0e-dda0-4314-8a37-7f78451f99d6-kube-api-access-xpczb" (OuterVolumeSpecName: "kube-api-access-xpczb") pod "79fdbf0e-dda0-4314-8a37-7f78451f99d6" (UID: "79fdbf0e-dda0-4314-8a37-7f78451f99d6"). InnerVolumeSpecName "kube-api-access-xpczb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:22:30.973729 master-0 kubenswrapper[18707]: I0320 09:22:30.973002 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "79fdbf0e-dda0-4314-8a37-7f78451f99d6" (UID: "79fdbf0e-dda0-4314-8a37-7f78451f99d6"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:22:30.975867 master-0 kubenswrapper[18707]: I0320 09:22:30.975826 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-inventory-1" (OuterVolumeSpecName: "inventory-1") pod "79fdbf0e-dda0-4314-8a37-7f78451f99d6" (UID: "79fdbf0e-dda0-4314-8a37-7f78451f99d6"). InnerVolumeSpecName "inventory-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:22:30.976761 master-0 kubenswrapper[18707]: I0320 09:22:30.976664 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "79fdbf0e-dda0-4314-8a37-7f78451f99d6" (UID: "79fdbf0e-dda0-4314-8a37-7f78451f99d6"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:22:30.981520 master-0 kubenswrapper[18707]: I0320 09:22:30.981352 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "79fdbf0e-dda0-4314-8a37-7f78451f99d6" (UID: "79fdbf0e-dda0-4314-8a37-7f78451f99d6"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:22:31.038692 master-0 kubenswrapper[18707]: I0320 09:22:31.038632 18707 reconciler_common.go:293] "Volume detached for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-inventory-1\") on node \"master-0\" DevicePath \"\""
Mar 20 09:22:31.038692 master-0 kubenswrapper[18707]: I0320 09:22:31.038687 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpczb\" (UniqueName: \"kubernetes.io/projected/79fdbf0e-dda0-4314-8a37-7f78451f99d6-kube-api-access-xpczb\") on node \"master-0\" DevicePath \"\""
Mar 20 09:22:31.038692 master-0 kubenswrapper[18707]: I0320 09:22:31.038700 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\""
Mar 20 09:22:31.038692 master-0 kubenswrapper[18707]: I0320 09:22:31.038709 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\""
Mar 20 09:22:31.038692 master-0 kubenswrapper[18707]: I0320 09:22:31.038718 18707 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/79fdbf0e-dda0-4314-8a37-7f78451f99d6-inventory-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:22:31.289752 master-0 kubenswrapper[18707]: I0320 09:22:31.289584 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m" event={"ID":"79fdbf0e-dda0-4314-8a37-7f78451f99d6","Type":"ContainerDied","Data":"ba7be223f47aae2db83cae0298c5dfaea31f7740c2174644083895abc723929a"}
Mar 20 09:22:31.289752 master-0 kubenswrapper[18707]: I0320 09:22:31.289626 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-dataplane-step-1-kt96m"
Mar 20 09:22:31.289752 master-0 kubenswrapper[18707]: I0320 09:22:31.289638 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba7be223f47aae2db83cae0298c5dfaea31f7740c2174644083895abc723929a"
Mar 20 09:22:31.430764 master-0 kubenswrapper[18707]: I0320 09:22:31.420422 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-dataplane-step-1-edpm-a-mtbhq"]
Mar 20 09:22:31.430764 master-0 kubenswrapper[18707]: E0320 09:22:31.421698 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fdbf0e-dda0-4314-8a37-7f78451f99d6" containerName="ssh-known-hosts-dataplane-step-1"
Mar 20 09:22:31.430764 master-0 kubenswrapper[18707]: I0320 09:22:31.421727 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fdbf0e-dda0-4314-8a37-7f78451f99d6" containerName="ssh-known-hosts-dataplane-step-1"
Mar 20 09:22:31.430764 master-0 kubenswrapper[18707]: I0320 09:22:31.422449 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fdbf0e-dda0-4314-8a37-7f78451f99d6" containerName="ssh-known-hosts-dataplane-step-1"
Mar 20 09:22:31.430764 master-0 kubenswrapper[18707]: I0320 09:22:31.423987 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq"
Mar 20 09:22:31.438471 master-0 kubenswrapper[18707]: I0320 09:22:31.438393 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a"
Mar 20 09:22:31.457483 master-0 kubenswrapper[18707]: I0320 09:22:31.455897 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-dataplane-step-1-edpm-a-mtbhq"]
Mar 20 09:22:31.552580 master-0 kubenswrapper[18707]: I0320 09:22:31.552494 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk5s9\" (UniqueName: \"kubernetes.io/projected/81883ed0-a696-465c-8e9d-f60820c4e8c0-kube-api-access-sk5s9\") pod \"run-os-dataplane-step-1-edpm-a-mtbhq\" (UID: \"81883ed0-a696-465c-8e9d-f60820c4e8c0\") " pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq"
Mar 20 09:22:31.552844 master-0 kubenswrapper[18707]: I0320 09:22:31.552647 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/81883ed0-a696-465c-8e9d-f60820c4e8c0-ssh-key-edpm-a\") pod \"run-os-dataplane-step-1-edpm-a-mtbhq\" (UID: \"81883ed0-a696-465c-8e9d-f60820c4e8c0\") " pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq"
Mar 20 09:22:31.552844 master-0 kubenswrapper[18707]: I0320 09:22:31.552835 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81883ed0-a696-465c-8e9d-f60820c4e8c0-inventory\") pod \"run-os-dataplane-step-1-edpm-a-mtbhq\" (UID: \"81883ed0-a696-465c-8e9d-f60820c4e8c0\") " pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq"
Mar 20 09:22:31.654736 master-0 kubenswrapper[18707]: I0320 09:22:31.654674 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81883ed0-a696-465c-8e9d-f60820c4e8c0-inventory\") pod \"run-os-dataplane-step-1-edpm-a-mtbhq\" (UID: \"81883ed0-a696-465c-8e9d-f60820c4e8c0\") " pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq"
Mar 20 09:22:31.655004 master-0 kubenswrapper[18707]: I0320 09:22:31.654777 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk5s9\" (UniqueName: \"kubernetes.io/projected/81883ed0-a696-465c-8e9d-f60820c4e8c0-kube-api-access-sk5s9\") pod \"run-os-dataplane-step-1-edpm-a-mtbhq\" (UID: \"81883ed0-a696-465c-8e9d-f60820c4e8c0\") " pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq"
Mar 20 09:22:31.655004 master-0 kubenswrapper[18707]: I0320 09:22:31.654844 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/81883ed0-a696-465c-8e9d-f60820c4e8c0-ssh-key-edpm-a\") pod \"run-os-dataplane-step-1-edpm-a-mtbhq\" (UID: \"81883ed0-a696-465c-8e9d-f60820c4e8c0\") " pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq"
Mar 20 09:22:31.659496 master-0 kubenswrapper[18707]: I0320 09:22:31.659459 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/81883ed0-a696-465c-8e9d-f60820c4e8c0-ssh-key-edpm-a\") pod \"run-os-dataplane-step-1-edpm-a-mtbhq\" (UID: \"81883ed0-a696-465c-8e9d-f60820c4e8c0\") " pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq"
Mar 20 09:22:31.660262 master-0 kubenswrapper[18707]: I0320 09:22:31.660145 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81883ed0-a696-465c-8e9d-f60820c4e8c0-inventory\") pod \"run-os-dataplane-step-1-edpm-a-mtbhq\" (UID: \"81883ed0-a696-465c-8e9d-f60820c4e8c0\") " pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq"
Mar 20 09:22:31.672785 master-0 kubenswrapper[18707]: I0320 09:22:31.672724 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk5s9\" (UniqueName: \"kubernetes.io/projected/81883ed0-a696-465c-8e9d-f60820c4e8c0-kube-api-access-sk5s9\") pod \"run-os-dataplane-step-1-edpm-a-mtbhq\" (UID: \"81883ed0-a696-465c-8e9d-f60820c4e8c0\") " pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq"
Mar 20 09:22:31.783267 master-0 kubenswrapper[18707]: I0320 09:22:31.783210 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq"
Mar 20 09:22:32.393282 master-0 kubenswrapper[18707]: I0320 09:22:32.392985 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-dataplane-step-1-edpm-a-mtbhq"]
Mar 20 09:22:33.315590 master-0 kubenswrapper[18707]: I0320 09:22:33.315524 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq" event={"ID":"81883ed0-a696-465c-8e9d-f60820c4e8c0","Type":"ContainerStarted","Data":"dafae84f884c811cb97e1d2ad68e13d749b960db0880d7d40c3fa65ebae28ecb"}
Mar 20 09:22:33.315590 master-0 kubenswrapper[18707]: I0320 09:22:33.315578 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq" event={"ID":"81883ed0-a696-465c-8e9d-f60820c4e8c0","Type":"ContainerStarted","Data":"2014ec6998e9fb6077b42f2e0cb3a2ec3e4fa7f77d3b6f0f8e3853d2a1700054"}
Mar 20 09:22:33.339381 master-0 kubenswrapper[18707]: I0320 09:22:33.339278 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq" podStartSLOduration=1.794555014 podStartE2EDuration="2.339254149s" podCreationTimestamp="2026-03-20 09:22:31 +0000 UTC" firstStartedPulling="2026-03-20 09:22:32.392127461 +0000 UTC m=+2497.548307817" lastFinishedPulling="2026-03-20 09:22:32.936826596 +0000 UTC m=+2498.093006952" observedRunningTime="2026-03-20 09:22:33.336856011 +0000 UTC m=+2498.493036367" watchObservedRunningTime="2026-03-20 09:22:33.339254149 +0000 UTC m=+2498.495434505"
Mar 20 09:22:34.344946 master-0 kubenswrapper[18707]: I0320 09:22:34.344770 18707 generic.go:334] "Generic (PLEG): container finished" podID="7c98429a-3a91-4e17-9831-3d368b1083b4" containerID="cc3617a35edf106baf303912e541c5069ee24efc491f4f72857ee2c645f1b079" exitCode=0
Mar 20 09:22:34.344946 master-0 kubenswrapper[18707]: I0320 09:22:34.344873 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh" event={"ID":"7c98429a-3a91-4e17-9831-3d368b1083b4","Type":"ContainerDied","Data":"cc3617a35edf106baf303912e541c5069ee24efc491f4f72857ee2c645f1b079"}
Mar 20 09:22:36.008430 master-0 kubenswrapper[18707]: I0320 09:22:36.008366 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh"
Mar 20 09:22:36.181541 master-0 kubenswrapper[18707]: I0320 09:22:36.181394 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qxhk\" (UniqueName: \"kubernetes.io/projected/7c98429a-3a91-4e17-9831-3d368b1083b4-kube-api-access-2qxhk\") pod \"7c98429a-3a91-4e17-9831-3d368b1083b4\" (UID: \"7c98429a-3a91-4e17-9831-3d368b1083b4\") "
Mar 20 09:22:36.181774 master-0 kubenswrapper[18707]: I0320 09:22:36.181652 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/7c98429a-3a91-4e17-9831-3d368b1083b4-ssh-key-edpm-b\") pod \"7c98429a-3a91-4e17-9831-3d368b1083b4\" (UID: \"7c98429a-3a91-4e17-9831-3d368b1083b4\") "
Mar 20 09:22:36.181774 master-0 kubenswrapper[18707]: I0320 09:22:36.181724 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c98429a-3a91-4e17-9831-3d368b1083b4-inventory\") pod \"7c98429a-3a91-4e17-9831-3d368b1083b4\" (UID: \"7c98429a-3a91-4e17-9831-3d368b1083b4\") "
Mar 20 09:22:36.188532 master-0 kubenswrapper[18707]: I0320 09:22:36.186227 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c98429a-3a91-4e17-9831-3d368b1083b4-kube-api-access-2qxhk" (OuterVolumeSpecName: "kube-api-access-2qxhk") pod "7c98429a-3a91-4e17-9831-3d368b1083b4" (UID: "7c98429a-3a91-4e17-9831-3d368b1083b4"). InnerVolumeSpecName "kube-api-access-2qxhk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:22:36.226280 master-0 kubenswrapper[18707]: I0320 09:22:36.225591 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c98429a-3a91-4e17-9831-3d368b1083b4-inventory" (OuterVolumeSpecName: "inventory") pod "7c98429a-3a91-4e17-9831-3d368b1083b4" (UID: "7c98429a-3a91-4e17-9831-3d368b1083b4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:22:36.226280 master-0 kubenswrapper[18707]: I0320 09:22:36.225809 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c98429a-3a91-4e17-9831-3d368b1083b4-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "7c98429a-3a91-4e17-9831-3d368b1083b4" (UID: "7c98429a-3a91-4e17-9831-3d368b1083b4"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:22:36.286789 master-0 kubenswrapper[18707]: I0320 09:22:36.286740 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qxhk\" (UniqueName: \"kubernetes.io/projected/7c98429a-3a91-4e17-9831-3d368b1083b4-kube-api-access-2qxhk\") on node \"master-0\" DevicePath \"\""
Mar 20 09:22:36.286789 master-0 kubenswrapper[18707]: I0320 09:22:36.286783 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/7c98429a-3a91-4e17-9831-3d368b1083b4-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\""
Mar 20 09:22:36.286789 master-0 kubenswrapper[18707]: I0320 09:22:36.286794 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c98429a-3a91-4e17-9831-3d368b1083b4-inventory\") on node \"master-0\" DevicePath \"\""
Mar 20 09:22:36.370516 master-0 kubenswrapper[18707]: I0320 09:22:36.370459 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh" event={"ID":"7c98429a-3a91-4e17-9831-3d368b1083b4","Type":"ContainerDied","Data":"a884e436f7cd09c814fce176fb4273f517b6850dd1752ff1e1f71912029ac8b3"}
Mar 20 09:22:36.370516 master-0 kubenswrapper[18707]: I0320 09:22:36.370515 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a884e436f7cd09c814fce176fb4273f517b6850dd1752ff1e1f71912029ac8b3"
Mar 20 09:22:36.370846 master-0 kubenswrapper[18707]: I0320 09:22:36.370532 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-dataplane-step-1-edpm-b-4xvvh"
Mar 20 09:22:37.222792 master-0 kubenswrapper[18707]: I0320 09:22:37.222693 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"]
Mar 20 09:22:37.223523 master-0 kubenswrapper[18707]: E0320 09:22:37.223378 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c98429a-3a91-4e17-9831-3d368b1083b4" containerName="run-os-dataplane-step-1-edpm-b"
Mar 20 09:22:37.223523 master-0 kubenswrapper[18707]: I0320 09:22:37.223397 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c98429a-3a91-4e17-9831-3d368b1083b4" containerName="run-os-dataplane-step-1-edpm-b"
Mar 20 09:22:37.223740 master-0 kubenswrapper[18707]: I0320 09:22:37.223713 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c98429a-3a91-4e17-9831-3d368b1083b4" containerName="run-os-dataplane-step-1-edpm-b"
Mar 20 09:22:37.224592 master-0 kubenswrapper[18707]: I0320 09:22:37.224560 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"
Mar 20 09:22:37.226490 master-0 kubenswrapper[18707]: I0320 09:22:37.226316 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b"
Mar 20 09:22:37.421769 master-0 kubenswrapper[18707]: I0320 09:22:37.421676 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6q65\" (UniqueName: \"kubernetes.io/projected/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-kube-api-access-n6q65\") pod \"reboot-os-dataplane-step-1-edpm-b-9dw2p\" (UID: \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\") " pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"
Mar 20 09:22:37.422266 master-0 kubenswrapper[18707]: I0320 09:22:37.422202 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-ssh-key-edpm-b\") pod \"reboot-os-dataplane-step-1-edpm-b-9dw2p\" (UID: \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\") " pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"
Mar 20 09:22:37.422666 master-0 kubenswrapper[18707]: I0320 09:22:37.422635 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-inventory\") pod \"reboot-os-dataplane-step-1-edpm-b-9dw2p\" (UID: \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\") " pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"
Mar 20 09:22:37.445222 master-0 kubenswrapper[18707]: I0320 09:22:37.445112 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"]
Mar 20 09:22:37.527464 master-0 kubenswrapper[18707]: I0320 09:22:37.527346 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6q65\" (UniqueName: \"kubernetes.io/projected/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-kube-api-access-n6q65\") pod \"reboot-os-dataplane-step-1-edpm-b-9dw2p\" (UID: \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\") " pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"
Mar 20 09:22:37.527682 master-0 kubenswrapper[18707]: I0320 09:22:37.527659 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-ssh-key-edpm-b\") pod \"reboot-os-dataplane-step-1-edpm-b-9dw2p\" (UID: \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\") " pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"
Mar 20 09:22:37.527740 master-0 kubenswrapper[18707]: I0320 09:22:37.527724 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-inventory\") pod \"reboot-os-dataplane-step-1-edpm-b-9dw2p\" (UID: \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\") " pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"
Mar 20 09:22:37.535416 master-0 kubenswrapper[18707]: I0320 09:22:37.531541 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-inventory\") pod \"reboot-os-dataplane-step-1-edpm-b-9dw2p\" (UID: \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\") " pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"
Mar 20 09:22:37.535416 master-0 kubenswrapper[18707]: I0320 09:22:37.533324 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-ssh-key-edpm-b\") pod \"reboot-os-dataplane-step-1-edpm-b-9dw2p\" (UID: \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\") " pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"
Mar 20 09:22:37.582317 master-0 kubenswrapper[18707]: I0320 09:22:37.582083 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6q65\" (UniqueName: \"kubernetes.io/projected/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-kube-api-access-n6q65\") pod \"reboot-os-dataplane-step-1-edpm-b-9dw2p\" (UID: \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\") " pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"
Mar 20 09:22:37.845956 master-0 kubenswrapper[18707]: I0320 09:22:37.845410 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"
Mar 20 09:22:38.706990 master-0 kubenswrapper[18707]: W0320 09:22:38.706894 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43efd6b6_a06f_46f2_bcdf_4f7568aa6822.slice/crio-b6ee120d49e485bfe38b77e4abe70dfe0761ffe095b0d3cbb2f0ceacc7a84289 WatchSource:0}: Error finding container b6ee120d49e485bfe38b77e4abe70dfe0761ffe095b0d3cbb2f0ceacc7a84289: Status 404 returned error can't find the container with id b6ee120d49e485bfe38b77e4abe70dfe0761ffe095b0d3cbb2f0ceacc7a84289
Mar 20 09:22:38.710696 master-0 kubenswrapper[18707]: I0320 09:22:38.710639 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p"]
Mar 20 09:22:39.421610 master-0 kubenswrapper[18707]: I0320 09:22:39.421109 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p" event={"ID":"43efd6b6-a06f-46f2-bcdf-4f7568aa6822","Type":"ContainerStarted","Data":"b6ee120d49e485bfe38b77e4abe70dfe0761ffe095b0d3cbb2f0ceacc7a84289"}
Mar 20 09:22:40.434729 master-0 kubenswrapper[18707]: I0320 09:22:40.434643 18707 generic.go:334] "Generic (PLEG): container finished" podID="81883ed0-a696-465c-8e9d-f60820c4e8c0" containerID="dafae84f884c811cb97e1d2ad68e13d749b960db0880d7d40c3fa65ebae28ecb" exitCode=0
Mar 20 09:22:40.434729 master-0 kubenswrapper[18707]: I0320 09:22:40.434739 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq" event={"ID":"81883ed0-a696-465c-8e9d-f60820c4e8c0","Type":"ContainerDied","Data":"dafae84f884c811cb97e1d2ad68e13d749b960db0880d7d40c3fa65ebae28ecb"}
Mar 20 09:22:40.436548 master-0 kubenswrapper[18707]: I0320 09:22:40.436511 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p" event={"ID":"43efd6b6-a06f-46f2-bcdf-4f7568aa6822","Type":"ContainerStarted","Data":"7c12afaad26bd5e1f0fc1c2401ae315940f8cdd4ad82a8ed8488872855ad1350"}
Mar 20 09:22:40.514609 master-0 kubenswrapper[18707]: I0320 09:22:40.514130 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p" podStartSLOduration=4.002396673 podStartE2EDuration="4.514100805s" podCreationTimestamp="2026-03-20 09:22:36 +0000 UTC" firstStartedPulling="2026-03-20 09:22:38.709656568 +0000 UTC m=+2503.865836924" lastFinishedPulling="2026-03-20 09:22:39.2213607 +0000 UTC m=+2504.377541056" observedRunningTime="2026-03-20 09:22:40.508637011 +0000 UTC m=+2505.664817387" watchObservedRunningTime="2026-03-20 09:22:40.514100805 +0000 UTC m=+2505.670281161"
Mar 20 09:22:42.021577 master-0 kubenswrapper[18707]: I0320 09:22:42.016395 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq"
Mar 20 09:22:42.211278 master-0 kubenswrapper[18707]: I0320 09:22:42.205915 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk5s9\" (UniqueName: \"kubernetes.io/projected/81883ed0-a696-465c-8e9d-f60820c4e8c0-kube-api-access-sk5s9\") pod \"81883ed0-a696-465c-8e9d-f60820c4e8c0\" (UID: \"81883ed0-a696-465c-8e9d-f60820c4e8c0\") "
Mar 20 09:22:42.211278 master-0 kubenswrapper[18707]: I0320 09:22:42.206057 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81883ed0-a696-465c-8e9d-f60820c4e8c0-inventory\") pod \"81883ed0-a696-465c-8e9d-f60820c4e8c0\" (UID: \"81883ed0-a696-465c-8e9d-f60820c4e8c0\") "
Mar 20 09:22:42.211278 master-0 kubenswrapper[18707]: I0320 09:22:42.206123 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/81883ed0-a696-465c-8e9d-f60820c4e8c0-ssh-key-edpm-a\") pod \"81883ed0-a696-465c-8e9d-f60820c4e8c0\" (UID: \"81883ed0-a696-465c-8e9d-f60820c4e8c0\") "
Mar 20 09:22:42.216778 master-0 kubenswrapper[18707]: I0320 09:22:42.213535 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81883ed0-a696-465c-8e9d-f60820c4e8c0-kube-api-access-sk5s9" (OuterVolumeSpecName: "kube-api-access-sk5s9") pod "81883ed0-a696-465c-8e9d-f60820c4e8c0" (UID: "81883ed0-a696-465c-8e9d-f60820c4e8c0"). InnerVolumeSpecName "kube-api-access-sk5s9".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:22:42.233078 master-0 kubenswrapper[18707]: I0320 09:22:42.233014 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81883ed0-a696-465c-8e9d-f60820c4e8c0-inventory" (OuterVolumeSpecName: "inventory") pod "81883ed0-a696-465c-8e9d-f60820c4e8c0" (UID: "81883ed0-a696-465c-8e9d-f60820c4e8c0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:22:42.238216 master-0 kubenswrapper[18707]: I0320 09:22:42.238146 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81883ed0-a696-465c-8e9d-f60820c4e8c0-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "81883ed0-a696-465c-8e9d-f60820c4e8c0" (UID: "81883ed0-a696-465c-8e9d-f60820c4e8c0"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:22:42.310211 master-0 kubenswrapper[18707]: I0320 09:22:42.310026 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81883ed0-a696-465c-8e9d-f60820c4e8c0-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:22:42.310211 master-0 kubenswrapper[18707]: I0320 09:22:42.310067 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/81883ed0-a696-465c-8e9d-f60820c4e8c0-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:22:42.310211 master-0 kubenswrapper[18707]: I0320 09:22:42.310080 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk5s9\" (UniqueName: \"kubernetes.io/projected/81883ed0-a696-465c-8e9d-f60820c4e8c0-kube-api-access-sk5s9\") on node \"master-0\" DevicePath \"\"" Mar 20 09:22:42.457635 master-0 kubenswrapper[18707]: I0320 09:22:42.457578 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq" 
event={"ID":"81883ed0-a696-465c-8e9d-f60820c4e8c0","Type":"ContainerDied","Data":"2014ec6998e9fb6077b42f2e0cb3a2ec3e4fa7f77d3b6f0f8e3853d2a1700054"} Mar 20 09:22:42.457635 master-0 kubenswrapper[18707]: I0320 09:22:42.457629 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2014ec6998e9fb6077b42f2e0cb3a2ec3e4fa7f77d3b6f0f8e3853d2a1700054" Mar 20 09:22:42.457943 master-0 kubenswrapper[18707]: I0320 09:22:42.457694 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-dataplane-step-1-edpm-a-mtbhq" Mar 20 09:22:42.676477 master-0 kubenswrapper[18707]: I0320 09:22:42.676407 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j"] Mar 20 09:22:42.677050 master-0 kubenswrapper[18707]: E0320 09:22:42.677016 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81883ed0-a696-465c-8e9d-f60820c4e8c0" containerName="run-os-dataplane-step-1-edpm-a" Mar 20 09:22:42.677050 master-0 kubenswrapper[18707]: I0320 09:22:42.677036 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="81883ed0-a696-465c-8e9d-f60820c4e8c0" containerName="run-os-dataplane-step-1-edpm-a" Mar 20 09:22:42.677346 master-0 kubenswrapper[18707]: I0320 09:22:42.677311 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="81883ed0-a696-465c-8e9d-f60820c4e8c0" containerName="run-os-dataplane-step-1-edpm-a" Mar 20 09:22:42.678562 master-0 kubenswrapper[18707]: I0320 09:22:42.678211 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" Mar 20 09:22:42.683655 master-0 kubenswrapper[18707]: I0320 09:22:42.683609 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a" Mar 20 09:22:42.693268 master-0 kubenswrapper[18707]: I0320 09:22:42.693198 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j"] Mar 20 09:22:42.730826 master-0 kubenswrapper[18707]: I0320 09:22:42.730691 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hwhg\" (UniqueName: \"kubernetes.io/projected/5cddf32e-d511-438c-903a-ee7cdce93de8-kube-api-access-6hwhg\") pod \"reboot-os-dataplane-step-1-edpm-a-rwc7j\" (UID: \"5cddf32e-d511-438c-903a-ee7cdce93de8\") " pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" Mar 20 09:22:42.731016 master-0 kubenswrapper[18707]: I0320 09:22:42.730932 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5cddf32e-d511-438c-903a-ee7cdce93de8-inventory\") pod \"reboot-os-dataplane-step-1-edpm-a-rwc7j\" (UID: \"5cddf32e-d511-438c-903a-ee7cdce93de8\") " pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" Mar 20 09:22:42.731016 master-0 kubenswrapper[18707]: I0320 09:22:42.730975 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/5cddf32e-d511-438c-903a-ee7cdce93de8-ssh-key-edpm-a\") pod \"reboot-os-dataplane-step-1-edpm-a-rwc7j\" (UID: \"5cddf32e-d511-438c-903a-ee7cdce93de8\") " pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" Mar 20 09:22:42.832153 master-0 kubenswrapper[18707]: I0320 09:22:42.832072 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hwhg\" (UniqueName: 
\"kubernetes.io/projected/5cddf32e-d511-438c-903a-ee7cdce93de8-kube-api-access-6hwhg\") pod \"reboot-os-dataplane-step-1-edpm-a-rwc7j\" (UID: \"5cddf32e-d511-438c-903a-ee7cdce93de8\") " pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" Mar 20 09:22:42.832445 master-0 kubenswrapper[18707]: I0320 09:22:42.832303 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5cddf32e-d511-438c-903a-ee7cdce93de8-inventory\") pod \"reboot-os-dataplane-step-1-edpm-a-rwc7j\" (UID: \"5cddf32e-d511-438c-903a-ee7cdce93de8\") " pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" Mar 20 09:22:42.832445 master-0 kubenswrapper[18707]: I0320 09:22:42.832336 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/5cddf32e-d511-438c-903a-ee7cdce93de8-ssh-key-edpm-a\") pod \"reboot-os-dataplane-step-1-edpm-a-rwc7j\" (UID: \"5cddf32e-d511-438c-903a-ee7cdce93de8\") " pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" Mar 20 09:22:42.840510 master-0 kubenswrapper[18707]: I0320 09:22:42.840172 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5cddf32e-d511-438c-903a-ee7cdce93de8-inventory\") pod \"reboot-os-dataplane-step-1-edpm-a-rwc7j\" (UID: \"5cddf32e-d511-438c-903a-ee7cdce93de8\") " pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" Mar 20 09:22:42.840510 master-0 kubenswrapper[18707]: I0320 09:22:42.840413 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/5cddf32e-d511-438c-903a-ee7cdce93de8-ssh-key-edpm-a\") pod \"reboot-os-dataplane-step-1-edpm-a-rwc7j\" (UID: \"5cddf32e-d511-438c-903a-ee7cdce93de8\") " pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" Mar 20 09:22:42.850638 master-0 kubenswrapper[18707]: I0320 09:22:42.850549 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hwhg\" (UniqueName: \"kubernetes.io/projected/5cddf32e-d511-438c-903a-ee7cdce93de8-kube-api-access-6hwhg\") pod \"reboot-os-dataplane-step-1-edpm-a-rwc7j\" (UID: \"5cddf32e-d511-438c-903a-ee7cdce93de8\") " pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" Mar 20 09:22:43.027541 master-0 kubenswrapper[18707]: I0320 09:22:43.027406 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" Mar 20 09:22:43.638715 master-0 kubenswrapper[18707]: I0320 09:22:43.638658 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j"] Mar 20 09:22:44.488460 master-0 kubenswrapper[18707]: I0320 09:22:44.488389 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" event={"ID":"5cddf32e-d511-438c-903a-ee7cdce93de8","Type":"ContainerStarted","Data":"da32d45fceb458285b1643cb7a68462c0f9786c034d63f718b5290e02e6bcd3e"} Mar 20 09:22:44.488460 master-0 kubenswrapper[18707]: I0320 09:22:44.488449 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" event={"ID":"5cddf32e-d511-438c-903a-ee7cdce93de8","Type":"ContainerStarted","Data":"48cd61e68a0addce3cea07266ffeabc4d7a99e1da79ed627f9782309b5c4c2f3"} Mar 20 09:22:44.670655 master-0 kubenswrapper[18707]: I0320 09:22:44.670569 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" podStartSLOduration=2.246228983 podStartE2EDuration="2.670549395s" podCreationTimestamp="2026-03-20 09:22:42 +0000 UTC" firstStartedPulling="2026-03-20 09:22:43.641120682 +0000 UTC m=+2508.797301028" lastFinishedPulling="2026-03-20 09:22:44.065441074 +0000 UTC m=+2509.221621440" observedRunningTime="2026-03-20 09:22:44.669889206 +0000 UTC m=+2509.826069582" 
watchObservedRunningTime="2026-03-20 09:22:44.670549395 +0000 UTC m=+2509.826729751" Mar 20 09:22:49.637048 master-0 kubenswrapper[18707]: I0320 09:22:49.636989 18707 generic.go:334] "Generic (PLEG): container finished" podID="43efd6b6-a06f-46f2-bcdf-4f7568aa6822" containerID="7c12afaad26bd5e1f0fc1c2401ae315940f8cdd4ad82a8ed8488872855ad1350" exitCode=0 Mar 20 09:22:49.637048 master-0 kubenswrapper[18707]: I0320 09:22:49.637041 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p" event={"ID":"43efd6b6-a06f-46f2-bcdf-4f7568aa6822","Type":"ContainerDied","Data":"7c12afaad26bd5e1f0fc1c2401ae315940f8cdd4ad82a8ed8488872855ad1350"} Mar 20 09:22:51.249432 master-0 kubenswrapper[18707]: I0320 09:22:51.249376 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p" Mar 20 09:22:51.363221 master-0 kubenswrapper[18707]: I0320 09:22:51.362881 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-ssh-key-edpm-b\") pod \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\" (UID: \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\") " Mar 20 09:22:51.363221 master-0 kubenswrapper[18707]: I0320 09:22:51.363108 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6q65\" (UniqueName: \"kubernetes.io/projected/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-kube-api-access-n6q65\") pod \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\" (UID: \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\") " Mar 20 09:22:51.363600 master-0 kubenswrapper[18707]: I0320 09:22:51.363324 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-inventory\") pod \"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\" (UID: 
\"43efd6b6-a06f-46f2-bcdf-4f7568aa6822\") " Mar 20 09:22:51.380138 master-0 kubenswrapper[18707]: I0320 09:22:51.379996 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-kube-api-access-n6q65" (OuterVolumeSpecName: "kube-api-access-n6q65") pod "43efd6b6-a06f-46f2-bcdf-4f7568aa6822" (UID: "43efd6b6-a06f-46f2-bcdf-4f7568aa6822"). InnerVolumeSpecName "kube-api-access-n6q65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:22:51.418181 master-0 kubenswrapper[18707]: I0320 09:22:51.417003 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "43efd6b6-a06f-46f2-bcdf-4f7568aa6822" (UID: "43efd6b6-a06f-46f2-bcdf-4f7568aa6822"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:22:51.418181 master-0 kubenswrapper[18707]: I0320 09:22:51.417854 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-inventory" (OuterVolumeSpecName: "inventory") pod "43efd6b6-a06f-46f2-bcdf-4f7568aa6822" (UID: "43efd6b6-a06f-46f2-bcdf-4f7568aa6822"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:22:51.466666 master-0 kubenswrapper[18707]: I0320 09:22:51.466503 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6q65\" (UniqueName: \"kubernetes.io/projected/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-kube-api-access-n6q65\") on node \"master-0\" DevicePath \"\"" Mar 20 09:22:51.466666 master-0 kubenswrapper[18707]: I0320 09:22:51.466566 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:22:51.466666 master-0 kubenswrapper[18707]: I0320 09:22:51.466586 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/43efd6b6-a06f-46f2-bcdf-4f7568aa6822-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:22:51.664563 master-0 kubenswrapper[18707]: I0320 09:22:51.664425 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p" event={"ID":"43efd6b6-a06f-46f2-bcdf-4f7568aa6822","Type":"ContainerDied","Data":"b6ee120d49e485bfe38b77e4abe70dfe0761ffe095b0d3cbb2f0ceacc7a84289"} Mar 20 09:22:51.664820 master-0 kubenswrapper[18707]: I0320 09:22:51.664800 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6ee120d49e485bfe38b77e4abe70dfe0761ffe095b0d3cbb2f0ceacc7a84289" Mar 20 09:22:51.664918 master-0 kubenswrapper[18707]: I0320 09:22:51.664463 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-dataplane-step-1-edpm-b-9dw2p" Mar 20 09:22:54.701719 master-0 kubenswrapper[18707]: I0320 09:22:54.701651 18707 generic.go:334] "Generic (PLEG): container finished" podID="5cddf32e-d511-438c-903a-ee7cdce93de8" containerID="da32d45fceb458285b1643cb7a68462c0f9786c034d63f718b5290e02e6bcd3e" exitCode=0 Mar 20 09:22:54.703035 master-0 kubenswrapper[18707]: I0320 09:22:54.702960 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" event={"ID":"5cddf32e-d511-438c-903a-ee7cdce93de8","Type":"ContainerDied","Data":"da32d45fceb458285b1643cb7a68462c0f9786c034d63f718b5290e02e6bcd3e"} Mar 20 09:22:56.294302 master-0 kubenswrapper[18707]: I0320 09:22:56.294164 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" Mar 20 09:22:56.313836 master-0 kubenswrapper[18707]: I0320 09:22:56.313760 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hwhg\" (UniqueName: \"kubernetes.io/projected/5cddf32e-d511-438c-903a-ee7cdce93de8-kube-api-access-6hwhg\") pod \"5cddf32e-d511-438c-903a-ee7cdce93de8\" (UID: \"5cddf32e-d511-438c-903a-ee7cdce93de8\") " Mar 20 09:22:56.320439 master-0 kubenswrapper[18707]: I0320 09:22:56.320372 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cddf32e-d511-438c-903a-ee7cdce93de8-kube-api-access-6hwhg" (OuterVolumeSpecName: "kube-api-access-6hwhg") pod "5cddf32e-d511-438c-903a-ee7cdce93de8" (UID: "5cddf32e-d511-438c-903a-ee7cdce93de8"). InnerVolumeSpecName "kube-api-access-6hwhg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:22:56.421292 master-0 kubenswrapper[18707]: I0320 09:22:56.421222 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/5cddf32e-d511-438c-903a-ee7cdce93de8-ssh-key-edpm-a\") pod \"5cddf32e-d511-438c-903a-ee7cdce93de8\" (UID: \"5cddf32e-d511-438c-903a-ee7cdce93de8\") " Mar 20 09:22:56.424529 master-0 kubenswrapper[18707]: I0320 09:22:56.421575 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5cddf32e-d511-438c-903a-ee7cdce93de8-inventory\") pod \"5cddf32e-d511-438c-903a-ee7cdce93de8\" (UID: \"5cddf32e-d511-438c-903a-ee7cdce93de8\") " Mar 20 09:22:56.424529 master-0 kubenswrapper[18707]: I0320 09:22:56.423469 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hwhg\" (UniqueName: \"kubernetes.io/projected/5cddf32e-d511-438c-903a-ee7cdce93de8-kube-api-access-6hwhg\") on node \"master-0\" DevicePath \"\"" Mar 20 09:22:56.454342 master-0 kubenswrapper[18707]: I0320 09:22:56.453937 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cddf32e-d511-438c-903a-ee7cdce93de8-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "5cddf32e-d511-438c-903a-ee7cdce93de8" (UID: "5cddf32e-d511-438c-903a-ee7cdce93de8"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:22:56.462803 master-0 kubenswrapper[18707]: I0320 09:22:56.462178 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cddf32e-d511-438c-903a-ee7cdce93de8-inventory" (OuterVolumeSpecName: "inventory") pod "5cddf32e-d511-438c-903a-ee7cdce93de8" (UID: "5cddf32e-d511-438c-903a-ee7cdce93de8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:22:56.524647 master-0 kubenswrapper[18707]: I0320 09:22:56.524602 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/5cddf32e-d511-438c-903a-ee7cdce93de8-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:22:56.524914 master-0 kubenswrapper[18707]: I0320 09:22:56.524901 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5cddf32e-d511-438c-903a-ee7cdce93de8-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:22:56.726500 master-0 kubenswrapper[18707]: I0320 09:22:56.726372 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" event={"ID":"5cddf32e-d511-438c-903a-ee7cdce93de8","Type":"ContainerDied","Data":"48cd61e68a0addce3cea07266ffeabc4d7a99e1da79ed627f9782309b5c4c2f3"} Mar 20 09:22:56.726500 master-0 kubenswrapper[18707]: I0320 09:22:56.726432 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48cd61e68a0addce3cea07266ffeabc4d7a99e1da79ed627f9782309b5c4c2f3" Mar 20 09:22:56.726500 master-0 kubenswrapper[18707]: I0320 09:22:56.726501 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-dataplane-step-1-edpm-a-rwc7j" Mar 20 09:23:06.142040 master-0 kubenswrapper[18707]: I0320 09:23:06.141970 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-dataplane-step-2-edpm-a-5xzqr"] Mar 20 09:23:06.142709 master-0 kubenswrapper[18707]: E0320 09:23:06.142679 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cddf32e-d511-438c-903a-ee7cdce93de8" containerName="reboot-os-dataplane-step-1-edpm-a" Mar 20 09:23:06.142709 master-0 kubenswrapper[18707]: I0320 09:23:06.142705 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cddf32e-d511-438c-903a-ee7cdce93de8" containerName="reboot-os-dataplane-step-1-edpm-a" Mar 20 09:23:06.142810 master-0 kubenswrapper[18707]: E0320 09:23:06.142768 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43efd6b6-a06f-46f2-bcdf-4f7568aa6822" containerName="reboot-os-dataplane-step-1-edpm-b" Mar 20 09:23:06.142810 master-0 kubenswrapper[18707]: I0320 09:23:06.142780 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="43efd6b6-a06f-46f2-bcdf-4f7568aa6822" containerName="reboot-os-dataplane-step-1-edpm-b" Mar 20 09:23:06.143166 master-0 kubenswrapper[18707]: I0320 09:23:06.143114 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cddf32e-d511-438c-903a-ee7cdce93de8" containerName="reboot-os-dataplane-step-1-edpm-a" Mar 20 09:23:06.143166 master-0 kubenswrapper[18707]: I0320 09:23:06.143165 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="43efd6b6-a06f-46f2-bcdf-4f7568aa6822" containerName="reboot-os-dataplane-step-1-edpm-b" Mar 20 09:23:06.144215 master-0 kubenswrapper[18707]: I0320 09:23:06.144178 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.147328 master-0 kubenswrapper[18707]: I0320 09:23:06.147260 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 09:23:06.147628 master-0 kubenswrapper[18707]: I0320 09:23:06.147596 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"edpm-a-libvirt-default-certs-0" Mar 20 09:23:06.148790 master-0 kubenswrapper[18707]: I0320 09:23:06.147950 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:23:06.148790 master-0 kubenswrapper[18707]: I0320 09:23:06.147963 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a" Mar 20 09:23:06.149847 master-0 kubenswrapper[18707]: I0320 09:23:06.149796 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"edpm-a-ovn-default-certs-0" Mar 20 09:23:06.153009 master-0 kubenswrapper[18707]: I0320 09:23:06.152934 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"edpm-a-neutron-metadata-default-certs-0" Mar 20 09:23:06.177936 master-0 kubenswrapper[18707]: I0320 09:23:06.177547 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-dataplane-step-2-edpm-b-w5v2r"] Mar 20 09:23:06.180229 master-0 kubenswrapper[18707]: I0320 09:23:06.180125 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.185513 master-0 kubenswrapper[18707]: I0320 09:23:06.185455 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"edpm-b-ovn-default-certs-0" Mar 20 09:23:06.186541 master-0 kubenswrapper[18707]: I0320 09:23:06.185793 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b" Mar 20 09:23:06.186541 master-0 kubenswrapper[18707]: I0320 09:23:06.185954 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"edpm-b-neutron-metadata-default-certs-0" Mar 20 09:23:06.186541 master-0 kubenswrapper[18707]: I0320 09:23:06.186097 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"edpm-b-libvirt-default-certs-0" Mar 20 09:23:06.200277 master-0 kubenswrapper[18707]: I0320 09:23:06.200170 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-dataplane-step-2-edpm-a-5xzqr"] Mar 20 09:23:06.238532 master-0 kubenswrapper[18707]: I0320 09:23:06.238467 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-dataplane-step-2-edpm-b-w5v2r"] Mar 20 09:23:06.283028 master-0 kubenswrapper[18707]: I0320 09:23:06.282949 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-neutron-metadata-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.283318 master-0 kubenswrapper[18707]: I0320 09:23:06.283041 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-ovn-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.283318 master-0 kubenswrapper[18707]: I0320 09:23:06.283066 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-ssh-key-edpm-a\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.283318 master-0 kubenswrapper[18707]: I0320 09:23:06.283093 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-ovn-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.283318 master-0 kubenswrapper[18707]: I0320 09:23:06.283117 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-neutron-metadata-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.283318 master-0 kubenswrapper[18707]: I0320 09:23:06.283246 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-libvirt-combined-ca-bundle\") pod 
\"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.283318 master-0 kubenswrapper[18707]: I0320 09:23:06.283291 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-inventory\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.283570 master-0 kubenswrapper[18707]: I0320 09:23:06.283338 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-b-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-libvirt-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.283570 master-0 kubenswrapper[18707]: I0320 09:23:06.283422 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-ovn-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.283660 master-0 kubenswrapper[18707]: I0320 09:23:06.283552 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-libvirt-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " 
pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.284116 master-0 kubenswrapper[18707]: I0320 09:23:06.284077 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-nova-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.284306 master-0 kubenswrapper[18707]: I0320 09:23:06.284260 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-inventory\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.284364 master-0 kubenswrapper[18707]: I0320 09:23:06.284322 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-neutron-metadata-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.284409 master-0 kubenswrapper[18707]: I0320 09:23:06.284376 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-a-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-ovn-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.284453 master-0 kubenswrapper[18707]: I0320 
09:23:06.284422 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rfzs\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-kube-api-access-5rfzs\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.284485 master-0 kubenswrapper[18707]: I0320 09:23:06.284453 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-ssh-key-edpm-b\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.284485 master-0 kubenswrapper[18707]: I0320 09:23:06.284474 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-libvirt-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.284544 master-0 kubenswrapper[18707]: I0320 09:23:06.284493 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-nova-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.284544 master-0 kubenswrapper[18707]: I0320 09:23:06.284514 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxxnk\" 
(UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-kube-api-access-cxxnk\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.284810 master-0 kubenswrapper[18707]: I0320 09:23:06.284778 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-neutron-metadata-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.386517 master-0 kubenswrapper[18707]: I0320 09:23:06.386446 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-nova-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.386517 master-0 kubenswrapper[18707]: I0320 09:23:06.386509 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-inventory\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.386785 master-0 kubenswrapper[18707]: I0320 09:23:06.386538 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-neutron-metadata-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: 
\"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.386785 master-0 kubenswrapper[18707]: I0320 09:23:06.386569 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-ovn-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.386785 master-0 kubenswrapper[18707]: I0320 09:23:06.386588 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rfzs\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-kube-api-access-5rfzs\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.386785 master-0 kubenswrapper[18707]: I0320 09:23:06.386612 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-ssh-key-edpm-b\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.386785 master-0 kubenswrapper[18707]: I0320 09:23:06.386630 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-libvirt-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.386785 master-0 kubenswrapper[18707]: I0320 09:23:06.386648 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-nova-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.386785 master-0 kubenswrapper[18707]: I0320 09:23:06.386666 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxxnk\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-kube-api-access-cxxnk\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.386785 master-0 kubenswrapper[18707]: I0320 09:23:06.386690 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-neutron-metadata-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.386785 master-0 kubenswrapper[18707]: I0320 09:23:06.386747 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-neutron-metadata-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.386785 master-0 kubenswrapper[18707]: I0320 09:23:06.386778 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-ovn-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.387122 master-0 kubenswrapper[18707]: I0320 09:23:06.386795 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-ssh-key-edpm-a\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.387122 master-0 kubenswrapper[18707]: I0320 09:23:06.386815 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-ovn-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.387122 master-0 kubenswrapper[18707]: I0320 09:23:06.386835 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-neutron-metadata-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.387122 master-0 kubenswrapper[18707]: I0320 09:23:06.386853 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-libvirt-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: 
\"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.387122 master-0 kubenswrapper[18707]: I0320 09:23:06.386882 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-inventory\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.387122 master-0 kubenswrapper[18707]: I0320 09:23:06.386919 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-b-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-libvirt-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.387122 master-0 kubenswrapper[18707]: I0320 09:23:06.386954 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-ovn-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.387122 master-0 kubenswrapper[18707]: I0320 09:23:06.386980 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-a-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-libvirt-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.390950 master-0 kubenswrapper[18707]: I0320 09:23:06.390909 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-libvirt-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.392506 master-0 kubenswrapper[18707]: I0320 09:23:06.391896 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-neutron-metadata-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.392506 master-0 kubenswrapper[18707]: I0320 09:23:06.392427 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-neutron-metadata-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.393421 master-0 kubenswrapper[18707]: I0320 09:23:06.393025 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-libvirt-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.393421 master-0 kubenswrapper[18707]: I0320 09:23:06.393151 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-libvirt-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.393744 master-0 kubenswrapper[18707]: I0320 09:23:06.393552 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-inventory\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.394321 master-0 kubenswrapper[18707]: I0320 09:23:06.394275 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-ssh-key-edpm-a\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.395815 master-0 kubenswrapper[18707]: I0320 09:23:06.395594 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-ssh-key-edpm-b\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.395815 master-0 kubenswrapper[18707]: I0320 09:23:06.395738 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-inventory\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.395815 master-0 kubenswrapper[18707]: I0320 09:23:06.395772 
18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-neutron-metadata-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.396490 master-0 kubenswrapper[18707]: I0320 09:23:06.396451 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-neutron-metadata-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.397460 master-0 kubenswrapper[18707]: I0320 09:23:06.397409 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-nova-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.397957 master-0 kubenswrapper[18707]: I0320 09:23:06.397920 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-nova-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.398020 master-0 kubenswrapper[18707]: I0320 09:23:06.397965 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-ovn-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.398604 master-0 kubenswrapper[18707]: I0320 09:23:06.398580 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-a-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-ovn-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.398676 master-0 kubenswrapper[18707]: I0320 09:23:06.398599 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-libvirt-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.400326 master-0 kubenswrapper[18707]: I0320 09:23:06.400291 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-ovn-combined-ca-bundle\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.403302 master-0 kubenswrapper[18707]: I0320 09:23:06.401997 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-b-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-ovn-default-certs-0\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " 
pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.406731 master-0 kubenswrapper[18707]: I0320 09:23:06.406697 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rfzs\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-kube-api-access-5rfzs\") pod \"install-certs-dataplane-step-2-edpm-a-5xzqr\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.416613 master-0 kubenswrapper[18707]: I0320 09:23:06.416566 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxxnk\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-kube-api-access-cxxnk\") pod \"install-certs-dataplane-step-2-edpm-b-w5v2r\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") " pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:06.467454 master-0 kubenswrapper[18707]: I0320 09:23:06.467396 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:06.532630 master-0 kubenswrapper[18707]: I0320 09:23:06.531735 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" Mar 20 09:23:07.052317 master-0 kubenswrapper[18707]: I0320 09:23:07.051914 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-dataplane-step-2-edpm-a-5xzqr"] Mar 20 09:23:07.052758 master-0 kubenswrapper[18707]: W0320 09:23:07.052711 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfac34132_97c0_4898_81d0_388cb1db4fad.slice/crio-b5b6448e35d616ae3793be8685828b93902b2a4cb82bdf7a695f85b344fb21a6 WatchSource:0}: Error finding container b5b6448e35d616ae3793be8685828b93902b2a4cb82bdf7a695f85b344fb21a6: Status 404 returned error can't find the container with id b5b6448e35d616ae3793be8685828b93902b2a4cb82bdf7a695f85b344fb21a6 Mar 20 09:23:07.060683 master-0 kubenswrapper[18707]: I0320 09:23:07.060501 18707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:23:07.196933 master-0 kubenswrapper[18707]: W0320 09:23:07.196852 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36ef6f41_ad37_4778_bde5_e7c688c27878.slice/crio-b2c0dc6f04aa592c43a6ba5e2e0a4442f548931fe708176e19bb8981f40f5d97 WatchSource:0}: Error finding container b2c0dc6f04aa592c43a6ba5e2e0a4442f548931fe708176e19bb8981f40f5d97: Status 404 returned error can't find the container with id b2c0dc6f04aa592c43a6ba5e2e0a4442f548931fe708176e19bb8981f40f5d97 Mar 20 09:23:07.202766 master-0 kubenswrapper[18707]: I0320 09:23:07.202692 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-dataplane-step-2-edpm-b-w5v2r"] Mar 20 09:23:07.865830 master-0 kubenswrapper[18707]: I0320 09:23:07.865674 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" 
event={"ID":"36ef6f41-ad37-4778-bde5-e7c688c27878","Type":"ContainerStarted","Data":"b2c0dc6f04aa592c43a6ba5e2e0a4442f548931fe708176e19bb8981f40f5d97"} Mar 20 09:23:07.868878 master-0 kubenswrapper[18707]: I0320 09:23:07.868801 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" event={"ID":"fac34132-97c0-4898-81d0-388cb1db4fad","Type":"ContainerStarted","Data":"b5b6448e35d616ae3793be8685828b93902b2a4cb82bdf7a695f85b344fb21a6"} Mar 20 09:23:08.886958 master-0 kubenswrapper[18707]: I0320 09:23:08.886895 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" event={"ID":"36ef6f41-ad37-4778-bde5-e7c688c27878","Type":"ContainerStarted","Data":"27ed110abf0529834ca6ab54e3a1e5ca0c23d8f4220221dede79c3c35b8bf2ed"} Mar 20 09:23:08.891012 master-0 kubenswrapper[18707]: I0320 09:23:08.890946 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" event={"ID":"fac34132-97c0-4898-81d0-388cb1db4fad","Type":"ContainerStarted","Data":"ed3379d261c36b541f2c126188bb060d2a2d716004f0406ee1e95370eb7e20e5"} Mar 20 09:23:08.916400 master-0 kubenswrapper[18707]: I0320 09:23:08.916319 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" podStartSLOduration=2.464370998 podStartE2EDuration="2.916304521s" podCreationTimestamp="2026-03-20 09:23:06 +0000 UTC" firstStartedPulling="2026-03-20 09:23:07.201797015 +0000 UTC m=+2532.357977371" lastFinishedPulling="2026-03-20 09:23:07.653730528 +0000 UTC m=+2532.809910894" observedRunningTime="2026-03-20 09:23:08.913969085 +0000 UTC m=+2534.070149431" watchObservedRunningTime="2026-03-20 09:23:08.916304521 +0000 UTC m=+2534.072484877" Mar 20 09:23:31.191996 master-0 kubenswrapper[18707]: I0320 09:23:31.191933 18707 generic.go:334] "Generic (PLEG): container finished" 
podID="fac34132-97c0-4898-81d0-388cb1db4fad" containerID="ed3379d261c36b541f2c126188bb060d2a2d716004f0406ee1e95370eb7e20e5" exitCode=0 Mar 20 09:23:31.192800 master-0 kubenswrapper[18707]: I0320 09:23:31.192012 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" event={"ID":"fac34132-97c0-4898-81d0-388cb1db4fad","Type":"ContainerDied","Data":"ed3379d261c36b541f2c126188bb060d2a2d716004f0406ee1e95370eb7e20e5"} Mar 20 09:23:32.242953 master-0 kubenswrapper[18707]: I0320 09:23:32.242882 18707 generic.go:334] "Generic (PLEG): container finished" podID="36ef6f41-ad37-4778-bde5-e7c688c27878" containerID="27ed110abf0529834ca6ab54e3a1e5ca0c23d8f4220221dede79c3c35b8bf2ed" exitCode=0 Mar 20 09:23:32.243467 master-0 kubenswrapper[18707]: I0320 09:23:32.242956 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" event={"ID":"36ef6f41-ad37-4778-bde5-e7c688c27878","Type":"ContainerDied","Data":"27ed110abf0529834ca6ab54e3a1e5ca0c23d8f4220221dede79c3c35b8bf2ed"} Mar 20 09:23:32.739142 master-0 kubenswrapper[18707]: I0320 09:23:32.739054 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" Mar 20 09:23:32.845640 master-0 kubenswrapper[18707]: I0320 09:23:32.845586 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-nova-combined-ca-bundle\") pod \"fac34132-97c0-4898-81d0-388cb1db4fad\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " Mar 20 09:23:32.845640 master-0 kubenswrapper[18707]: I0320 09:23:32.845641 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rfzs\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-kube-api-access-5rfzs\") pod \"fac34132-97c0-4898-81d0-388cb1db4fad\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " Mar 20 09:23:32.845902 master-0 kubenswrapper[18707]: I0320 09:23:32.845730 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-libvirt-default-certs-0\") pod \"fac34132-97c0-4898-81d0-388cb1db4fad\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " Mar 20 09:23:32.845994 master-0 kubenswrapper[18707]: I0320 09:23:32.845975 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-libvirt-combined-ca-bundle\") pod \"fac34132-97c0-4898-81d0-388cb1db4fad\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") " Mar 20 09:23:32.846057 master-0 kubenswrapper[18707]: I0320 09:23:32.846003 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-ovn-combined-ca-bundle\") pod \"fac34132-97c0-4898-81d0-388cb1db4fad\" (UID: 
\"fac34132-97c0-4898-81d0-388cb1db4fad\") "
Mar 20 09:23:32.846057 master-0 kubenswrapper[18707]: I0320 09:23:32.846026 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-inventory\") pod \"fac34132-97c0-4898-81d0-388cb1db4fad\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") "
Mar 20 09:23:32.846133 master-0 kubenswrapper[18707]: I0320 09:23:32.846086 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-ovn-default-certs-0\") pod \"fac34132-97c0-4898-81d0-388cb1db4fad\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") "
Mar 20 09:23:32.846133 master-0 kubenswrapper[18707]: I0320 09:23:32.846110 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-a-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-neutron-metadata-default-certs-0\") pod \"fac34132-97c0-4898-81d0-388cb1db4fad\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") "
Mar 20 09:23:32.846227 master-0 kubenswrapper[18707]: I0320 09:23:32.846136 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-neutron-metadata-combined-ca-bundle\") pod \"fac34132-97c0-4898-81d0-388cb1db4fad\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") "
Mar 20 09:23:32.846227 master-0 kubenswrapper[18707]: I0320 09:23:32.846220 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-ssh-key-edpm-a\") pod \"fac34132-97c0-4898-81d0-388cb1db4fad\" (UID: \"fac34132-97c0-4898-81d0-388cb1db4fad\") "
Mar 20 09:23:32.850805 master-0 kubenswrapper[18707]: I0320 09:23:32.850733 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "fac34132-97c0-4898-81d0-388cb1db4fad" (UID: "fac34132-97c0-4898-81d0-388cb1db4fad"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:23:32.850954 master-0 kubenswrapper[18707]: I0320 09:23:32.850853 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "fac34132-97c0-4898-81d0-388cb1db4fad" (UID: "fac34132-97c0-4898-81d0-388cb1db4fad"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:23:32.850954 master-0 kubenswrapper[18707]: I0320 09:23:32.850856 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "edpm-a-neutron-metadata-default-certs-0") pod "fac34132-97c0-4898-81d0-388cb1db4fad" (UID: "fac34132-97c0-4898-81d0-388cb1db4fad"). InnerVolumeSpecName "edpm-a-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:23:32.851036 master-0 kubenswrapper[18707]: I0320 09:23:32.850984 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-kube-api-access-5rfzs" (OuterVolumeSpecName: "kube-api-access-5rfzs") pod "fac34132-97c0-4898-81d0-388cb1db4fad" (UID: "fac34132-97c0-4898-81d0-388cb1db4fad"). InnerVolumeSpecName "kube-api-access-5rfzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:23:32.852704 master-0 kubenswrapper[18707]: I0320 09:23:32.852606 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-ovn-default-certs-0" (OuterVolumeSpecName: "edpm-a-ovn-default-certs-0") pod "fac34132-97c0-4898-81d0-388cb1db4fad" (UID: "fac34132-97c0-4898-81d0-388cb1db4fad"). InnerVolumeSpecName "edpm-a-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:23:32.853177 master-0 kubenswrapper[18707]: I0320 09:23:32.853067 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "fac34132-97c0-4898-81d0-388cb1db4fad" (UID: "fac34132-97c0-4898-81d0-388cb1db4fad"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:23:32.862869 master-0 kubenswrapper[18707]: I0320 09:23:32.862689 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-libvirt-default-certs-0" (OuterVolumeSpecName: "edpm-a-libvirt-default-certs-0") pod "fac34132-97c0-4898-81d0-388cb1db4fad" (UID: "fac34132-97c0-4898-81d0-388cb1db4fad"). InnerVolumeSpecName "edpm-a-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:23:32.862869 master-0 kubenswrapper[18707]: I0320 09:23:32.862798 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "fac34132-97c0-4898-81d0-388cb1db4fad" (UID: "fac34132-97c0-4898-81d0-388cb1db4fad"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:23:32.878218 master-0 kubenswrapper[18707]: I0320 09:23:32.878109 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-inventory" (OuterVolumeSpecName: "inventory") pod "fac34132-97c0-4898-81d0-388cb1db4fad" (UID: "fac34132-97c0-4898-81d0-388cb1db4fad"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:23:32.879512 master-0 kubenswrapper[18707]: I0320 09:23:32.879476 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "fac34132-97c0-4898-81d0-388cb1db4fad" (UID: "fac34132-97c0-4898-81d0-388cb1db4fad"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:23:32.950476 master-0 kubenswrapper[18707]: I0320 09:23:32.950035 18707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-libvirt-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:32.950476 master-0 kubenswrapper[18707]: I0320 09:23:32.950102 18707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-ovn-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:32.950476 master-0 kubenswrapper[18707]: I0320 09:23:32.950126 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-inventory\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:32.950476 master-0 kubenswrapper[18707]: I0320 09:23:32.950150 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-a-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-ovn-default-certs-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:32.950476 master-0 kubenswrapper[18707]: I0320 09:23:32.950174 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-a-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-neutron-metadata-default-certs-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:32.950476 master-0 kubenswrapper[18707]: I0320 09:23:32.950223 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:32.950476 master-0 kubenswrapper[18707]: I0320 09:23:32.950244 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:32.950476 master-0 kubenswrapper[18707]: I0320 09:23:32.950263 18707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fac34132-97c0-4898-81d0-388cb1db4fad-nova-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:32.950476 master-0 kubenswrapper[18707]: I0320 09:23:32.950290 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rfzs\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-kube-api-access-5rfzs\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:32.950476 master-0 kubenswrapper[18707]: I0320 09:23:32.950311 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-a-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fac34132-97c0-4898-81d0-388cb1db4fad-edpm-a-libvirt-default-certs-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:33.258515 master-0 kubenswrapper[18707]: I0320 09:23:33.258364 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr" event={"ID":"fac34132-97c0-4898-81d0-388cb1db4fad","Type":"ContainerDied","Data":"b5b6448e35d616ae3793be8685828b93902b2a4cb82bdf7a695f85b344fb21a6"}
Mar 20 09:23:33.258515 master-0 kubenswrapper[18707]: I0320 09:23:33.258437 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5b6448e35d616ae3793be8685828b93902b2a4cb82bdf7a695f85b344fb21a6"
Mar 20 09:23:33.258515 master-0 kubenswrapper[18707]: I0320 09:23:33.258383 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-dataplane-step-2-edpm-a-5xzqr"
Mar 20 09:23:33.366091 master-0 kubenswrapper[18707]: I0320 09:23:33.366035 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-dataplane-step-2-edpm-a-r2hdg"]
Mar 20 09:23:33.366909 master-0 kubenswrapper[18707]: E0320 09:23:33.366885 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fac34132-97c0-4898-81d0-388cb1db4fad" containerName="install-certs-dataplane-step-2-edpm-a"
Mar 20 09:23:33.366909 master-0 kubenswrapper[18707]: I0320 09:23:33.366907 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac34132-97c0-4898-81d0-388cb1db4fad" containerName="install-certs-dataplane-step-2-edpm-a"
Mar 20 09:23:33.367987 master-0 kubenswrapper[18707]: I0320 09:23:33.367732 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fac34132-97c0-4898-81d0-388cb1db4fad" containerName="install-certs-dataplane-step-2-edpm-a"
Mar 20 09:23:33.369220 master-0 kubenswrapper[18707]: I0320 09:23:33.369147 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.380763 master-0 kubenswrapper[18707]: I0320 09:23:33.380619 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Mar 20 09:23:33.381515 master-0 kubenswrapper[18707]: I0320 09:23:33.381427 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a"
Mar 20 09:23:33.388641 master-0 kubenswrapper[18707]: I0320 09:23:33.388540 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-dataplane-step-2-edpm-a-r2hdg"]
Mar 20 09:23:33.467395 master-0 kubenswrapper[18707]: I0320 09:23:33.467339 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctgwb\" (UniqueName: \"kubernetes.io/projected/102bd436-4002-4ed5-b6e5-abe99de820b5-kube-api-access-ctgwb\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.467395 master-0 kubenswrapper[18707]: I0320 09:23:33.467397 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-inventory\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.467696 master-0 kubenswrapper[18707]: I0320 09:23:33.467527 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/102bd436-4002-4ed5-b6e5-abe99de820b5-ovncontroller-config-0\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.467696 master-0 kubenswrapper[18707]: I0320 09:23:33.467572 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-ovn-combined-ca-bundle\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.467696 master-0 kubenswrapper[18707]: I0320 09:23:33.467595 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-ssh-key-edpm-a\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.569663 master-0 kubenswrapper[18707]: I0320 09:23:33.569601 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctgwb\" (UniqueName: \"kubernetes.io/projected/102bd436-4002-4ed5-b6e5-abe99de820b5-kube-api-access-ctgwb\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.569663 master-0 kubenswrapper[18707]: I0320 09:23:33.569656 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-inventory\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.570124 master-0 kubenswrapper[18707]: I0320 09:23:33.570085 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/102bd436-4002-4ed5-b6e5-abe99de820b5-ovncontroller-config-0\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.570450 master-0 kubenswrapper[18707]: I0320 09:23:33.570407 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-ovn-combined-ca-bundle\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.570602 master-0 kubenswrapper[18707]: I0320 09:23:33.570582 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-ssh-key-edpm-a\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.571373 master-0 kubenswrapper[18707]: I0320 09:23:33.571342 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/102bd436-4002-4ed5-b6e5-abe99de820b5-ovncontroller-config-0\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.577426 master-0 kubenswrapper[18707]: I0320 09:23:33.573864 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-inventory\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.578690 master-0 kubenswrapper[18707]: I0320 09:23:33.578656 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-ssh-key-edpm-a\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.579666 master-0 kubenswrapper[18707]: I0320 09:23:33.579642 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-ovn-combined-ca-bundle\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.586107 master-0 kubenswrapper[18707]: I0320 09:23:33.586075 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctgwb\" (UniqueName: \"kubernetes.io/projected/102bd436-4002-4ed5-b6e5-abe99de820b5-kube-api-access-ctgwb\") pod \"ovn-dataplane-step-2-edpm-a-r2hdg\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.709142 master-0 kubenswrapper[18707]: I0320 09:23:33.709068 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg"
Mar 20 09:23:33.847280 master-0 kubenswrapper[18707]: I0320 09:23:33.847152 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r"
Mar 20 09:23:34.009845 master-0 kubenswrapper[18707]: I0320 09:23:34.009800 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-nova-combined-ca-bundle\") pod \"36ef6f41-ad37-4778-bde5-e7c688c27878\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") "
Mar 20 09:23:34.010095 master-0 kubenswrapper[18707]: I0320 09:23:34.010080 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-inventory\") pod \"36ef6f41-ad37-4778-bde5-e7c688c27878\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") "
Mar 20 09:23:34.010269 master-0 kubenswrapper[18707]: I0320 09:23:34.010248 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-ovn-combined-ca-bundle\") pod \"36ef6f41-ad37-4778-bde5-e7c688c27878\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") "
Mar 20 09:23:34.010379 master-0 kubenswrapper[18707]: I0320 09:23:34.010365 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-ssh-key-edpm-b\") pod \"36ef6f41-ad37-4778-bde5-e7c688c27878\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") "
Mar 20 09:23:34.010561 master-0 kubenswrapper[18707]: I0320 09:23:34.010547 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxxnk\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-kube-api-access-cxxnk\") pod \"36ef6f41-ad37-4778-bde5-e7c688c27878\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") "
Mar 20 09:23:34.010723 master-0 kubenswrapper[18707]: I0320 09:23:34.010708 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-neutron-metadata-combined-ca-bundle\") pod \"36ef6f41-ad37-4778-bde5-e7c688c27878\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") "
Mar 20 09:23:34.010808 master-0 kubenswrapper[18707]: I0320 09:23:34.010796 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-libvirt-default-certs-0\") pod \"36ef6f41-ad37-4778-bde5-e7c688c27878\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") "
Mar 20 09:23:34.011989 master-0 kubenswrapper[18707]: I0320 09:23:34.011413 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-neutron-metadata-default-certs-0\") pod \"36ef6f41-ad37-4778-bde5-e7c688c27878\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") "
Mar 20 09:23:34.011989 master-0 kubenswrapper[18707]: I0320 09:23:34.011501 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-b-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-ovn-default-certs-0\") pod \"36ef6f41-ad37-4778-bde5-e7c688c27878\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") "
Mar 20 09:23:34.011989 master-0 kubenswrapper[18707]: I0320 09:23:34.011659 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-libvirt-combined-ca-bundle\") pod \"36ef6f41-ad37-4778-bde5-e7c688c27878\" (UID: \"36ef6f41-ad37-4778-bde5-e7c688c27878\") "
Mar 20 09:23:34.015329 master-0 kubenswrapper[18707]: I0320 09:23:34.015238 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "36ef6f41-ad37-4778-bde5-e7c688c27878" (UID: "36ef6f41-ad37-4778-bde5-e7c688c27878"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:23:34.015502 master-0 kubenswrapper[18707]: I0320 09:23:34.015459 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-kube-api-access-cxxnk" (OuterVolumeSpecName: "kube-api-access-cxxnk") pod "36ef6f41-ad37-4778-bde5-e7c688c27878" (UID: "36ef6f41-ad37-4778-bde5-e7c688c27878"). InnerVolumeSpecName "kube-api-access-cxxnk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:23:34.015563 master-0 kubenswrapper[18707]: I0320 09:23:34.015543 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "36ef6f41-ad37-4778-bde5-e7c688c27878" (UID: "36ef6f41-ad37-4778-bde5-e7c688c27878"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:23:34.017212 master-0 kubenswrapper[18707]: I0320 09:23:34.017161 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-libvirt-default-certs-0" (OuterVolumeSpecName: "edpm-b-libvirt-default-certs-0") pod "36ef6f41-ad37-4778-bde5-e7c688c27878" (UID: "36ef6f41-ad37-4778-bde5-e7c688c27878"). InnerVolumeSpecName "edpm-b-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:23:34.017991 master-0 kubenswrapper[18707]: I0320 09:23:34.017933 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-ovn-default-certs-0" (OuterVolumeSpecName: "edpm-b-ovn-default-certs-0") pod "36ef6f41-ad37-4778-bde5-e7c688c27878" (UID: "36ef6f41-ad37-4778-bde5-e7c688c27878"). InnerVolumeSpecName "edpm-b-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:23:34.019042 master-0 kubenswrapper[18707]: I0320 09:23:34.019000 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "36ef6f41-ad37-4778-bde5-e7c688c27878" (UID: "36ef6f41-ad37-4778-bde5-e7c688c27878"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:23:34.019363 master-0 kubenswrapper[18707]: I0320 09:23:34.019333 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "36ef6f41-ad37-4778-bde5-e7c688c27878" (UID: "36ef6f41-ad37-4778-bde5-e7c688c27878"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:23:34.019423 master-0 kubenswrapper[18707]: I0320 09:23:34.019362 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "edpm-b-neutron-metadata-default-certs-0") pod "36ef6f41-ad37-4778-bde5-e7c688c27878" (UID: "36ef6f41-ad37-4778-bde5-e7c688c27878"). InnerVolumeSpecName "edpm-b-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:23:34.047643 master-0 kubenswrapper[18707]: I0320 09:23:34.047552 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-inventory" (OuterVolumeSpecName: "inventory") pod "36ef6f41-ad37-4778-bde5-e7c688c27878" (UID: "36ef6f41-ad37-4778-bde5-e7c688c27878"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:23:34.061052 master-0 kubenswrapper[18707]: I0320 09:23:34.060972 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "36ef6f41-ad37-4778-bde5-e7c688c27878" (UID: "36ef6f41-ad37-4778-bde5-e7c688c27878"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:23:34.115243 master-0 kubenswrapper[18707]: I0320 09:23:34.115085 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-b-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-neutron-metadata-default-certs-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:34.115243 master-0 kubenswrapper[18707]: I0320 09:23:34.115147 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-b-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-ovn-default-certs-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:34.115243 master-0 kubenswrapper[18707]: I0320 09:23:34.115158 18707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-libvirt-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:34.115243 master-0 kubenswrapper[18707]: I0320 09:23:34.115168 18707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-nova-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:34.115243 master-0 kubenswrapper[18707]: I0320 09:23:34.115215 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-inventory\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:34.115243 master-0 kubenswrapper[18707]: I0320 09:23:34.115225 18707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-ovn-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:34.115243 master-0 kubenswrapper[18707]: I0320 09:23:34.115234 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:34.115243 master-0 kubenswrapper[18707]: I0320 09:23:34.115245 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxxnk\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-kube-api-access-cxxnk\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:34.115649 master-0 kubenswrapper[18707]: I0320 09:23:34.115254 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef6f41-ad37-4778-bde5-e7c688c27878-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:34.115649 master-0 kubenswrapper[18707]: I0320 09:23:34.115284 18707 reconciler_common.go:293] "Volume detached for volume \"edpm-b-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/36ef6f41-ad37-4778-bde5-e7c688c27878-edpm-b-libvirt-default-certs-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:23:34.285064 master-0 kubenswrapper[18707]: I0320 09:23:34.284986 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r" event={"ID":"36ef6f41-ad37-4778-bde5-e7c688c27878","Type":"ContainerDied","Data":"b2c0dc6f04aa592c43a6ba5e2e0a4442f548931fe708176e19bb8981f40f5d97"}
Mar 20 09:23:34.285064 master-0 kubenswrapper[18707]: I0320 09:23:34.285054 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2c0dc6f04aa592c43a6ba5e2e0a4442f548931fe708176e19bb8981f40f5d97"
Mar 20 09:23:34.285776 master-0 kubenswrapper[18707]: I0320 09:23:34.285123 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-dataplane-step-2-edpm-b-w5v2r"
Mar 20 09:23:34.289785 master-0 kubenswrapper[18707]: I0320 09:23:34.288778 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-dataplane-step-2-edpm-a-r2hdg"]
Mar 20 09:23:34.393280 master-0 kubenswrapper[18707]: I0320 09:23:34.388912 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-dataplane-step-2-edpm-b-tm4wb"]
Mar 20 09:23:34.393280 master-0 kubenswrapper[18707]: E0320 09:23:34.389691 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ef6f41-ad37-4778-bde5-e7c688c27878" containerName="install-certs-dataplane-step-2-edpm-b"
Mar 20 09:23:34.393280 master-0 kubenswrapper[18707]: I0320 09:23:34.389715 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ef6f41-ad37-4778-bde5-e7c688c27878" containerName="install-certs-dataplane-step-2-edpm-b"
Mar 20 09:23:34.393280 master-0 kubenswrapper[18707]: I0320 09:23:34.389983 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="36ef6f41-ad37-4778-bde5-e7c688c27878" containerName="install-certs-dataplane-step-2-edpm-b"
Mar 20 09:23:34.393280 master-0 kubenswrapper[18707]: I0320 09:23:34.390970 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.396283 master-0 kubenswrapper[18707]: I0320 09:23:34.394626 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b"
Mar 20 09:23:34.403403 master-0 kubenswrapper[18707]: I0320 09:23:34.403324 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-dataplane-step-2-edpm-b-tm4wb"]
Mar 20 09:23:34.452809 master-0 kubenswrapper[18707]: E0320 09:23:34.452755 18707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36ef6f41_ad37_4778_bde5_e7c688c27878.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 09:23:34.525487 master-0 kubenswrapper[18707]: I0320 09:23:34.525408 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-inventory\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.525487 master-0 kubenswrapper[18707]: I0320 09:23:34.525494 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-ssh-key-edpm-b\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.526075 master-0 kubenswrapper[18707]: I0320 09:23:34.526011 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbwpm\" (UniqueName: \"kubernetes.io/projected/15fc9432-99bc-41a1-a87b-0dc3c351182e-kube-api-access-gbwpm\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.526075 master-0 kubenswrapper[18707]: I0320 09:23:34.526072 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-ovn-combined-ca-bundle\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.526577 master-0 kubenswrapper[18707]: I0320 09:23:34.526403 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/15fc9432-99bc-41a1-a87b-0dc3c351182e-ovncontroller-config-0\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.628683 master-0 kubenswrapper[18707]: I0320 09:23:34.628616 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-ovn-combined-ca-bundle\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.628683 master-0 kubenswrapper[18707]: I0320 09:23:34.628675 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbwpm\" (UniqueName: \"kubernetes.io/projected/15fc9432-99bc-41a1-a87b-0dc3c351182e-kube-api-access-gbwpm\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.628957 master-0 kubenswrapper[18707]: I0320 09:23:34.628754 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/15fc9432-99bc-41a1-a87b-0dc3c351182e-ovncontroller-config-0\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.628957 master-0 kubenswrapper[18707]: I0320 09:23:34.628795 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-inventory\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.629076 master-0 kubenswrapper[18707]: I0320 09:23:34.629017 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-ssh-key-edpm-b\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.630095 master-0 kubenswrapper[18707]: I0320 09:23:34.630039 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/15fc9432-99bc-41a1-a87b-0dc3c351182e-ovncontroller-config-0\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.632826 master-0 kubenswrapper[18707]: I0320 09:23:34.632789 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-ssh-key-edpm-b\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.634510 master-0 kubenswrapper[18707]: I0320 09:23:34.633922 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-ovn-combined-ca-bundle\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.634577 master-0 kubenswrapper[18707]: I0320 09:23:34.634519 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-inventory\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.650913 master-0 kubenswrapper[18707]: I0320 09:23:34.650783 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbwpm\" (UniqueName: \"kubernetes.io/projected/15fc9432-99bc-41a1-a87b-0dc3c351182e-kube-api-access-gbwpm\") pod \"ovn-dataplane-step-2-edpm-b-tm4wb\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb"
Mar 20 09:23:34.755740 master-0 kubenswrapper[18707]: I0320 09:23:34.755617 18707 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb" Mar 20 09:23:35.321310 master-0 kubenswrapper[18707]: I0320 09:23:35.320424 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg" event={"ID":"102bd436-4002-4ed5-b6e5-abe99de820b5","Type":"ContainerStarted","Data":"8f03893e41a599e441d57e4342a71bc15586daf1e13489ca215ee7ede8cf5ac9"} Mar 20 09:23:35.321310 master-0 kubenswrapper[18707]: I0320 09:23:35.320477 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg" event={"ID":"102bd436-4002-4ed5-b6e5-abe99de820b5","Type":"ContainerStarted","Data":"7f2ee76e5f1fa7f034d3237c58df56e97025ce97b51a8f9c6e57202f028d4555"} Mar 20 09:23:35.362707 master-0 kubenswrapper[18707]: I0320 09:23:35.362633 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-dataplane-step-2-edpm-b-tm4wb"] Mar 20 09:23:35.363163 master-0 kubenswrapper[18707]: I0320 09:23:35.363109 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg" podStartSLOduration=1.914607417 podStartE2EDuration="2.363095782s" podCreationTimestamp="2026-03-20 09:23:33 +0000 UTC" firstStartedPulling="2026-03-20 09:23:34.29339021 +0000 UTC m=+2559.449570566" lastFinishedPulling="2026-03-20 09:23:34.741878575 +0000 UTC m=+2559.898058931" observedRunningTime="2026-03-20 09:23:35.345050382 +0000 UTC m=+2560.501230758" watchObservedRunningTime="2026-03-20 09:23:35.363095782 +0000 UTC m=+2560.519276138" Mar 20 09:23:36.334897 master-0 kubenswrapper[18707]: I0320 09:23:36.334786 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb" event={"ID":"15fc9432-99bc-41a1-a87b-0dc3c351182e","Type":"ContainerStarted","Data":"44d84e74f41c066e3152cd9362ced1e2781d8b45148bac33f9fa16761ac2171a"} Mar 20 09:23:36.334897 master-0 kubenswrapper[18707]: I0320 09:23:36.334883 18707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb" event={"ID":"15fc9432-99bc-41a1-a87b-0dc3c351182e","Type":"ContainerStarted","Data":"2d29561cc275377979f672a192fe9a093b0c53e01eac8c22301c937cd3d3d1c4"} Mar 20 09:23:36.391551 master-0 kubenswrapper[18707]: I0320 09:23:36.391420 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb" podStartSLOduration=1.949161836 podStartE2EDuration="2.391395624s" podCreationTimestamp="2026-03-20 09:23:34 +0000 UTC" firstStartedPulling="2026-03-20 09:23:35.341497872 +0000 UTC m=+2560.497678228" lastFinishedPulling="2026-03-20 09:23:35.78373167 +0000 UTC m=+2560.939912016" observedRunningTime="2026-03-20 09:23:36.386808605 +0000 UTC m=+2561.542988961" watchObservedRunningTime="2026-03-20 09:23:36.391395624 +0000 UTC m=+2561.547575980" Mar 20 09:24:37.139483 master-0 kubenswrapper[18707]: I0320 09:24:37.138969 18707 generic.go:334] "Generic (PLEG): container finished" podID="102bd436-4002-4ed5-b6e5-abe99de820b5" containerID="8f03893e41a599e441d57e4342a71bc15586daf1e13489ca215ee7ede8cf5ac9" exitCode=0 Mar 20 09:24:37.139483 master-0 kubenswrapper[18707]: I0320 09:24:37.139041 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg" event={"ID":"102bd436-4002-4ed5-b6e5-abe99de820b5","Type":"ContainerDied","Data":"8f03893e41a599e441d57e4342a71bc15586daf1e13489ca215ee7ede8cf5ac9"} Mar 20 09:24:38.153532 master-0 kubenswrapper[18707]: I0320 09:24:38.153374 18707 generic.go:334] "Generic (PLEG): container finished" podID="15fc9432-99bc-41a1-a87b-0dc3c351182e" containerID="44d84e74f41c066e3152cd9362ced1e2781d8b45148bac33f9fa16761ac2171a" exitCode=0 Mar 20 09:24:38.153532 master-0 kubenswrapper[18707]: I0320 09:24:38.153484 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb" 
event={"ID":"15fc9432-99bc-41a1-a87b-0dc3c351182e","Type":"ContainerDied","Data":"44d84e74f41c066e3152cd9362ced1e2781d8b45148bac33f9fa16761ac2171a"} Mar 20 09:24:38.953381 master-0 kubenswrapper[18707]: I0320 09:24:38.953326 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg" Mar 20 09:24:39.024785 master-0 kubenswrapper[18707]: I0320 09:24:39.024702 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctgwb\" (UniqueName: \"kubernetes.io/projected/102bd436-4002-4ed5-b6e5-abe99de820b5-kube-api-access-ctgwb\") pod \"102bd436-4002-4ed5-b6e5-abe99de820b5\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " Mar 20 09:24:39.025045 master-0 kubenswrapper[18707]: I0320 09:24:39.024812 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-ssh-key-edpm-a\") pod \"102bd436-4002-4ed5-b6e5-abe99de820b5\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " Mar 20 09:24:39.025045 master-0 kubenswrapper[18707]: I0320 09:24:39.025005 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-ovn-combined-ca-bundle\") pod \"102bd436-4002-4ed5-b6e5-abe99de820b5\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " Mar 20 09:24:39.025207 master-0 kubenswrapper[18707]: I0320 09:24:39.025169 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-inventory\") pod \"102bd436-4002-4ed5-b6e5-abe99de820b5\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " Mar 20 09:24:39.025480 master-0 kubenswrapper[18707]: I0320 09:24:39.025434 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/102bd436-4002-4ed5-b6e5-abe99de820b5-ovncontroller-config-0\") pod \"102bd436-4002-4ed5-b6e5-abe99de820b5\" (UID: \"102bd436-4002-4ed5-b6e5-abe99de820b5\") " Mar 20 09:24:39.029579 master-0 kubenswrapper[18707]: I0320 09:24:39.029327 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "102bd436-4002-4ed5-b6e5-abe99de820b5" (UID: "102bd436-4002-4ed5-b6e5-abe99de820b5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:24:39.029579 master-0 kubenswrapper[18707]: I0320 09:24:39.029502 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/102bd436-4002-4ed5-b6e5-abe99de820b5-kube-api-access-ctgwb" (OuterVolumeSpecName: "kube-api-access-ctgwb") pod "102bd436-4002-4ed5-b6e5-abe99de820b5" (UID: "102bd436-4002-4ed5-b6e5-abe99de820b5"). InnerVolumeSpecName "kube-api-access-ctgwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:24:39.057000 master-0 kubenswrapper[18707]: I0320 09:24:39.056927 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/102bd436-4002-4ed5-b6e5-abe99de820b5-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "102bd436-4002-4ed5-b6e5-abe99de820b5" (UID: "102bd436-4002-4ed5-b6e5-abe99de820b5"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:24:39.057000 master-0 kubenswrapper[18707]: I0320 09:24:39.057005 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-inventory" (OuterVolumeSpecName: "inventory") pod "102bd436-4002-4ed5-b6e5-abe99de820b5" (UID: "102bd436-4002-4ed5-b6e5-abe99de820b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:24:39.058074 master-0 kubenswrapper[18707]: I0320 09:24:39.058033 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "102bd436-4002-4ed5-b6e5-abe99de820b5" (UID: "102bd436-4002-4ed5-b6e5-abe99de820b5"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:24:39.130448 master-0 kubenswrapper[18707]: I0320 09:24:39.130411 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctgwb\" (UniqueName: \"kubernetes.io/projected/102bd436-4002-4ed5-b6e5-abe99de820b5-kube-api-access-ctgwb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:24:39.130611 master-0 kubenswrapper[18707]: I0320 09:24:39.130599 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:24:39.130699 master-0 kubenswrapper[18707]: I0320 09:24:39.130687 18707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-ovn-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:24:39.130797 master-0 kubenswrapper[18707]: I0320 09:24:39.130783 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/102bd436-4002-4ed5-b6e5-abe99de820b5-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:24:39.130950 master-0 kubenswrapper[18707]: I0320 09:24:39.130934 18707 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/102bd436-4002-4ed5-b6e5-abe99de820b5-ovncontroller-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:24:39.167413 master-0 kubenswrapper[18707]: I0320 09:24:39.167351 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg" event={"ID":"102bd436-4002-4ed5-b6e5-abe99de820b5","Type":"ContainerDied","Data":"7f2ee76e5f1fa7f034d3237c58df56e97025ce97b51a8f9c6e57202f028d4555"} Mar 20 09:24:39.168563 master-0 kubenswrapper[18707]: I0320 09:24:39.168538 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f2ee76e5f1fa7f034d3237c58df56e97025ce97b51a8f9c6e57202f028d4555" Mar 20 09:24:39.168672 master-0 kubenswrapper[18707]: I0320 09:24:39.167455 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-dataplane-step-2-edpm-a-r2hdg" Mar 20 09:24:39.545813 master-0 kubenswrapper[18707]: I0320 09:24:39.545176 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j"] Mar 20 09:24:39.551356 master-0 kubenswrapper[18707]: E0320 09:24:39.551233 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="102bd436-4002-4ed5-b6e5-abe99de820b5" containerName="ovn-dataplane-step-2-edpm-a" Mar 20 09:24:39.551356 master-0 kubenswrapper[18707]: I0320 09:24:39.551292 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="102bd436-4002-4ed5-b6e5-abe99de820b5" containerName="ovn-dataplane-step-2-edpm-a" Mar 20 09:24:39.551698 master-0 kubenswrapper[18707]: I0320 09:24:39.551676 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="102bd436-4002-4ed5-b6e5-abe99de820b5" containerName="ovn-dataplane-step-2-edpm-a" Mar 20 09:24:39.559090 master-0 kubenswrapper[18707]: I0320 09:24:39.559030 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.563798 master-0 kubenswrapper[18707]: I0320 09:24:39.562869 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 20 09:24:39.563798 master-0 kubenswrapper[18707]: I0320 09:24:39.563091 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a" Mar 20 09:24:39.567659 master-0 kubenswrapper[18707]: I0320 09:24:39.564067 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 20 09:24:39.577999 master-0 kubenswrapper[18707]: I0320 09:24:39.577813 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j"] Mar 20 09:24:39.646849 master-0 kubenswrapper[18707]: I0320 09:24:39.646627 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.646849 master-0 kubenswrapper[18707]: I0320 09:24:39.646690 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.646849 master-0 kubenswrapper[18707]: I0320 09:24:39.646803 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ct2n\" (UniqueName: 
\"kubernetes.io/projected/e2c649f6-80b4-4425-b556-d58186eb6663-kube-api-access-6ct2n\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.647120 master-0 kubenswrapper[18707]: I0320 09:24:39.646894 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.647120 master-0 kubenswrapper[18707]: I0320 09:24:39.646921 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.647120 master-0 kubenswrapper[18707]: I0320 09:24:39.646958 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.730920 master-0 kubenswrapper[18707]: I0320 09:24:39.730857 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb" Mar 20 09:24:39.751453 master-0 kubenswrapper[18707]: I0320 09:24:39.751408 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ct2n\" (UniqueName: \"kubernetes.io/projected/e2c649f6-80b4-4425-b556-d58186eb6663-kube-api-access-6ct2n\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.751781 master-0 kubenswrapper[18707]: I0320 09:24:39.751759 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.751894 master-0 kubenswrapper[18707]: I0320 09:24:39.751876 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.752003 master-0 kubenswrapper[18707]: I0320 09:24:39.751990 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.752209 master-0 
kubenswrapper[18707]: I0320 09:24:39.752193 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.752301 master-0 kubenswrapper[18707]: I0320 09:24:39.752287 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.756595 master-0 kubenswrapper[18707]: I0320 09:24:39.756566 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.756898 master-0 kubenswrapper[18707]: I0320 09:24:39.756843 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.760003 master-0 kubenswrapper[18707]: I0320 09:24:39.759961 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.760907 master-0 kubenswrapper[18707]: I0320 09:24:39.760876 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.771385 master-0 kubenswrapper[18707]: I0320 09:24:39.771324 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.794653 master-0 kubenswrapper[18707]: I0320 09:24:39.794526 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ct2n\" (UniqueName: \"kubernetes.io/projected/e2c649f6-80b4-4425-b556-d58186eb6663-kube-api-access-6ct2n\") pod \"neutron-metadata-dataplane-step-2-edpm-a-jnw8j\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.853889 master-0 kubenswrapper[18707]: I0320 09:24:39.853675 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbwpm\" (UniqueName: \"kubernetes.io/projected/15fc9432-99bc-41a1-a87b-0dc3c351182e-kube-api-access-gbwpm\") pod \"15fc9432-99bc-41a1-a87b-0dc3c351182e\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " Mar 20 
09:24:39.853889 master-0 kubenswrapper[18707]: I0320 09:24:39.853818 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/15fc9432-99bc-41a1-a87b-0dc3c351182e-ovncontroller-config-0\") pod \"15fc9432-99bc-41a1-a87b-0dc3c351182e\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " Mar 20 09:24:39.854136 master-0 kubenswrapper[18707]: I0320 09:24:39.853929 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-ssh-key-edpm-b\") pod \"15fc9432-99bc-41a1-a87b-0dc3c351182e\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " Mar 20 09:24:39.854363 master-0 kubenswrapper[18707]: I0320 09:24:39.854293 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-inventory\") pod \"15fc9432-99bc-41a1-a87b-0dc3c351182e\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " Mar 20 09:24:39.854477 master-0 kubenswrapper[18707]: I0320 09:24:39.854453 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-ovn-combined-ca-bundle\") pod \"15fc9432-99bc-41a1-a87b-0dc3c351182e\" (UID: \"15fc9432-99bc-41a1-a87b-0dc3c351182e\") " Mar 20 09:24:39.856683 master-0 kubenswrapper[18707]: I0320 09:24:39.856634 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15fc9432-99bc-41a1-a87b-0dc3c351182e-kube-api-access-gbwpm" (OuterVolumeSpecName: "kube-api-access-gbwpm") pod "15fc9432-99bc-41a1-a87b-0dc3c351182e" (UID: "15fc9432-99bc-41a1-a87b-0dc3c351182e"). InnerVolumeSpecName "kube-api-access-gbwpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:24:39.859500 master-0 kubenswrapper[18707]: I0320 09:24:39.859271 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "15fc9432-99bc-41a1-a87b-0dc3c351182e" (UID: "15fc9432-99bc-41a1-a87b-0dc3c351182e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:24:39.890214 master-0 kubenswrapper[18707]: I0320 09:24:39.886212 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "15fc9432-99bc-41a1-a87b-0dc3c351182e" (UID: "15fc9432-99bc-41a1-a87b-0dc3c351182e"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:24:39.900023 master-0 kubenswrapper[18707]: I0320 09:24:39.899953 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15fc9432-99bc-41a1-a87b-0dc3c351182e-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "15fc9432-99bc-41a1-a87b-0dc3c351182e" (UID: "15fc9432-99bc-41a1-a87b-0dc3c351182e"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:24:39.902061 master-0 kubenswrapper[18707]: I0320 09:24:39.901987 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-inventory" (OuterVolumeSpecName: "inventory") pod "15fc9432-99bc-41a1-a87b-0dc3c351182e" (UID: "15fc9432-99bc-41a1-a87b-0dc3c351182e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:24:39.902061 master-0 kubenswrapper[18707]: I0320 09:24:39.902024 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:24:39.959334 master-0 kubenswrapper[18707]: I0320 09:24:39.959230 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:24:39.959334 master-0 kubenswrapper[18707]: I0320 09:24:39.959285 18707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-ovn-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:24:39.959334 master-0 kubenswrapper[18707]: I0320 09:24:39.959299 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbwpm\" (UniqueName: \"kubernetes.io/projected/15fc9432-99bc-41a1-a87b-0dc3c351182e-kube-api-access-gbwpm\") on node \"master-0\" DevicePath \"\"" Mar 20 09:24:39.959334 master-0 kubenswrapper[18707]: I0320 09:24:39.959308 18707 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/15fc9432-99bc-41a1-a87b-0dc3c351182e-ovncontroller-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:24:39.959334 master-0 kubenswrapper[18707]: I0320 09:24:39.959320 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/15fc9432-99bc-41a1-a87b-0dc3c351182e-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:24:40.215234 master-0 kubenswrapper[18707]: I0320 09:24:40.214898 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb" 
event={"ID":"15fc9432-99bc-41a1-a87b-0dc3c351182e","Type":"ContainerDied","Data":"2d29561cc275377979f672a192fe9a093b0c53e01eac8c22301c937cd3d3d1c4"} Mar 20 09:24:40.215234 master-0 kubenswrapper[18707]: I0320 09:24:40.215028 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d29561cc275377979f672a192fe9a093b0c53e01eac8c22301c937cd3d3d1c4" Mar 20 09:24:40.215234 master-0 kubenswrapper[18707]: I0320 09:24:40.215124 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-dataplane-step-2-edpm-b-tm4wb" Mar 20 09:24:40.311214 master-0 kubenswrapper[18707]: I0320 09:24:40.311099 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j"] Mar 20 09:24:40.312378 master-0 kubenswrapper[18707]: E0320 09:24:40.311782 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fc9432-99bc-41a1-a87b-0dc3c351182e" containerName="ovn-dataplane-step-2-edpm-b" Mar 20 09:24:40.312378 master-0 kubenswrapper[18707]: I0320 09:24:40.311802 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fc9432-99bc-41a1-a87b-0dc3c351182e" containerName="ovn-dataplane-step-2-edpm-b" Mar 20 09:24:40.312378 master-0 kubenswrapper[18707]: I0320 09:24:40.312125 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="15fc9432-99bc-41a1-a87b-0dc3c351182e" containerName="ovn-dataplane-step-2-edpm-b" Mar 20 09:24:40.313143 master-0 kubenswrapper[18707]: I0320 09:24:40.313092 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.317781 master-0 kubenswrapper[18707]: I0320 09:24:40.317737 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b" Mar 20 09:24:40.327405 master-0 kubenswrapper[18707]: I0320 09:24:40.327342 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j"] Mar 20 09:24:40.374350 master-0 kubenswrapper[18707]: I0320 09:24:40.374275 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7g4p\" (UniqueName: \"kubernetes.io/projected/f3d103fe-3311-412d-9b59-740d586e71c9-kube-api-access-h7g4p\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.374350 master-0 kubenswrapper[18707]: I0320 09:24:40.374354 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.374763 master-0 kubenswrapper[18707]: I0320 09:24:40.374478 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.374763 master-0 
kubenswrapper[18707]: I0320 09:24:40.374523 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.374763 master-0 kubenswrapper[18707]: I0320 09:24:40.374620 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.374763 master-0 kubenswrapper[18707]: I0320 09:24:40.374697 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.484209 master-0 kubenswrapper[18707]: I0320 09:24:40.476856 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.484209 master-0 kubenswrapper[18707]: I0320 09:24:40.477010 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.484209 master-0 kubenswrapper[18707]: I0320 09:24:40.477389 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.484209 master-0 kubenswrapper[18707]: I0320 09:24:40.477484 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.484209 master-0 kubenswrapper[18707]: I0320 09:24:40.477592 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7g4p\" (UniqueName: \"kubernetes.io/projected/f3d103fe-3311-412d-9b59-740d586e71c9-kube-api-access-h7g4p\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.484209 master-0 kubenswrapper[18707]: I0320 09:24:40.477623 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" 
(UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.484209 master-0 kubenswrapper[18707]: I0320 09:24:40.480451 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.484209 master-0 kubenswrapper[18707]: I0320 09:24:40.481851 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.484209 master-0 kubenswrapper[18707]: I0320 09:24:40.482694 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.484209 master-0 kubenswrapper[18707]: I0320 09:24:40.482722 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.484825 master-0 
kubenswrapper[18707]: I0320 09:24:40.484701 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.497242 master-0 kubenswrapper[18707]: I0320 09:24:40.496764 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7g4p\" (UniqueName: \"kubernetes.io/projected/f3d103fe-3311-412d-9b59-740d586e71c9-kube-api-access-h7g4p\") pod \"neutron-metadata-dataplane-step-2-edpm-b-cm55j\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:40.588490 master-0 kubenswrapper[18707]: I0320 09:24:40.588384 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j"] Mar 20 09:24:40.676227 master-0 kubenswrapper[18707]: I0320 09:24:40.676142 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:24:41.231385 master-0 kubenswrapper[18707]: I0320 09:24:41.231248 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" event={"ID":"e2c649f6-80b4-4425-b556-d58186eb6663","Type":"ContainerStarted","Data":"bd20c3ca57d3c27cdee5fd35489dc7e796a6d8fd929b5333ef150cb64a02cbb4"} Mar 20 09:24:41.283583 master-0 kubenswrapper[18707]: I0320 09:24:41.283528 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j"] Mar 20 09:24:42.244909 master-0 kubenswrapper[18707]: I0320 09:24:42.244822 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" event={"ID":"f3d103fe-3311-412d-9b59-740d586e71c9","Type":"ContainerStarted","Data":"9d882e77b0150d92ceed9efd962da68f8dd28ccf00e815a5dc124eea4ddba185"} Mar 20 09:24:42.244909 master-0 kubenswrapper[18707]: I0320 09:24:42.244916 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" event={"ID":"f3d103fe-3311-412d-9b59-740d586e71c9","Type":"ContainerStarted","Data":"e207f23818aa197cc75fb5b154ba289eb241b34b57b10ed4d9fa4897eae639dd"} Mar 20 09:24:42.248268 master-0 kubenswrapper[18707]: I0320 09:24:42.248230 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" event={"ID":"e2c649f6-80b4-4425-b556-d58186eb6663","Type":"ContainerStarted","Data":"c2859b3403b902923d716b4cce5bc1080dcb69ed2e41a0e015b30355339bf05e"} Mar 20 09:24:42.281222 master-0 kubenswrapper[18707]: I0320 09:24:42.277719 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" podStartSLOduration=1.8171023530000001 podStartE2EDuration="2.27769073s" podCreationTimestamp="2026-03-20 09:24:40 
+0000 UTC" firstStartedPulling="2026-03-20 09:24:41.286124156 +0000 UTC m=+2626.442304522" lastFinishedPulling="2026-03-20 09:24:41.746712543 +0000 UTC m=+2626.902892899" observedRunningTime="2026-03-20 09:24:42.264050264 +0000 UTC m=+2627.420230620" watchObservedRunningTime="2026-03-20 09:24:42.27769073 +0000 UTC m=+2627.433871106" Mar 20 09:24:42.304238 master-0 kubenswrapper[18707]: I0320 09:24:42.300967 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" podStartSLOduration=2.578155509 podStartE2EDuration="3.300925807s" podCreationTimestamp="2026-03-20 09:24:39 +0000 UTC" firstStartedPulling="2026-03-20 09:24:40.593808259 +0000 UTC m=+2625.749988615" lastFinishedPulling="2026-03-20 09:24:41.316578557 +0000 UTC m=+2626.472758913" observedRunningTime="2026-03-20 09:24:42.285693736 +0000 UTC m=+2627.441874122" watchObservedRunningTime="2026-03-20 09:24:42.300925807 +0000 UTC m=+2627.457106173" Mar 20 09:25:22.746619 master-0 kubenswrapper[18707]: I0320 09:25:22.746546 18707 generic.go:334] "Generic (PLEG): container finished" podID="f3d103fe-3311-412d-9b59-740d586e71c9" containerID="9d882e77b0150d92ceed9efd962da68f8dd28ccf00e815a5dc124eea4ddba185" exitCode=2 Mar 20 09:25:22.746619 master-0 kubenswrapper[18707]: I0320 09:25:22.746605 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" event={"ID":"f3d103fe-3311-412d-9b59-740d586e71c9","Type":"ContainerDied","Data":"9d882e77b0150d92ceed9efd962da68f8dd28ccf00e815a5dc124eea4ddba185"} Mar 20 09:25:23.763892 master-0 kubenswrapper[18707]: I0320 09:25:23.763804 18707 generic.go:334] "Generic (PLEG): container finished" podID="e2c649f6-80b4-4425-b556-d58186eb6663" containerID="c2859b3403b902923d716b4cce5bc1080dcb69ed2e41a0e015b30355339bf05e" exitCode=2 Mar 20 09:25:23.764582 master-0 kubenswrapper[18707]: I0320 09:25:23.763909 18707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" event={"ID":"e2c649f6-80b4-4425-b556-d58186eb6663","Type":"ContainerDied","Data":"c2859b3403b902923d716b4cce5bc1080dcb69ed2e41a0e015b30355339bf05e"} Mar 20 09:25:24.314817 master-0 kubenswrapper[18707]: I0320 09:25:24.314753 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:25:24.408113 master-0 kubenswrapper[18707]: I0320 09:25:24.408042 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-neutron-metadata-combined-ca-bundle\") pod \"f3d103fe-3311-412d-9b59-740d586e71c9\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " Mar 20 09:25:24.408371 master-0 kubenswrapper[18707]: I0320 09:25:24.408137 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-nova-metadata-neutron-config-0\") pod \"f3d103fe-3311-412d-9b59-740d586e71c9\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " Mar 20 09:25:24.408371 master-0 kubenswrapper[18707]: I0320 09:25:24.408171 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7g4p\" (UniqueName: \"kubernetes.io/projected/f3d103fe-3311-412d-9b59-740d586e71c9-kube-api-access-h7g4p\") pod \"f3d103fe-3311-412d-9b59-740d586e71c9\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " Mar 20 09:25:24.408371 master-0 kubenswrapper[18707]: I0320 09:25:24.408314 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-ssh-key-edpm-b\") pod \"f3d103fe-3311-412d-9b59-740d586e71c9\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " Mar 
20 09:25:24.408478 master-0 kubenswrapper[18707]: I0320 09:25:24.408377 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-inventory\") pod \"f3d103fe-3311-412d-9b59-740d586e71c9\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " Mar 20 09:25:24.408740 master-0 kubenswrapper[18707]: I0320 09:25:24.408707 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f3d103fe-3311-412d-9b59-740d586e71c9\" (UID: \"f3d103fe-3311-412d-9b59-740d586e71c9\") " Mar 20 09:25:24.412792 master-0 kubenswrapper[18707]: I0320 09:25:24.412753 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f3d103fe-3311-412d-9b59-740d586e71c9" (UID: "f3d103fe-3311-412d-9b59-740d586e71c9"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:25:24.415445 master-0 kubenswrapper[18707]: I0320 09:25:24.415382 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3d103fe-3311-412d-9b59-740d586e71c9-kube-api-access-h7g4p" (OuterVolumeSpecName: "kube-api-access-h7g4p") pod "f3d103fe-3311-412d-9b59-740d586e71c9" (UID: "f3d103fe-3311-412d-9b59-740d586e71c9"). InnerVolumeSpecName "kube-api-access-h7g4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:25:24.440770 master-0 kubenswrapper[18707]: I0320 09:25:24.439696 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f3d103fe-3311-412d-9b59-740d586e71c9" (UID: "f3d103fe-3311-412d-9b59-740d586e71c9"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:25:24.440770 master-0 kubenswrapper[18707]: I0320 09:25:24.439964 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f3d103fe-3311-412d-9b59-740d586e71c9" (UID: "f3d103fe-3311-412d-9b59-740d586e71c9"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:25:24.441578 master-0 kubenswrapper[18707]: I0320 09:25:24.441534 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "f3d103fe-3311-412d-9b59-740d586e71c9" (UID: "f3d103fe-3311-412d-9b59-740d586e71c9"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:25:24.444391 master-0 kubenswrapper[18707]: I0320 09:25:24.444342 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-inventory" (OuterVolumeSpecName: "inventory") pod "f3d103fe-3311-412d-9b59-740d586e71c9" (UID: "f3d103fe-3311-412d-9b59-740d586e71c9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:25:24.513216 master-0 kubenswrapper[18707]: I0320 09:25:24.513138 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:25:24.513353 master-0 kubenswrapper[18707]: I0320 09:25:24.513225 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:25:24.513353 master-0 kubenswrapper[18707]: I0320 09:25:24.513243 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:25:24.513353 master-0 kubenswrapper[18707]: I0320 09:25:24.513260 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7g4p\" (UniqueName: \"kubernetes.io/projected/f3d103fe-3311-412d-9b59-740d586e71c9-kube-api-access-h7g4p\") on node \"master-0\" DevicePath \"\"" Mar 20 09:25:24.513353 master-0 kubenswrapper[18707]: I0320 09:25:24.513278 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:25:24.513353 master-0 kubenswrapper[18707]: I0320 09:25:24.513291 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d103fe-3311-412d-9b59-740d586e71c9-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:25:24.778903 master-0 kubenswrapper[18707]: I0320 09:25:24.778756 18707 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" Mar 20 09:25:24.779585 master-0 kubenswrapper[18707]: I0320 09:25:24.779306 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-cm55j" event={"ID":"f3d103fe-3311-412d-9b59-740d586e71c9","Type":"ContainerDied","Data":"e207f23818aa197cc75fb5b154ba289eb241b34b57b10ed4d9fa4897eae639dd"} Mar 20 09:25:24.779585 master-0 kubenswrapper[18707]: I0320 09:25:24.779364 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e207f23818aa197cc75fb5b154ba289eb241b34b57b10ed4d9fa4897eae639dd" Mar 20 09:25:25.304909 master-0 kubenswrapper[18707]: I0320 09:25:25.304859 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" Mar 20 09:25:25.435025 master-0 kubenswrapper[18707]: I0320 09:25:25.434947 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e2c649f6-80b4-4425-b556-d58186eb6663\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " Mar 20 09:25:25.435313 master-0 kubenswrapper[18707]: I0320 09:25:25.435059 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-neutron-metadata-combined-ca-bundle\") pod \"e2c649f6-80b4-4425-b556-d58186eb6663\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " Mar 20 09:25:25.435398 master-0 kubenswrapper[18707]: I0320 09:25:25.435379 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ct2n\" (UniqueName: 
\"kubernetes.io/projected/e2c649f6-80b4-4425-b556-d58186eb6663-kube-api-access-6ct2n\") pod \"e2c649f6-80b4-4425-b556-d58186eb6663\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " Mar 20 09:25:25.435671 master-0 kubenswrapper[18707]: I0320 09:25:25.435621 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-inventory\") pod \"e2c649f6-80b4-4425-b556-d58186eb6663\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " Mar 20 09:25:25.435772 master-0 kubenswrapper[18707]: I0320 09:25:25.435738 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-ssh-key-edpm-a\") pod \"e2c649f6-80b4-4425-b556-d58186eb6663\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " Mar 20 09:25:25.436187 master-0 kubenswrapper[18707]: I0320 09:25:25.436150 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-nova-metadata-neutron-config-0\") pod \"e2c649f6-80b4-4425-b556-d58186eb6663\" (UID: \"e2c649f6-80b4-4425-b556-d58186eb6663\") " Mar 20 09:25:25.456138 master-0 kubenswrapper[18707]: I0320 09:25:25.456034 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c649f6-80b4-4425-b556-d58186eb6663-kube-api-access-6ct2n" (OuterVolumeSpecName: "kube-api-access-6ct2n") pod "e2c649f6-80b4-4425-b556-d58186eb6663" (UID: "e2c649f6-80b4-4425-b556-d58186eb6663"). InnerVolumeSpecName "kube-api-access-6ct2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:25:25.457532 master-0 kubenswrapper[18707]: I0320 09:25:25.457414 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e2c649f6-80b4-4425-b556-d58186eb6663" (UID: "e2c649f6-80b4-4425-b556-d58186eb6663"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:25:25.473643 master-0 kubenswrapper[18707]: I0320 09:25:25.473575 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-inventory" (OuterVolumeSpecName: "inventory") pod "e2c649f6-80b4-4425-b556-d58186eb6663" (UID: "e2c649f6-80b4-4425-b556-d58186eb6663"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:25:25.476643 master-0 kubenswrapper[18707]: I0320 09:25:25.476600 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e2c649f6-80b4-4425-b556-d58186eb6663" (UID: "e2c649f6-80b4-4425-b556-d58186eb6663"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:25:25.479301 master-0 kubenswrapper[18707]: I0320 09:25:25.479225 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e2c649f6-80b4-4425-b556-d58186eb6663" (UID: "e2c649f6-80b4-4425-b556-d58186eb6663"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:25:25.488917 master-0 kubenswrapper[18707]: I0320 09:25:25.488861 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "e2c649f6-80b4-4425-b556-d58186eb6663" (UID: "e2c649f6-80b4-4425-b556-d58186eb6663"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:25:25.539893 master-0 kubenswrapper[18707]: I0320 09:25:25.539810 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:25:25.539893 master-0 kubenswrapper[18707]: I0320 09:25:25.539866 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:25:25.539893 master-0 kubenswrapper[18707]: I0320 09:25:25.539883 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:25:25.539893 master-0 kubenswrapper[18707]: I0320 09:25:25.539901 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:25:25.539893 master-0 kubenswrapper[18707]: I0320 09:25:25.539916 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e2c649f6-80b4-4425-b556-d58186eb6663-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:25:25.540637 master-0 kubenswrapper[18707]: I0320 09:25:25.539932 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ct2n\" (UniqueName: \"kubernetes.io/projected/e2c649f6-80b4-4425-b556-d58186eb6663-kube-api-access-6ct2n\") on node \"master-0\" DevicePath \"\"" Mar 20 09:25:25.793403 master-0 kubenswrapper[18707]: I0320 09:25:25.793272 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j" event={"ID":"e2c649f6-80b4-4425-b556-d58186eb6663","Type":"ContainerDied","Data":"bd20c3ca57d3c27cdee5fd35489dc7e796a6d8fd929b5333ef150cb64a02cbb4"} Mar 20 09:25:25.793403 master-0 kubenswrapper[18707]: I0320 09:25:25.793330 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd20c3ca57d3c27cdee5fd35489dc7e796a6d8fd929b5333ef150cb64a02cbb4" Mar 20 09:25:25.793403 master-0 kubenswrapper[18707]: I0320 09:25:25.793284 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-jnw8j"
Mar 20 09:25:32.047223 master-0 kubenswrapper[18707]: I0320 09:25:32.040610 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"]
Mar 20 09:25:32.047223 master-0 kubenswrapper[18707]: E0320 09:25:32.041235 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d103fe-3311-412d-9b59-740d586e71c9" containerName="neutron-metadata-dataplane-step-2-edpm-b"
Mar 20 09:25:32.047223 master-0 kubenswrapper[18707]: I0320 09:25:32.041254 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d103fe-3311-412d-9b59-740d586e71c9" containerName="neutron-metadata-dataplane-step-2-edpm-b"
Mar 20 09:25:32.047223 master-0 kubenswrapper[18707]: E0320 09:25:32.041288 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c649f6-80b4-4425-b556-d58186eb6663" containerName="neutron-metadata-dataplane-step-2-edpm-a"
Mar 20 09:25:32.047223 master-0 kubenswrapper[18707]: I0320 09:25:32.041296 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c649f6-80b4-4425-b556-d58186eb6663" containerName="neutron-metadata-dataplane-step-2-edpm-a"
Mar 20 09:25:32.047223 master-0 kubenswrapper[18707]: I0320 09:25:32.041585 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c649f6-80b4-4425-b556-d58186eb6663" containerName="neutron-metadata-dataplane-step-2-edpm-a"
Mar 20 09:25:32.047223 master-0 kubenswrapper[18707]: I0320 09:25:32.041606 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d103fe-3311-412d-9b59-740d586e71c9" containerName="neutron-metadata-dataplane-step-2-edpm-b"
Mar 20 09:25:32.047223 master-0 kubenswrapper[18707]: I0320 09:25:32.043131 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.047223 master-0 kubenswrapper[18707]: I0320 09:25:32.046808 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 09:25:32.047223 master-0 kubenswrapper[18707]: I0320 09:25:32.047106 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 09:25:32.048797 master-0 kubenswrapper[18707]: I0320 09:25:32.047317 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Mar 20 09:25:32.065220 master-0 kubenswrapper[18707]: I0320 09:25:32.060460 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Mar 20 09:25:32.065220 master-0 kubenswrapper[18707]: I0320 09:25:32.061035 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b"
Mar 20 09:25:32.065220 master-0 kubenswrapper[18707]: I0320 09:25:32.065050 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"]
Mar 20 09:25:32.086157 master-0 kubenswrapper[18707]: I0320 09:25:32.077859 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"]
Mar 20 09:25:32.086157 master-0 kubenswrapper[18707]: I0320 09:25:32.079729 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.086157 master-0 kubenswrapper[18707]: I0320 09:25:32.082735 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a"
Mar 20 09:25:32.149063 master-0 kubenswrapper[18707]: I0320 09:25:32.147889 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.149063 master-0 kubenswrapper[18707]: I0320 09:25:32.148131 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.149063 master-0 kubenswrapper[18707]: I0320 09:25:32.148271 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npnkk\" (UniqueName: \"kubernetes.io/projected/c66786be-0342-44f0-90c4-a1797c4ea873-kube-api-access-npnkk\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.149063 master-0 kubenswrapper[18707]: I0320 09:25:32.148326 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.149063 master-0 kubenswrapper[18707]: I0320 09:25:32.148608 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.149063 master-0 kubenswrapper[18707]: I0320 09:25:32.148642 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.178595 master-0 kubenswrapper[18707]: I0320 09:25:32.178536 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"]
Mar 20 09:25:32.251962 master-0 kubenswrapper[18707]: I0320 09:25:32.251875 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.252169 master-0 kubenswrapper[18707]: I0320 09:25:32.252018 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.252169 master-0 kubenswrapper[18707]: I0320 09:25:32.252074 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv2rd\" (UniqueName: \"kubernetes.io/projected/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-kube-api-access-tv2rd\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.252888 master-0 kubenswrapper[18707]: I0320 09:25:32.252853 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npnkk\" (UniqueName: \"kubernetes.io/projected/c66786be-0342-44f0-90c4-a1797c4ea873-kube-api-access-npnkk\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.253064 master-0 kubenswrapper[18707]: I0320 09:25:32.253046 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.253301 master-0 kubenswrapper[18707]: I0320 09:25:32.253285 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.253615 master-0 kubenswrapper[18707]: I0320 09:25:32.253598 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.253712 master-0 kubenswrapper[18707]: I0320 09:25:32.253699 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.253793 master-0 kubenswrapper[18707]: I0320 09:25:32.253780 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.253973 master-0 kubenswrapper[18707]: I0320 09:25:32.253959 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.254069 master-0 kubenswrapper[18707]: I0320 09:25:32.254052 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.254332 master-0 kubenswrapper[18707]: I0320 09:25:32.254315 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.258518 master-0 kubenswrapper[18707]: I0320 09:25:32.258489 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.258740 master-0 kubenswrapper[18707]: I0320 09:25:32.258707 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.258843 master-0 kubenswrapper[18707]: I0320 09:25:32.258809 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.259089 master-0 kubenswrapper[18707]: I0320 09:25:32.259049 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.259885 master-0 kubenswrapper[18707]: I0320 09:25:32.259861 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.274800 master-0 kubenswrapper[18707]: I0320 09:25:32.274779 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npnkk\" (UniqueName: \"kubernetes.io/projected/c66786be-0342-44f0-90c4-a1797c4ea873-kube-api-access-npnkk\") pod \"neutron-metadata-dataplane-step-2-edpm-b-4vq4r\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:32.357810 master-0 kubenswrapper[18707]: I0320 09:25:32.357644 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.358539 master-0 kubenswrapper[18707]: I0320 09:25:32.358491 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.358653 master-0 kubenswrapper[18707]: I0320 09:25:32.358580 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.358747 master-0 kubenswrapper[18707]: I0320 09:25:32.358724 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv2rd\" (UniqueName: \"kubernetes.io/projected/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-kube-api-access-tv2rd\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.358892 master-0 kubenswrapper[18707]: I0320 09:25:32.358865 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.359071 master-0 kubenswrapper[18707]: I0320 09:25:32.359042 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.361741 master-0 kubenswrapper[18707]: I0320 09:25:32.361706 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.362647 master-0 kubenswrapper[18707]: I0320 09:25:32.362598 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.363668 master-0 kubenswrapper[18707]: I0320 09:25:32.362911 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.363668 master-0 kubenswrapper[18707]: I0320 09:25:32.363591 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.365721 master-0 kubenswrapper[18707]: I0320 09:25:32.365669 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.381040 master-0 kubenswrapper[18707]: I0320 09:25:32.380981 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv2rd\" (UniqueName: \"kubernetes.io/projected/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-kube-api-access-tv2rd\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kwdqb\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.420577 master-0 kubenswrapper[18707]: I0320 09:25:32.420503 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:25:32.420802 master-0 kubenswrapper[18707]: I0320 09:25:32.420514 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:25:33.009646 master-0 kubenswrapper[18707]: I0320 09:25:33.009569 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"]
Mar 20 09:25:33.016446 master-0 kubenswrapper[18707]: W0320 09:25:33.016386 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc66786be_0342_44f0_90c4_a1797c4ea873.slice/crio-a828fd2c54b2588263bfd5bfc68372757df7a3ff4e1ccd760dc7839d675394f2 WatchSource:0}: Error finding container a828fd2c54b2588263bfd5bfc68372757df7a3ff4e1ccd760dc7839d675394f2: Status 404 returned error can't find the container with id a828fd2c54b2588263bfd5bfc68372757df7a3ff4e1ccd760dc7839d675394f2
Mar 20 09:25:33.697323 master-0 kubenswrapper[18707]: W0320 09:25:33.695078 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc047bd53_7522_4d1c_8cff_c53bd43bbfa6.slice/crio-3d4f05058dd182ca2a2e5c8369ef34feedb3b2225ae0bbf84a8ca3ae578085de WatchSource:0}: Error finding container 3d4f05058dd182ca2a2e5c8369ef34feedb3b2225ae0bbf84a8ca3ae578085de: Status 404 returned error can't find the container with id 3d4f05058dd182ca2a2e5c8369ef34feedb3b2225ae0bbf84a8ca3ae578085de
Mar 20 09:25:33.702647 master-0 kubenswrapper[18707]: I0320 09:25:33.702568 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"]
Mar 20 09:25:33.904510 master-0 kubenswrapper[18707]: I0320 09:25:33.904421 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb" event={"ID":"c047bd53-7522-4d1c-8cff-c53bd43bbfa6","Type":"ContainerStarted","Data":"3d4f05058dd182ca2a2e5c8369ef34feedb3b2225ae0bbf84a8ca3ae578085de"}
Mar 20 09:25:33.905821 master-0 kubenswrapper[18707]: I0320 09:25:33.905783 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r" event={"ID":"c66786be-0342-44f0-90c4-a1797c4ea873","Type":"ContainerStarted","Data":"a828fd2c54b2588263bfd5bfc68372757df7a3ff4e1ccd760dc7839d675394f2"}
Mar 20 09:25:34.921048 master-0 kubenswrapper[18707]: I0320 09:25:34.920935 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb" event={"ID":"c047bd53-7522-4d1c-8cff-c53bd43bbfa6","Type":"ContainerStarted","Data":"9687487d3cba33e94dbb1f4b6e86357877d80ef9f0bd7585ddbf6a61afa4dd34"}
Mar 20 09:25:34.923678 master-0 kubenswrapper[18707]: I0320 09:25:34.923612 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r" event={"ID":"c66786be-0342-44f0-90c4-a1797c4ea873","Type":"ContainerStarted","Data":"8e629d2e54c1671e757560fe4d565d0b8c8bbdc35a13a6b7778ab9ee2c0d1c1b"}
Mar 20 09:25:34.951878 master-0 kubenswrapper[18707]: I0320 09:25:34.951622 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb" podStartSLOduration=2.290775238 podStartE2EDuration="2.951598935s" podCreationTimestamp="2026-03-20 09:25:32 +0000 UTC" firstStartedPulling="2026-03-20 09:25:33.698252402 +0000 UTC m=+2678.854432758" lastFinishedPulling="2026-03-20 09:25:34.359076099 +0000 UTC m=+2679.515256455" observedRunningTime="2026-03-20 09:25:34.937059864 +0000 UTC m=+2680.093240240" watchObservedRunningTime="2026-03-20 09:25:34.951598935 +0000 UTC m=+2680.107779291"
Mar 20 09:25:34.966667 master-0 kubenswrapper[18707]: I0320 09:25:34.966532 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r" podStartSLOduration=1.869857572 podStartE2EDuration="2.966511716s" podCreationTimestamp="2026-03-20 09:25:32 +0000 UTC" firstStartedPulling="2026-03-20 09:25:33.019640213 +0000 UTC m=+2678.175820569" lastFinishedPulling="2026-03-20 09:25:34.116294347 +0000 UTC m=+2679.272474713" observedRunningTime="2026-03-20 09:25:34.955747142 +0000 UTC m=+2680.111927498" watchObservedRunningTime="2026-03-20 09:25:34.966511716 +0000 UTC m=+2680.122692062"
Mar 20 09:26:14.493462 master-0 kubenswrapper[18707]: I0320 09:26:14.493283 18707 generic.go:334] "Generic (PLEG): container finished" podID="c047bd53-7522-4d1c-8cff-c53bd43bbfa6" containerID="9687487d3cba33e94dbb1f4b6e86357877d80ef9f0bd7585ddbf6a61afa4dd34" exitCode=2
Mar 20 09:26:14.493462 master-0 kubenswrapper[18707]: I0320 09:26:14.493365 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb" event={"ID":"c047bd53-7522-4d1c-8cff-c53bd43bbfa6","Type":"ContainerDied","Data":"9687487d3cba33e94dbb1f4b6e86357877d80ef9f0bd7585ddbf6a61afa4dd34"}
Mar 20 09:26:14.495671 master-0 kubenswrapper[18707]: I0320 09:26:14.495616 18707 generic.go:334] "Generic (PLEG): container finished" podID="c66786be-0342-44f0-90c4-a1797c4ea873" containerID="8e629d2e54c1671e757560fe4d565d0b8c8bbdc35a13a6b7778ab9ee2c0d1c1b" exitCode=2
Mar 20 09:26:14.495817 master-0 kubenswrapper[18707]: I0320 09:26:14.495676 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r" event={"ID":"c66786be-0342-44f0-90c4-a1797c4ea873","Type":"ContainerDied","Data":"8e629d2e54c1671e757560fe4d565d0b8c8bbdc35a13a6b7778ab9ee2c0d1c1b"}
Mar 20 09:26:16.012833 master-0 kubenswrapper[18707]: I0320 09:26:16.012753 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:26:16.084302 master-0 kubenswrapper[18707]: I0320 09:26:16.084159 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-nova-metadata-neutron-config-0\") pod \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") "
Mar 20 09:26:16.084302 master-0 kubenswrapper[18707]: I0320 09:26:16.084301 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-inventory\") pod \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") "
Mar 20 09:26:16.084647 master-0 kubenswrapper[18707]: I0320 09:26:16.084331 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") "
Mar 20 09:26:16.084647 master-0 kubenswrapper[18707]: I0320 09:26:16.084460 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-neutron-metadata-combined-ca-bundle\") pod \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") "
Mar 20 09:26:16.084647 master-0 kubenswrapper[18707]: I0320 09:26:16.084537 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv2rd\" (UniqueName: \"kubernetes.io/projected/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-kube-api-access-tv2rd\") pod \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") "
Mar 20 09:26:16.084778 master-0 kubenswrapper[18707]: I0320 09:26:16.084646 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-ssh-key-edpm-a\") pod \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\" (UID: \"c047bd53-7522-4d1c-8cff-c53bd43bbfa6\") "
Mar 20 09:26:16.089787 master-0 kubenswrapper[18707]: I0320 09:26:16.089690 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c047bd53-7522-4d1c-8cff-c53bd43bbfa6" (UID: "c047bd53-7522-4d1c-8cff-c53bd43bbfa6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:26:16.089787 master-0 kubenswrapper[18707]: I0320 09:26:16.089754 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-kube-api-access-tv2rd" (OuterVolumeSpecName: "kube-api-access-tv2rd") pod "c047bd53-7522-4d1c-8cff-c53bd43bbfa6" (UID: "c047bd53-7522-4d1c-8cff-c53bd43bbfa6"). InnerVolumeSpecName "kube-api-access-tv2rd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:26:16.111171 master-0 kubenswrapper[18707]: I0320 09:26:16.111113 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "c047bd53-7522-4d1c-8cff-c53bd43bbfa6" (UID: "c047bd53-7522-4d1c-8cff-c53bd43bbfa6"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:26:16.114814 master-0 kubenswrapper[18707]: I0320 09:26:16.114753 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c047bd53-7522-4d1c-8cff-c53bd43bbfa6" (UID: "c047bd53-7522-4d1c-8cff-c53bd43bbfa6"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:26:16.117261 master-0 kubenswrapper[18707]: I0320 09:26:16.117214 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-inventory" (OuterVolumeSpecName: "inventory") pod "c047bd53-7522-4d1c-8cff-c53bd43bbfa6" (UID: "c047bd53-7522-4d1c-8cff-c53bd43bbfa6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:26:16.135356 master-0 kubenswrapper[18707]: I0320 09:26:16.134870 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c047bd53-7522-4d1c-8cff-c53bd43bbfa6" (UID: "c047bd53-7522-4d1c-8cff-c53bd43bbfa6"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:26:16.187014 master-0 kubenswrapper[18707]: I0320 09:26:16.186905 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\""
Mar 20 09:26:16.187014 master-0 kubenswrapper[18707]: I0320 09:26:16.186943 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:26:16.187014 master-0 kubenswrapper[18707]: I0320 09:26:16.186956 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-inventory\") on node \"master-0\" DevicePath \"\""
Mar 20 09:26:16.187014 master-0 kubenswrapper[18707]: I0320 09:26:16.186966 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:26:16.187014 master-0 kubenswrapper[18707]: I0320 09:26:16.186979 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:26:16.187014 master-0 kubenswrapper[18707]: I0320 09:26:16.186989 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv2rd\" (UniqueName: \"kubernetes.io/projected/c047bd53-7522-4d1c-8cff-c53bd43bbfa6-kube-api-access-tv2rd\") on node \"master-0\" DevicePath \"\""
Mar 20 09:26:16.528531 master-0 kubenswrapper[18707]: I0320 09:26:16.528335 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb" event={"ID":"c047bd53-7522-4d1c-8cff-c53bd43bbfa6","Type":"ContainerDied","Data":"3d4f05058dd182ca2a2e5c8369ef34feedb3b2225ae0bbf84a8ca3ae578085de"}
Mar 20 09:26:16.528896 master-0 kubenswrapper[18707]: I0320 09:26:16.528876 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d4f05058dd182ca2a2e5c8369ef34feedb3b2225ae0bbf84a8ca3ae578085de"
Mar 20 09:26:16.529052 master-0 kubenswrapper[18707]: I0320 09:26:16.528562 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kwdqb"
Mar 20 09:26:17.200706 master-0 kubenswrapper[18707]: I0320 09:26:17.200054 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r"
Mar 20 09:26:17.315894 master-0 kubenswrapper[18707]: I0320 09:26:17.315845 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-ssh-key-edpm-b\") pod \"c66786be-0342-44f0-90c4-a1797c4ea873\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") "
Mar 20 09:26:17.316446 master-0 kubenswrapper[18707]: I0320 09:26:17.316399 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npnkk\" (UniqueName: \"kubernetes.io/projected/c66786be-0342-44f0-90c4-a1797c4ea873-kube-api-access-npnkk\") pod \"c66786be-0342-44f0-90c4-a1797c4ea873\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") "
Mar 20 09:26:17.316607 master-0 kubenswrapper[18707]: I0320 09:26:17.316591 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c66786be-0342-44f0-90c4-a1797c4ea873\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") "
Mar 20 09:26:17.317353 master-0 kubenswrapper[18707]: I0320 09:26:17.317337 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-nova-metadata-neutron-config-0\") pod \"c66786be-0342-44f0-90c4-a1797c4ea873\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") "
Mar 20 09:26:17.317465 master-0 kubenswrapper[18707]: I0320 09:26:17.317450 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-inventory\") pod \"c66786be-0342-44f0-90c4-a1797c4ea873\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") "
Mar 20 09:26:17.317557 master-0 kubenswrapper[18707]: I0320 09:26:17.317544 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-neutron-metadata-combined-ca-bundle\") pod \"c66786be-0342-44f0-90c4-a1797c4ea873\" (UID: \"c66786be-0342-44f0-90c4-a1797c4ea873\") "
Mar 20 09:26:17.333311 master-0 kubenswrapper[18707]: I0320 09:26:17.331690 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c66786be-0342-44f0-90c4-a1797c4ea873-kube-api-access-npnkk" (OuterVolumeSpecName: "kube-api-access-npnkk") pod "c66786be-0342-44f0-90c4-a1797c4ea873" (UID: "c66786be-0342-44f0-90c4-a1797c4ea873"). InnerVolumeSpecName "kube-api-access-npnkk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:26:17.333311 master-0 kubenswrapper[18707]: I0320 09:26:17.331827 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c66786be-0342-44f0-90c4-a1797c4ea873" (UID: "c66786be-0342-44f0-90c4-a1797c4ea873"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:26:17.355456 master-0 kubenswrapper[18707]: I0320 09:26:17.355208 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "c66786be-0342-44f0-90c4-a1797c4ea873" (UID: "c66786be-0342-44f0-90c4-a1797c4ea873"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:26:17.356275 master-0 kubenswrapper[18707]: I0320 09:26:17.356141 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c66786be-0342-44f0-90c4-a1797c4ea873" (UID: "c66786be-0342-44f0-90c4-a1797c4ea873"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:26:17.364178 master-0 kubenswrapper[18707]: I0320 09:26:17.364092 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-inventory" (OuterVolumeSpecName: "inventory") pod "c66786be-0342-44f0-90c4-a1797c4ea873" (UID: "c66786be-0342-44f0-90c4-a1797c4ea873"). InnerVolumeSpecName "inventory".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:26:17.364950 master-0 kubenswrapper[18707]: I0320 09:26:17.364622 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c66786be-0342-44f0-90c4-a1797c4ea873" (UID: "c66786be-0342-44f0-90c4-a1797c4ea873"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:26:17.422303 master-0 kubenswrapper[18707]: I0320 09:26:17.422012 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:26:17.422303 master-0 kubenswrapper[18707]: I0320 09:26:17.422097 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:26:17.422303 master-0 kubenswrapper[18707]: I0320 09:26:17.422111 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:26:17.422303 master-0 kubenswrapper[18707]: I0320 09:26:17.422126 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:26:17.422303 master-0 kubenswrapper[18707]: I0320 09:26:17.422140 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npnkk\" (UniqueName: 
\"kubernetes.io/projected/c66786be-0342-44f0-90c4-a1797c4ea873-kube-api-access-npnkk\") on node \"master-0\" DevicePath \"\"" Mar 20 09:26:17.422303 master-0 kubenswrapper[18707]: I0320 09:26:17.422156 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c66786be-0342-44f0-90c4-a1797c4ea873-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:26:17.548501 master-0 kubenswrapper[18707]: I0320 09:26:17.548412 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r" event={"ID":"c66786be-0342-44f0-90c4-a1797c4ea873","Type":"ContainerDied","Data":"a828fd2c54b2588263bfd5bfc68372757df7a3ff4e1ccd760dc7839d675394f2"} Mar 20 09:26:17.548501 master-0 kubenswrapper[18707]: I0320 09:26:17.548505 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a828fd2c54b2588263bfd5bfc68372757df7a3ff4e1ccd760dc7839d675394f2" Mar 20 09:26:17.548501 master-0 kubenswrapper[18707]: I0320 09:26:17.548463 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-4vq4r" Mar 20 09:26:34.041945 master-0 kubenswrapper[18707]: I0320 09:26:34.041880 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs"] Mar 20 09:26:34.042702 master-0 kubenswrapper[18707]: E0320 09:26:34.042649 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c047bd53-7522-4d1c-8cff-c53bd43bbfa6" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 09:26:34.042702 master-0 kubenswrapper[18707]: I0320 09:26:34.042670 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c047bd53-7522-4d1c-8cff-c53bd43bbfa6" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 09:26:34.042779 master-0 kubenswrapper[18707]: E0320 09:26:34.042719 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66786be-0342-44f0-90c4-a1797c4ea873" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 09:26:34.042779 master-0 kubenswrapper[18707]: I0320 09:26:34.042727 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66786be-0342-44f0-90c4-a1797c4ea873" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 09:26:34.043779 master-0 kubenswrapper[18707]: I0320 09:26:34.043036 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c66786be-0342-44f0-90c4-a1797c4ea873" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 09:26:34.043779 master-0 kubenswrapper[18707]: I0320 09:26:34.043112 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c047bd53-7522-4d1c-8cff-c53bd43bbfa6" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 09:26:34.044552 master-0 kubenswrapper[18707]: I0320 09:26:34.044090 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.047788 master-0 kubenswrapper[18707]: I0320 09:26:34.047737 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 20 09:26:34.048471 master-0 kubenswrapper[18707]: I0320 09:26:34.048404 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a" Mar 20 09:26:34.050961 master-0 kubenswrapper[18707]: I0320 09:26:34.050924 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:26:34.054756 master-0 kubenswrapper[18707]: I0320 09:26:34.054703 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 20 09:26:34.054949 master-0 kubenswrapper[18707]: I0320 09:26:34.054844 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 09:26:34.087859 master-0 kubenswrapper[18707]: I0320 09:26:34.087790 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9"] Mar 20 09:26:34.091238 master-0 kubenswrapper[18707]: I0320 09:26:34.091193 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.094279 master-0 kubenswrapper[18707]: I0320 09:26:34.093958 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b" Mar 20 09:26:34.100846 master-0 kubenswrapper[18707]: I0320 09:26:34.100785 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs"] Mar 20 09:26:34.115843 master-0 kubenswrapper[18707]: I0320 09:26:34.113719 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9"] Mar 20 09:26:34.148411 master-0 kubenswrapper[18707]: I0320 09:26:34.148347 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.148411 master-0 kubenswrapper[18707]: I0320 09:26:34.148409 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.148732 master-0 kubenswrapper[18707]: I0320 09:26:34.148463 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5nx6\" (UniqueName: \"kubernetes.io/projected/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-kube-api-access-c5nx6\") pod 
\"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.148732 master-0 kubenswrapper[18707]: I0320 09:26:34.148522 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.148732 master-0 kubenswrapper[18707]: I0320 09:26:34.148550 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.148732 master-0 kubenswrapper[18707]: I0320 09:26:34.148597 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.250397 master-0 kubenswrapper[18707]: I0320 09:26:34.250321 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: 
\"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.250692 master-0 kubenswrapper[18707]: I0320 09:26:34.250565 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.250762 master-0 kubenswrapper[18707]: I0320 09:26:34.250689 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5nx6\" (UniqueName: \"kubernetes.io/projected/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-kube-api-access-c5nx6\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.250848 master-0 kubenswrapper[18707]: I0320 09:26:34.250816 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.250912 master-0 kubenswrapper[18707]: I0320 09:26:34.250864 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " 
pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.251379 master-0 kubenswrapper[18707]: I0320 09:26:34.251346 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.251538 master-0 kubenswrapper[18707]: I0320 09:26:34.251496 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.256345 master-0 kubenswrapper[18707]: I0320 09:26:34.253911 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.256345 master-0 kubenswrapper[18707]: I0320 09:26:34.254263 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.256345 master-0 kubenswrapper[18707]: I0320 09:26:34.254787 18707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.256345 master-0 kubenswrapper[18707]: I0320 09:26:34.254817 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.256345 master-0 kubenswrapper[18707]: I0320 09:26:34.255442 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.256345 master-0 kubenswrapper[18707]: I0320 09:26:34.255506 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.256345 master-0 kubenswrapper[18707]: I0320 09:26:34.255544 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-neutron-metadata-combined-ca-bundle\") 
pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.256345 master-0 kubenswrapper[18707]: I0320 09:26:34.255567 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwpwb\" (UniqueName: \"kubernetes.io/projected/c28d364d-8035-4a3c-8cff-0b599adb269f-kube-api-access-rwpwb\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.258530 master-0 kubenswrapper[18707]: I0320 09:26:34.258500 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.260591 master-0 kubenswrapper[18707]: I0320 09:26:34.260546 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.266527 master-0 kubenswrapper[18707]: I0320 09:26:34.266483 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5nx6\" (UniqueName: \"kubernetes.io/projected/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-kube-api-access-c5nx6\") pod \"neutron-metadata-dataplane-step-2-edpm-a-48qbs\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " 
pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.357915 master-0 kubenswrapper[18707]: I0320 09:26:34.357836 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.358246 master-0 kubenswrapper[18707]: I0320 09:26:34.358006 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.358246 master-0 kubenswrapper[18707]: I0320 09:26:34.358065 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwpwb\" (UniqueName: \"kubernetes.io/projected/c28d364d-8035-4a3c-8cff-0b599adb269f-kube-api-access-rwpwb\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.358246 master-0 kubenswrapper[18707]: I0320 09:26:34.358106 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.358246 master-0 kubenswrapper[18707]: I0320 09:26:34.358149 18707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.358246 master-0 kubenswrapper[18707]: I0320 09:26:34.358235 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.361821 master-0 kubenswrapper[18707]: I0320 09:26:34.361554 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.361821 master-0 kubenswrapper[18707]: I0320 09:26:34.361651 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.361994 master-0 kubenswrapper[18707]: I0320 09:26:34.361966 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.363111 master-0 kubenswrapper[18707]: I0320 09:26:34.362679 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.363111 master-0 kubenswrapper[18707]: I0320 09:26:34.363033 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.373617 master-0 kubenswrapper[18707]: I0320 09:26:34.373547 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:26:34.377736 master-0 kubenswrapper[18707]: I0320 09:26:34.377664 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwpwb\" (UniqueName: \"kubernetes.io/projected/c28d364d-8035-4a3c-8cff-0b599adb269f-kube-api-access-rwpwb\") pod \"neutron-metadata-dataplane-step-2-edpm-b-xbtq9\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.426432 master-0 kubenswrapper[18707]: I0320 09:26:34.421538 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:26:34.903959 master-0 kubenswrapper[18707]: I0320 09:26:34.903912 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs"] Mar 20 09:26:35.136454 master-0 kubenswrapper[18707]: I0320 09:26:35.136382 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9"] Mar 20 09:26:35.799520 master-0 kubenswrapper[18707]: I0320 09:26:35.799397 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" event={"ID":"c28d364d-8035-4a3c-8cff-0b599adb269f","Type":"ContainerStarted","Data":"b5d13bec65a6be59a67b0fa3f7d41326aaeeed788ef18e7051b76ce113376601"} Mar 20 09:26:35.800539 master-0 kubenswrapper[18707]: I0320 09:26:35.800479 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" event={"ID":"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4","Type":"ContainerStarted","Data":"69cf0a948600adec824c0279df0e7322be4ae13445e22fc344c38ff3127e023e"} Mar 20 09:26:36.814540 master-0 kubenswrapper[18707]: I0320 09:26:36.814466 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" event={"ID":"c28d364d-8035-4a3c-8cff-0b599adb269f","Type":"ContainerStarted","Data":"b8b86b86ce4642c48958e0f1e3cecbab97435f6633a7ff2752c62f69e3eef70e"} Mar 20 09:26:36.816738 master-0 kubenswrapper[18707]: I0320 09:26:36.816667 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" event={"ID":"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4","Type":"ContainerStarted","Data":"f86183ec7c3a054ecf20f5af2f573c1932f7ae6d57248a3cb19d74ecfb6cd1d8"} Mar 20 09:26:36.843282 master-0 kubenswrapper[18707]: I0320 09:26:36.843144 18707 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" podStartSLOduration=2.3334571889999998 podStartE2EDuration="2.843119288s" podCreationTimestamp="2026-03-20 09:26:34 +0000 UTC" firstStartedPulling="2026-03-20 09:26:35.141343479 +0000 UTC m=+2740.297523835" lastFinishedPulling="2026-03-20 09:26:35.651005568 +0000 UTC m=+2740.807185934" observedRunningTime="2026-03-20 09:26:36.832167426 +0000 UTC m=+2741.988347802" watchObservedRunningTime="2026-03-20 09:26:36.843119288 +0000 UTC m=+2741.999299644" Mar 20 09:26:36.865891 master-0 kubenswrapper[18707]: I0320 09:26:36.865538 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" podStartSLOduration=2.213455797 podStartE2EDuration="2.865509364s" podCreationTimestamp="2026-03-20 09:26:34 +0000 UTC" firstStartedPulling="2026-03-20 09:26:34.905838473 +0000 UTC m=+2740.062018829" lastFinishedPulling="2026-03-20 09:26:35.55789203 +0000 UTC m=+2740.714072396" observedRunningTime="2026-03-20 09:26:36.857062304 +0000 UTC m=+2742.013242660" watchObservedRunningTime="2026-03-20 09:26:36.865509364 +0000 UTC m=+2742.021689730" Mar 20 09:26:41.300550 master-0 kubenswrapper[18707]: I0320 09:26:41.300512 18707 trace.go:236] Trace[569432525]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-0" (20-Mar-2026 09:26:40.299) (total time: 1001ms): Mar 20 09:26:41.300550 master-0 kubenswrapper[18707]: Trace[569432525]: [1.001265834s] [1.001265834s] END Mar 20 09:27:16.333870 master-0 kubenswrapper[18707]: I0320 09:27:16.333799 18707 generic.go:334] "Generic (PLEG): container finished" podID="c28d364d-8035-4a3c-8cff-0b599adb269f" containerID="b8b86b86ce4642c48958e0f1e3cecbab97435f6633a7ff2752c62f69e3eef70e" exitCode=2 Mar 20 09:27:16.333870 master-0 kubenswrapper[18707]: I0320 09:27:16.333867 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" event={"ID":"c28d364d-8035-4a3c-8cff-0b599adb269f","Type":"ContainerDied","Data":"b8b86b86ce4642c48958e0f1e3cecbab97435f6633a7ff2752c62f69e3eef70e"} Mar 20 09:27:17.347315 master-0 kubenswrapper[18707]: I0320 09:27:17.347228 18707 generic.go:334] "Generic (PLEG): container finished" podID="67dd2d0e-98ae-4a8d-9eef-4c0822667ac4" containerID="f86183ec7c3a054ecf20f5af2f573c1932f7ae6d57248a3cb19d74ecfb6cd1d8" exitCode=2 Mar 20 09:27:17.347910 master-0 kubenswrapper[18707]: I0320 09:27:17.347322 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" event={"ID":"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4","Type":"ContainerDied","Data":"f86183ec7c3a054ecf20f5af2f573c1932f7ae6d57248a3cb19d74ecfb6cd1d8"} Mar 20 09:27:17.902817 master-0 kubenswrapper[18707]: I0320 09:27:17.902776 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:27:18.013668 master-0 kubenswrapper[18707]: I0320 09:27:18.012953 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-ssh-key-edpm-b\") pod \"c28d364d-8035-4a3c-8cff-0b599adb269f\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " Mar 20 09:27:18.013668 master-0 kubenswrapper[18707]: I0320 09:27:18.013067 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-inventory\") pod \"c28d364d-8035-4a3c-8cff-0b599adb269f\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " Mar 20 09:27:18.013668 master-0 kubenswrapper[18707]: I0320 09:27:18.013160 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-neutron-metadata-combined-ca-bundle\") pod \"c28d364d-8035-4a3c-8cff-0b599adb269f\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " Mar 20 09:27:18.013668 master-0 kubenswrapper[18707]: I0320 09:27:18.013253 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwpwb\" (UniqueName: \"kubernetes.io/projected/c28d364d-8035-4a3c-8cff-0b599adb269f-kube-api-access-rwpwb\") pod \"c28d364d-8035-4a3c-8cff-0b599adb269f\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " Mar 20 09:27:18.013668 master-0 kubenswrapper[18707]: I0320 09:27:18.013316 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c28d364d-8035-4a3c-8cff-0b599adb269f\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " Mar 20 09:27:18.013668 master-0 kubenswrapper[18707]: I0320 09:27:18.013355 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-nova-metadata-neutron-config-0\") pod \"c28d364d-8035-4a3c-8cff-0b599adb269f\" (UID: \"c28d364d-8035-4a3c-8cff-0b599adb269f\") " Mar 20 09:27:18.026539 master-0 kubenswrapper[18707]: I0320 09:27:18.026455 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c28d364d-8035-4a3c-8cff-0b599adb269f" (UID: "c28d364d-8035-4a3c-8cff-0b599adb269f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:27:18.028670 master-0 kubenswrapper[18707]: I0320 09:27:18.028622 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c28d364d-8035-4a3c-8cff-0b599adb269f-kube-api-access-rwpwb" (OuterVolumeSpecName: "kube-api-access-rwpwb") pod "c28d364d-8035-4a3c-8cff-0b599adb269f" (UID: "c28d364d-8035-4a3c-8cff-0b599adb269f"). InnerVolumeSpecName "kube-api-access-rwpwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:27:18.046371 master-0 kubenswrapper[18707]: I0320 09:27:18.046319 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-inventory" (OuterVolumeSpecName: "inventory") pod "c28d364d-8035-4a3c-8cff-0b599adb269f" (UID: "c28d364d-8035-4a3c-8cff-0b599adb269f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:27:18.049302 master-0 kubenswrapper[18707]: I0320 09:27:18.049252 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c28d364d-8035-4a3c-8cff-0b599adb269f" (UID: "c28d364d-8035-4a3c-8cff-0b599adb269f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:27:18.051404 master-0 kubenswrapper[18707]: I0320 09:27:18.051363 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "c28d364d-8035-4a3c-8cff-0b599adb269f" (UID: "c28d364d-8035-4a3c-8cff-0b599adb269f"). InnerVolumeSpecName "ssh-key-edpm-b". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:27:18.061792 master-0 kubenswrapper[18707]: I0320 09:27:18.061592 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c28d364d-8035-4a3c-8cff-0b599adb269f" (UID: "c28d364d-8035-4a3c-8cff-0b599adb269f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:27:18.117082 master-0 kubenswrapper[18707]: I0320 09:27:18.117036 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:27:18.117082 master-0 kubenswrapper[18707]: I0320 09:27:18.117080 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:27:18.117296 master-0 kubenswrapper[18707]: I0320 09:27:18.117093 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:27:18.117296 master-0 kubenswrapper[18707]: I0320 09:27:18.117106 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwpwb\" (UniqueName: \"kubernetes.io/projected/c28d364d-8035-4a3c-8cff-0b599adb269f-kube-api-access-rwpwb\") on node \"master-0\" DevicePath \"\"" Mar 20 09:27:18.117296 master-0 kubenswrapper[18707]: I0320 09:27:18.117116 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:27:18.117296 master-0 kubenswrapper[18707]: I0320 09:27:18.117130 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c28d364d-8035-4a3c-8cff-0b599adb269f-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:27:18.360223 master-0 kubenswrapper[18707]: I0320 09:27:18.360092 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" Mar 20 09:27:18.362660 master-0 kubenswrapper[18707]: I0320 09:27:18.362526 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-xbtq9" event={"ID":"c28d364d-8035-4a3c-8cff-0b599adb269f","Type":"ContainerDied","Data":"b5d13bec65a6be59a67b0fa3f7d41326aaeeed788ef18e7051b76ce113376601"} Mar 20 09:27:18.362660 master-0 kubenswrapper[18707]: I0320 09:27:18.362594 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5d13bec65a6be59a67b0fa3f7d41326aaeeed788ef18e7051b76ce113376601" Mar 20 09:27:18.907008 master-0 kubenswrapper[18707]: I0320 09:27:18.906963 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:27:19.040006 master-0 kubenswrapper[18707]: I0320 09:27:19.039943 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-inventory\") pod \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " Mar 20 09:27:19.040252 master-0 kubenswrapper[18707]: I0320 09:27:19.040026 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-ssh-key-edpm-a\") pod \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " Mar 20 09:27:19.040315 master-0 kubenswrapper[18707]: I0320 09:27:19.040267 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-neutron-metadata-combined-ca-bundle\") pod \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " Mar 20 09:27:19.040315 master-0 kubenswrapper[18707]: I0320 09:27:19.040303 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " Mar 20 09:27:19.040479 master-0 kubenswrapper[18707]: I0320 09:27:19.040372 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-nova-metadata-neutron-config-0\") pod \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\" (UID: 
\"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " Mar 20 09:27:19.040563 master-0 kubenswrapper[18707]: I0320 09:27:19.040523 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5nx6\" (UniqueName: \"kubernetes.io/projected/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-kube-api-access-c5nx6\") pod \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\" (UID: \"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4\") " Mar 20 09:27:19.043255 master-0 kubenswrapper[18707]: I0320 09:27:19.043132 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "67dd2d0e-98ae-4a8d-9eef-4c0822667ac4" (UID: "67dd2d0e-98ae-4a8d-9eef-4c0822667ac4"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:27:19.044840 master-0 kubenswrapper[18707]: I0320 09:27:19.044785 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-kube-api-access-c5nx6" (OuterVolumeSpecName: "kube-api-access-c5nx6") pod "67dd2d0e-98ae-4a8d-9eef-4c0822667ac4" (UID: "67dd2d0e-98ae-4a8d-9eef-4c0822667ac4"). InnerVolumeSpecName "kube-api-access-c5nx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:27:19.072767 master-0 kubenswrapper[18707]: I0320 09:27:19.072642 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "67dd2d0e-98ae-4a8d-9eef-4c0822667ac4" (UID: "67dd2d0e-98ae-4a8d-9eef-4c0822667ac4"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:27:19.073083 master-0 kubenswrapper[18707]: I0320 09:27:19.073016 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "67dd2d0e-98ae-4a8d-9eef-4c0822667ac4" (UID: "67dd2d0e-98ae-4a8d-9eef-4c0822667ac4"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:27:19.075482 master-0 kubenswrapper[18707]: I0320 09:27:19.075420 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-inventory" (OuterVolumeSpecName: "inventory") pod "67dd2d0e-98ae-4a8d-9eef-4c0822667ac4" (UID: "67dd2d0e-98ae-4a8d-9eef-4c0822667ac4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:27:19.084210 master-0 kubenswrapper[18707]: I0320 09:27:19.081877 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "67dd2d0e-98ae-4a8d-9eef-4c0822667ac4" (UID: "67dd2d0e-98ae-4a8d-9eef-4c0822667ac4"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:27:19.143339 master-0 kubenswrapper[18707]: I0320 09:27:19.143261 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:27:19.143339 master-0 kubenswrapper[18707]: I0320 09:27:19.143312 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:27:19.143339 master-0 kubenswrapper[18707]: I0320 09:27:19.143326 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:27:19.143339 master-0 kubenswrapper[18707]: I0320 09:27:19.143342 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:27:19.143339 master-0 kubenswrapper[18707]: I0320 09:27:19.143354 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:27:19.143805 master-0 kubenswrapper[18707]: I0320 09:27:19.143365 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5nx6\" (UniqueName: \"kubernetes.io/projected/67dd2d0e-98ae-4a8d-9eef-4c0822667ac4-kube-api-access-c5nx6\") on node \"master-0\" DevicePath \"\"" Mar 20 09:27:19.369959 master-0 kubenswrapper[18707]: I0320 09:27:19.369818 18707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" event={"ID":"67dd2d0e-98ae-4a8d-9eef-4c0822667ac4","Type":"ContainerDied","Data":"69cf0a948600adec824c0279df0e7322be4ae13445e22fc344c38ff3127e023e"} Mar 20 09:27:19.369959 master-0 kubenswrapper[18707]: I0320 09:27:19.369865 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-48qbs" Mar 20 09:27:19.371056 master-0 kubenswrapper[18707]: I0320 09:27:19.369868 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69cf0a948600adec824c0279df0e7322be4ae13445e22fc344c38ff3127e023e" Mar 20 09:27:55.049687 master-0 kubenswrapper[18707]: I0320 09:27:55.049623 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd"] Mar 20 09:27:55.050464 master-0 kubenswrapper[18707]: E0320 09:27:55.050348 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c28d364d-8035-4a3c-8cff-0b599adb269f" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 09:27:55.050464 master-0 kubenswrapper[18707]: I0320 09:27:55.050364 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28d364d-8035-4a3c-8cff-0b599adb269f" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 09:27:55.050464 master-0 kubenswrapper[18707]: E0320 09:27:55.050400 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dd2d0e-98ae-4a8d-9eef-4c0822667ac4" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 09:27:55.050464 master-0 kubenswrapper[18707]: I0320 09:27:55.050408 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dd2d0e-98ae-4a8d-9eef-4c0822667ac4" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 09:27:55.050684 master-0 kubenswrapper[18707]: I0320 09:27:55.050659 18707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="67dd2d0e-98ae-4a8d-9eef-4c0822667ac4" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 09:27:55.050752 master-0 kubenswrapper[18707]: I0320 09:27:55.050722 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c28d364d-8035-4a3c-8cff-0b599adb269f" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 09:27:55.051671 master-0 kubenswrapper[18707]: I0320 09:27:55.051580 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.054751 master-0 kubenswrapper[18707]: I0320 09:27:55.054709 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:27:55.055027 master-0 kubenswrapper[18707]: I0320 09:27:55.055007 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 09:27:55.055319 master-0 kubenswrapper[18707]: I0320 09:27:55.055299 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b" Mar 20 09:27:55.055440 master-0 kubenswrapper[18707]: I0320 09:27:55.055425 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 20 09:27:55.055683 master-0 kubenswrapper[18707]: I0320 09:27:55.055580 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 20 09:27:55.086506 master-0 kubenswrapper[18707]: I0320 09:27:55.086356 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd"] Mar 20 09:27:55.098985 master-0 kubenswrapper[18707]: I0320 09:27:55.098915 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-inventory\") pod 
\"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.099253 master-0 kubenswrapper[18707]: I0320 09:27:55.099063 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5dmk\" (UniqueName: \"kubernetes.io/projected/49891315-b41e-48c6-a4d7-823c47f97bf0-kube-api-access-n5dmk\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.099253 master-0 kubenswrapper[18707]: I0320 09:27:55.099104 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.099253 master-0 kubenswrapper[18707]: I0320 09:27:55.099140 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.099253 master-0 kubenswrapper[18707]: I0320 09:27:55.099214 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: 
\"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.099872 master-0 kubenswrapper[18707]: I0320 09:27:55.099822 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.201630 master-0 kubenswrapper[18707]: I0320 09:27:55.201552 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5dmk\" (UniqueName: \"kubernetes.io/projected/49891315-b41e-48c6-a4d7-823c47f97bf0-kube-api-access-n5dmk\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.201630 master-0 kubenswrapper[18707]: I0320 09:27:55.201626 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.201935 master-0 kubenswrapper[18707]: I0320 09:27:55.201659 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.201935 master-0 
kubenswrapper[18707]: I0320 09:27:55.201706 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.201935 master-0 kubenswrapper[18707]: I0320 09:27:55.201875 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.202639 master-0 kubenswrapper[18707]: I0320 09:27:55.202505 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.206301 master-0 kubenswrapper[18707]: I0320 09:27:55.206257 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.206944 master-0 kubenswrapper[18707]: I0320 09:27:55.206892 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.207274 master-0 kubenswrapper[18707]: I0320 09:27:55.207233 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.208523 master-0 kubenswrapper[18707]: I0320 09:27:55.208438 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.210062 master-0 kubenswrapper[18707]: I0320 09:27:55.210024 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.224453 master-0 kubenswrapper[18707]: I0320 09:27:55.224400 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5dmk\" (UniqueName: \"kubernetes.io/projected/49891315-b41e-48c6-a4d7-823c47f97bf0-kube-api-access-n5dmk\") pod \"neutron-metadata-dataplane-step-2-edpm-b-qvqxd\" (UID: 
\"49891315-b41e-48c6-a4d7-823c47f97bf0\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:55.403028 master-0 kubenswrapper[18707]: I0320 09:27:55.402851 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:27:56.129145 master-0 kubenswrapper[18707]: I0320 09:27:56.128823 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd"] Mar 20 09:27:56.708073 master-0 kubenswrapper[18707]: I0320 09:27:56.708031 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:27:56.832116 master-0 kubenswrapper[18707]: I0320 09:27:56.831855 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" event={"ID":"49891315-b41e-48c6-a4d7-823c47f97bf0","Type":"ContainerStarted","Data":"74936676dc1a6c3b2caddd737c74e8bdfa0a2166d20f41df1d0a302012d167f7"} Mar 20 09:27:57.123863 master-0 kubenswrapper[18707]: I0320 09:27:57.123776 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242"] Mar 20 09:27:57.125705 master-0 kubenswrapper[18707]: I0320 09:27:57.125652 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242"] Mar 20 09:27:57.125821 master-0 kubenswrapper[18707]: I0320 09:27:57.125791 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.128793 master-0 kubenswrapper[18707]: I0320 09:27:57.128756 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a" Mar 20 09:27:57.188968 master-0 kubenswrapper[18707]: I0320 09:27:57.188910 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.189720 master-0 kubenswrapper[18707]: I0320 09:27:57.189698 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.189874 master-0 kubenswrapper[18707]: I0320 09:27:57.189857 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.190083 master-0 kubenswrapper[18707]: I0320 09:27:57.190056 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x25rl\" (UniqueName: 
\"kubernetes.io/projected/ef5ff3d9-0b16-413a-9660-f387fb6b0089-kube-api-access-x25rl\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.190256 master-0 kubenswrapper[18707]: I0320 09:27:57.190235 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.190495 master-0 kubenswrapper[18707]: I0320 09:27:57.190472 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.293172 master-0 kubenswrapper[18707]: I0320 09:27:57.293000 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.293432 master-0 kubenswrapper[18707]: I0320 09:27:57.293240 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.293432 master-0 kubenswrapper[18707]: I0320 09:27:57.293285 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.293432 master-0 kubenswrapper[18707]: I0320 09:27:57.293350 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x25rl\" (UniqueName: \"kubernetes.io/projected/ef5ff3d9-0b16-413a-9660-f387fb6b0089-kube-api-access-x25rl\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.293432 master-0 kubenswrapper[18707]: I0320 09:27:57.293400 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.293588 master-0 kubenswrapper[18707]: I0320 09:27:57.293497 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" 
Mar 20 09:27:57.298342 master-0 kubenswrapper[18707]: I0320 09:27:57.297117 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.298342 master-0 kubenswrapper[18707]: I0320 09:27:57.297399 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.298798 master-0 kubenswrapper[18707]: I0320 09:27:57.298738 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.299392 master-0 kubenswrapper[18707]: I0320 09:27:57.299370 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.310883 master-0 kubenswrapper[18707]: I0320 09:27:57.310577 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.313236 master-0 kubenswrapper[18707]: I0320 09:27:57.312736 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x25rl\" (UniqueName: \"kubernetes.io/projected/ef5ff3d9-0b16-413a-9660-f387fb6b0089-kube-api-access-x25rl\") pod \"neutron-metadata-dataplane-step-2-edpm-a-kf242\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.447156 master-0 kubenswrapper[18707]: I0320 09:27:57.447090 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:27:57.845824 master-0 kubenswrapper[18707]: I0320 09:27:57.845762 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" event={"ID":"49891315-b41e-48c6-a4d7-823c47f97bf0","Type":"ContainerStarted","Data":"4fdf8c2c2f41721b62e1a5a30972cc7b04d94fcf43980b5d1bce8709abd0c847"} Mar 20 09:27:57.884597 master-0 kubenswrapper[18707]: I0320 09:27:57.884345 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" podStartSLOduration=2.318747473 podStartE2EDuration="2.884320952s" podCreationTimestamp="2026-03-20 09:27:55 +0000 UTC" firstStartedPulling="2026-03-20 09:27:56.139752666 +0000 UTC m=+2821.295933022" lastFinishedPulling="2026-03-20 09:27:56.705326145 +0000 UTC m=+2821.861506501" observedRunningTime="2026-03-20 09:27:57.872751253 +0000 UTC m=+2823.028931659" watchObservedRunningTime="2026-03-20 09:27:57.884320952 +0000 UTC m=+2823.040501308" Mar 20 09:27:58.071254 master-0 kubenswrapper[18707]: 
I0320 09:27:58.071148 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242"] Mar 20 09:27:58.863321 master-0 kubenswrapper[18707]: I0320 09:27:58.861508 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" event={"ID":"ef5ff3d9-0b16-413a-9660-f387fb6b0089","Type":"ContainerStarted","Data":"32a8b43c3b8fc3c6092d5da5a0b35183afc5c64e718df6f95dbd317df04b6eb1"} Mar 20 09:27:58.863321 master-0 kubenswrapper[18707]: I0320 09:27:58.861574 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" event={"ID":"ef5ff3d9-0b16-413a-9660-f387fb6b0089","Type":"ContainerStarted","Data":"c423b0728d07ef5fb2ec7ef66d26b92055ebe6e599dcb60b14fcd945d58050ff"} Mar 20 09:27:58.883602 master-0 kubenswrapper[18707]: I0320 09:27:58.883522 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" podStartSLOduration=1.495185999 podStartE2EDuration="1.883502548s" podCreationTimestamp="2026-03-20 09:27:57 +0000 UTC" firstStartedPulling="2026-03-20 09:27:58.071280647 +0000 UTC m=+2823.227461013" lastFinishedPulling="2026-03-20 09:27:58.459597216 +0000 UTC m=+2823.615777562" observedRunningTime="2026-03-20 09:27:58.878884466 +0000 UTC m=+2824.035064822" watchObservedRunningTime="2026-03-20 09:27:58.883502548 +0000 UTC m=+2824.039682904" Mar 20 09:28:37.323905 master-0 kubenswrapper[18707]: I0320 09:28:37.323602 18707 generic.go:334] "Generic (PLEG): container finished" podID="49891315-b41e-48c6-a4d7-823c47f97bf0" containerID="4fdf8c2c2f41721b62e1a5a30972cc7b04d94fcf43980b5d1bce8709abd0c847" exitCode=2 Mar 20 09:28:37.323905 master-0 kubenswrapper[18707]: I0320 09:28:37.323895 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" 
event={"ID":"49891315-b41e-48c6-a4d7-823c47f97bf0","Type":"ContainerDied","Data":"4fdf8c2c2f41721b62e1a5a30972cc7b04d94fcf43980b5d1bce8709abd0c847"} Mar 20 09:28:38.864257 master-0 kubenswrapper[18707]: I0320 09:28:38.864209 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:28:38.967894 master-0 kubenswrapper[18707]: I0320 09:28:38.967845 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5dmk\" (UniqueName: \"kubernetes.io/projected/49891315-b41e-48c6-a4d7-823c47f97bf0-kube-api-access-n5dmk\") pod \"49891315-b41e-48c6-a4d7-823c47f97bf0\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " Mar 20 09:28:38.968342 master-0 kubenswrapper[18707]: I0320 09:28:38.967910 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-neutron-metadata-combined-ca-bundle\") pod \"49891315-b41e-48c6-a4d7-823c47f97bf0\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " Mar 20 09:28:38.968342 master-0 kubenswrapper[18707]: I0320 09:28:38.967944 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-ssh-key-edpm-b\") pod \"49891315-b41e-48c6-a4d7-823c47f97bf0\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " Mar 20 09:28:38.968342 master-0 kubenswrapper[18707]: I0320 09:28:38.968059 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-nova-metadata-neutron-config-0\") pod \"49891315-b41e-48c6-a4d7-823c47f97bf0\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " Mar 20 09:28:38.968342 master-0 kubenswrapper[18707]: I0320 09:28:38.968129 
18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-inventory\") pod \"49891315-b41e-48c6-a4d7-823c47f97bf0\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " Mar 20 09:28:38.998844 master-0 kubenswrapper[18707]: I0320 09:28:38.968450 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"49891315-b41e-48c6-a4d7-823c47f97bf0\" (UID: \"49891315-b41e-48c6-a4d7-823c47f97bf0\") " Mar 20 09:28:38.998844 master-0 kubenswrapper[18707]: I0320 09:28:38.972536 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "49891315-b41e-48c6-a4d7-823c47f97bf0" (UID: "49891315-b41e-48c6-a4d7-823c47f97bf0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:38.998844 master-0 kubenswrapper[18707]: I0320 09:28:38.974798 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49891315-b41e-48c6-a4d7-823c47f97bf0-kube-api-access-n5dmk" (OuterVolumeSpecName: "kube-api-access-n5dmk") pod "49891315-b41e-48c6-a4d7-823c47f97bf0" (UID: "49891315-b41e-48c6-a4d7-823c47f97bf0"). InnerVolumeSpecName "kube-api-access-n5dmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:28:38.998844 master-0 kubenswrapper[18707]: I0320 09:28:38.998767 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "49891315-b41e-48c6-a4d7-823c47f97bf0" (UID: "49891315-b41e-48c6-a4d7-823c47f97bf0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:39.000444 master-0 kubenswrapper[18707]: I0320 09:28:39.000390 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "49891315-b41e-48c6-a4d7-823c47f97bf0" (UID: "49891315-b41e-48c6-a4d7-823c47f97bf0"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:39.000817 master-0 kubenswrapper[18707]: I0320 09:28:39.000769 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-inventory" (OuterVolumeSpecName: "inventory") pod "49891315-b41e-48c6-a4d7-823c47f97bf0" (UID: "49891315-b41e-48c6-a4d7-823c47f97bf0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:39.004365 master-0 kubenswrapper[18707]: I0320 09:28:39.004320 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "49891315-b41e-48c6-a4d7-823c47f97bf0" (UID: "49891315-b41e-48c6-a4d7-823c47f97bf0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:39.081845 master-0 kubenswrapper[18707]: I0320 09:28:39.081755 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5dmk\" (UniqueName: \"kubernetes.io/projected/49891315-b41e-48c6-a4d7-823c47f97bf0-kube-api-access-n5dmk\") on node \"master-0\" DevicePath \"\"" Mar 20 09:28:39.081845 master-0 kubenswrapper[18707]: I0320 09:28:39.081809 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:28:39.081845 master-0 kubenswrapper[18707]: I0320 09:28:39.081825 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:28:39.081845 master-0 kubenswrapper[18707]: I0320 09:28:39.081839 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:28:39.081845 master-0 kubenswrapper[18707]: I0320 09:28:39.081851 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:28:39.081845 master-0 kubenswrapper[18707]: I0320 09:28:39.081861 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/49891315-b41e-48c6-a4d7-823c47f97bf0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:28:39.371689 master-0 kubenswrapper[18707]: I0320 09:28:39.370832 18707 generic.go:334] 
"Generic (PLEG): container finished" podID="ef5ff3d9-0b16-413a-9660-f387fb6b0089" containerID="32a8b43c3b8fc3c6092d5da5a0b35183afc5c64e718df6f95dbd317df04b6eb1" exitCode=2 Mar 20 09:28:39.371689 master-0 kubenswrapper[18707]: I0320 09:28:39.370914 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" event={"ID":"ef5ff3d9-0b16-413a-9660-f387fb6b0089","Type":"ContainerDied","Data":"32a8b43c3b8fc3c6092d5da5a0b35183afc5c64e718df6f95dbd317df04b6eb1"} Mar 20 09:28:39.374306 master-0 kubenswrapper[18707]: I0320 09:28:39.374209 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" event={"ID":"49891315-b41e-48c6-a4d7-823c47f97bf0","Type":"ContainerDied","Data":"74936676dc1a6c3b2caddd737c74e8bdfa0a2166d20f41df1d0a302012d167f7"} Mar 20 09:28:39.374306 master-0 kubenswrapper[18707]: I0320 09:28:39.374257 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74936676dc1a6c3b2caddd737c74e8bdfa0a2166d20f41df1d0a302012d167f7" Mar 20 09:28:39.374505 master-0 kubenswrapper[18707]: I0320 09:28:39.374400 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-qvqxd" Mar 20 09:28:40.890993 master-0 kubenswrapper[18707]: I0320 09:28:40.890938 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:28:41.045654 master-0 kubenswrapper[18707]: I0320 09:28:41.045116 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-ssh-key-edpm-a\") pod \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " Mar 20 09:28:41.046221 master-0 kubenswrapper[18707]: I0320 09:28:41.046161 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " Mar 20 09:28:41.046292 master-0 kubenswrapper[18707]: I0320 09:28:41.046241 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-nova-metadata-neutron-config-0\") pod \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " Mar 20 09:28:41.046475 master-0 kubenswrapper[18707]: I0320 09:28:41.046438 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-inventory\") pod \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " Mar 20 09:28:41.046546 master-0 kubenswrapper[18707]: I0320 09:28:41.046488 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x25rl\" (UniqueName: \"kubernetes.io/projected/ef5ff3d9-0b16-413a-9660-f387fb6b0089-kube-api-access-x25rl\") pod \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\" (UID: 
\"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " Mar 20 09:28:41.046546 master-0 kubenswrapper[18707]: I0320 09:28:41.046535 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-neutron-metadata-combined-ca-bundle\") pod \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\" (UID: \"ef5ff3d9-0b16-413a-9660-f387fb6b0089\") " Mar 20 09:28:41.052607 master-0 kubenswrapper[18707]: I0320 09:28:41.052546 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5ff3d9-0b16-413a-9660-f387fb6b0089-kube-api-access-x25rl" (OuterVolumeSpecName: "kube-api-access-x25rl") pod "ef5ff3d9-0b16-413a-9660-f387fb6b0089" (UID: "ef5ff3d9-0b16-413a-9660-f387fb6b0089"). InnerVolumeSpecName "kube-api-access-x25rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:28:41.052607 master-0 kubenswrapper[18707]: I0320 09:28:41.052535 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ef5ff3d9-0b16-413a-9660-f387fb6b0089" (UID: "ef5ff3d9-0b16-413a-9660-f387fb6b0089"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:41.073626 master-0 kubenswrapper[18707]: I0320 09:28:41.073562 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-inventory" (OuterVolumeSpecName: "inventory") pod "ef5ff3d9-0b16-413a-9660-f387fb6b0089" (UID: "ef5ff3d9-0b16-413a-9660-f387fb6b0089"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:41.074108 master-0 kubenswrapper[18707]: I0320 09:28:41.074048 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ef5ff3d9-0b16-413a-9660-f387fb6b0089" (UID: "ef5ff3d9-0b16-413a-9660-f387fb6b0089"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:41.078977 master-0 kubenswrapper[18707]: I0320 09:28:41.078258 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ef5ff3d9-0b16-413a-9660-f387fb6b0089" (UID: "ef5ff3d9-0b16-413a-9660-f387fb6b0089"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:41.081990 master-0 kubenswrapper[18707]: I0320 09:28:41.081942 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "ef5ff3d9-0b16-413a-9660-f387fb6b0089" (UID: "ef5ff3d9-0b16-413a-9660-f387fb6b0089"). InnerVolumeSpecName "ssh-key-edpm-a". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:28:41.149797 master-0 kubenswrapper[18707]: I0320 09:28:41.149749 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:28:41.150089 master-0 kubenswrapper[18707]: I0320 09:28:41.150074 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x25rl\" (UniqueName: \"kubernetes.io/projected/ef5ff3d9-0b16-413a-9660-f387fb6b0089-kube-api-access-x25rl\") on node \"master-0\" DevicePath \"\"" Mar 20 09:28:41.150160 master-0 kubenswrapper[18707]: I0320 09:28:41.150149 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:28:41.150521 master-0 kubenswrapper[18707]: I0320 09:28:41.150501 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:28:41.150618 master-0 kubenswrapper[18707]: I0320 09:28:41.150602 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:28:41.150702 master-0 kubenswrapper[18707]: I0320 09:28:41.150688 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ef5ff3d9-0b16-413a-9660-f387fb6b0089-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:28:41.396907 master-0 kubenswrapper[18707]: I0320 09:28:41.396767 18707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" event={"ID":"ef5ff3d9-0b16-413a-9660-f387fb6b0089","Type":"ContainerDied","Data":"c423b0728d07ef5fb2ec7ef66d26b92055ebe6e599dcb60b14fcd945d58050ff"} Mar 20 09:28:41.396907 master-0 kubenswrapper[18707]: I0320 09:28:41.396814 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c423b0728d07ef5fb2ec7ef66d26b92055ebe6e599dcb60b14fcd945d58050ff" Mar 20 09:28:41.396907 master-0 kubenswrapper[18707]: I0320 09:28:41.396813 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-kf242" Mar 20 09:29:56.056662 master-0 kubenswrapper[18707]: I0320 09:29:56.056584 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx"] Mar 20 09:29:56.057433 master-0 kubenswrapper[18707]: E0320 09:29:56.057322 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49891315-b41e-48c6-a4d7-823c47f97bf0" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 09:29:56.057433 master-0 kubenswrapper[18707]: I0320 09:29:56.057344 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="49891315-b41e-48c6-a4d7-823c47f97bf0" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 09:29:56.057433 master-0 kubenswrapper[18707]: E0320 09:29:56.057403 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5ff3d9-0b16-413a-9660-f387fb6b0089" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 09:29:56.057433 master-0 kubenswrapper[18707]: I0320 09:29:56.057415 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5ff3d9-0b16-413a-9660-f387fb6b0089" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 09:29:56.057805 master-0 kubenswrapper[18707]: I0320 09:29:56.057764 18707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="49891315-b41e-48c6-a4d7-823c47f97bf0" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 09:29:56.057879 master-0 kubenswrapper[18707]: I0320 09:29:56.057809 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5ff3d9-0b16-413a-9660-f387fb6b0089" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 09:29:56.058981 master-0 kubenswrapper[18707]: I0320 09:29:56.058939 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.076731 master-0 kubenswrapper[18707]: I0320 09:29:56.063056 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 20 09:29:56.076731 master-0 kubenswrapper[18707]: I0320 09:29:56.063532 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 20 09:29:56.076731 master-0 kubenswrapper[18707]: I0320 09:29:56.066464 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:29:56.076731 master-0 kubenswrapper[18707]: I0320 09:29:56.067147 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 09:29:56.076731 master-0 kubenswrapper[18707]: I0320 09:29:56.067336 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b" Mar 20 09:29:56.085356 master-0 kubenswrapper[18707]: I0320 09:29:56.083828 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx"] Mar 20 09:29:56.121005 master-0 kubenswrapper[18707]: I0320 09:29:56.120863 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nccx\" (UniqueName: 
\"kubernetes.io/projected/cbf24ba7-9890-4461-b037-967fc9619197-kube-api-access-7nccx\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.121280 master-0 kubenswrapper[18707]: I0320 09:29:56.121256 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.121429 master-0 kubenswrapper[18707]: I0320 09:29:56.121400 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.121617 master-0 kubenswrapper[18707]: I0320 09:29:56.121580 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.121792 master-0 kubenswrapper[18707]: I0320 09:29:56.121760 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.121880 master-0 kubenswrapper[18707]: I0320 09:29:56.121857 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.224553 master-0 kubenswrapper[18707]: I0320 09:29:56.224471 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nccx\" (UniqueName: \"kubernetes.io/projected/cbf24ba7-9890-4461-b037-967fc9619197-kube-api-access-7nccx\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.224864 master-0 kubenswrapper[18707]: I0320 09:29:56.224719 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.224980 master-0 kubenswrapper[18707]: I0320 09:29:56.224943 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: 
\"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.225217 master-0 kubenswrapper[18707]: I0320 09:29:56.225173 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.225480 master-0 kubenswrapper[18707]: I0320 09:29:56.225438 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.225559 master-0 kubenswrapper[18707]: I0320 09:29:56.225525 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.229394 master-0 kubenswrapper[18707]: I0320 09:29:56.228694 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 
09:29:56.229394 master-0 kubenswrapper[18707]: I0320 09:29:56.228969 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.229394 master-0 kubenswrapper[18707]: I0320 09:29:56.229107 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.229394 master-0 kubenswrapper[18707]: I0320 09:29:56.229299 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.232036 master-0 kubenswrapper[18707]: I0320 09:29:56.231622 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.252700 master-0 kubenswrapper[18707]: I0320 09:29:56.252599 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nccx\" (UniqueName: 
\"kubernetes.io/projected/cbf24ba7-9890-4461-b037-967fc9619197-kube-api-access-7nccx\") pod \"neutron-metadata-dataplane-step-2-edpm-b-fq5nx\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.405591 master-0 kubenswrapper[18707]: I0320 09:29:56.405426 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:29:56.994658 master-0 kubenswrapper[18707]: I0320 09:29:56.994598 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx"] Mar 20 09:29:57.000503 master-0 kubenswrapper[18707]: I0320 09:29:57.000460 18707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:29:57.468167 master-0 kubenswrapper[18707]: I0320 09:29:57.468088 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" event={"ID":"cbf24ba7-9890-4461-b037-967fc9619197","Type":"ContainerStarted","Data":"29c403cc317c829515731e0a580994c322dfa333c0f08c81dd04cfccd0d95695"} Mar 20 09:29:58.062011 master-0 kubenswrapper[18707]: I0320 09:29:58.060095 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9"] Mar 20 09:29:58.062011 master-0 kubenswrapper[18707]: I0320 09:29:58.061978 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.066312 master-0 kubenswrapper[18707]: I0320 09:29:58.066271 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a" Mar 20 09:29:58.084201 master-0 kubenswrapper[18707]: I0320 09:29:58.083734 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76dhl\" (UniqueName: \"kubernetes.io/projected/1c846643-c449-4aa6-b35f-590d37916080-kube-api-access-76dhl\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.084201 master-0 kubenswrapper[18707]: I0320 09:29:58.084040 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.084201 master-0 kubenswrapper[18707]: I0320 09:29:58.084102 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.084201 master-0 kubenswrapper[18707]: I0320 09:29:58.084133 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-inventory\") pod 
\"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.084621 master-0 kubenswrapper[18707]: I0320 09:29:58.084299 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.084621 master-0 kubenswrapper[18707]: I0320 09:29:58.084427 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.087298 master-0 kubenswrapper[18707]: I0320 09:29:58.086765 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9"] Mar 20 09:29:58.186094 master-0 kubenswrapper[18707]: I0320 09:29:58.186019 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.186426 master-0 kubenswrapper[18707]: I0320 09:29:58.186143 18707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.186426 master-0 kubenswrapper[18707]: I0320 09:29:58.186292 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76dhl\" (UniqueName: \"kubernetes.io/projected/1c846643-c449-4aa6-b35f-590d37916080-kube-api-access-76dhl\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.187105 master-0 kubenswrapper[18707]: I0320 09:29:58.187065 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.187236 master-0 kubenswrapper[18707]: I0320 09:29:58.187213 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.187309 master-0 kubenswrapper[18707]: I0320 09:29:58.187256 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-inventory\") pod 
\"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.190174 master-0 kubenswrapper[18707]: I0320 09:29:58.190068 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.191279 master-0 kubenswrapper[18707]: I0320 09:29:58.191229 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.191580 master-0 kubenswrapper[18707]: I0320 09:29:58.191448 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.198678 master-0 kubenswrapper[18707]: I0320 09:29:58.198601 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.201863 master-0 kubenswrapper[18707]: I0320 
09:29:58.201828 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.204889 master-0 kubenswrapper[18707]: I0320 09:29:58.204847 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76dhl\" (UniqueName: \"kubernetes.io/projected/1c846643-c449-4aa6-b35f-590d37916080-kube-api-access-76dhl\") pod \"neutron-metadata-dataplane-step-2-edpm-a-tk7z9\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.399958 master-0 kubenswrapper[18707]: I0320 09:29:58.399795 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:29:58.496772 master-0 kubenswrapper[18707]: I0320 09:29:58.496710 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" event={"ID":"cbf24ba7-9890-4461-b037-967fc9619197","Type":"ContainerStarted","Data":"2363119d83b5fd9d69014e9a1b23cd67b7146db3b6449a5480245c77bdb4f148"} Mar 20 09:29:58.518602 master-0 kubenswrapper[18707]: I0320 09:29:58.518515 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" podStartSLOduration=2.026911077 podStartE2EDuration="2.518487562s" podCreationTimestamp="2026-03-20 09:29:56 +0000 UTC" firstStartedPulling="2026-03-20 09:29:57.000423437 +0000 UTC m=+2942.156603813" lastFinishedPulling="2026-03-20 09:29:57.491999942 +0000 UTC m=+2942.648180298" observedRunningTime="2026-03-20 09:29:58.518353289 +0000 UTC 
m=+2943.674533645" watchObservedRunningTime="2026-03-20 09:29:58.518487562 +0000 UTC m=+2943.674667918" Mar 20 09:29:58.977856 master-0 kubenswrapper[18707]: W0320 09:29:58.977803 18707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c846643_c449_4aa6_b35f_590d37916080.slice/crio-0a813dc150a9f3c3b96350a2d38646a69575a2a3dccb88f9766944cb639aef29 WatchSource:0}: Error finding container 0a813dc150a9f3c3b96350a2d38646a69575a2a3dccb88f9766944cb639aef29: Status 404 returned error can't find the container with id 0a813dc150a9f3c3b96350a2d38646a69575a2a3dccb88f9766944cb639aef29 Mar 20 09:29:58.979831 master-0 kubenswrapper[18707]: I0320 09:29:58.979612 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9"] Mar 20 09:29:59.508417 master-0 kubenswrapper[18707]: I0320 09:29:59.508351 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" event={"ID":"1c846643-c449-4aa6-b35f-590d37916080","Type":"ContainerStarted","Data":"0a813dc150a9f3c3b96350a2d38646a69575a2a3dccb88f9766944cb639aef29"} Mar 20 09:30:00.537306 master-0 kubenswrapper[18707]: I0320 09:30:00.537147 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" event={"ID":"1c846643-c449-4aa6-b35f-590d37916080","Type":"ContainerStarted","Data":"d6f5e50e51a8b886dfde86ffa23b937577313a1750d4ffb7d6cd035a6bb6b221"} Mar 20 09:30:00.586012 master-0 kubenswrapper[18707]: I0320 09:30:00.585885 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" podStartSLOduration=2.000053761 podStartE2EDuration="2.585833994s" podCreationTimestamp="2026-03-20 09:29:58 +0000 UTC" firstStartedPulling="2026-03-20 09:29:58.981701301 +0000 UTC m=+2944.137881657" lastFinishedPulling="2026-03-20 
09:29:59.567481524 +0000 UTC m=+2944.723661890" observedRunningTime="2026-03-20 09:30:00.563814778 +0000 UTC m=+2945.719995134" watchObservedRunningTime="2026-03-20 09:30:00.585833994 +0000 UTC m=+2945.742014360" Mar 20 09:30:38.060701 master-0 kubenswrapper[18707]: I0320 09:30:38.060627 18707 generic.go:334] "Generic (PLEG): container finished" podID="cbf24ba7-9890-4461-b037-967fc9619197" containerID="2363119d83b5fd9d69014e9a1b23cd67b7146db3b6449a5480245c77bdb4f148" exitCode=2 Mar 20 09:30:38.060701 master-0 kubenswrapper[18707]: I0320 09:30:38.060688 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" event={"ID":"cbf24ba7-9890-4461-b037-967fc9619197","Type":"ContainerDied","Data":"2363119d83b5fd9d69014e9a1b23cd67b7146db3b6449a5480245c77bdb4f148"} Mar 20 09:30:39.684554 master-0 kubenswrapper[18707]: I0320 09:30:39.684506 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:30:39.780224 master-0 kubenswrapper[18707]: I0320 09:30:39.780140 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-ssh-key-edpm-b\") pod \"cbf24ba7-9890-4461-b037-967fc9619197\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " Mar 20 09:30:39.780469 master-0 kubenswrapper[18707]: I0320 09:30:39.780294 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-nova-metadata-neutron-config-0\") pod \"cbf24ba7-9890-4461-b037-967fc9619197\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " Mar 20 09:30:39.780469 master-0 kubenswrapper[18707]: I0320 09:30:39.780377 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-neutron-ovn-metadata-agent-neutron-config-0\") pod \"cbf24ba7-9890-4461-b037-967fc9619197\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " Mar 20 09:30:39.780689 master-0 kubenswrapper[18707]: I0320 09:30:39.780555 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-neutron-metadata-combined-ca-bundle\") pod \"cbf24ba7-9890-4461-b037-967fc9619197\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " Mar 20 09:30:39.780689 master-0 kubenswrapper[18707]: I0320 09:30:39.780638 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nccx\" (UniqueName: \"kubernetes.io/projected/cbf24ba7-9890-4461-b037-967fc9619197-kube-api-access-7nccx\") pod \"cbf24ba7-9890-4461-b037-967fc9619197\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " Mar 20 09:30:39.781278 master-0 kubenswrapper[18707]: I0320 09:30:39.780826 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-inventory\") pod \"cbf24ba7-9890-4461-b037-967fc9619197\" (UID: \"cbf24ba7-9890-4461-b037-967fc9619197\") " Mar 20 09:30:39.806318 master-0 kubenswrapper[18707]: I0320 09:30:39.804565 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "cbf24ba7-9890-4461-b037-967fc9619197" (UID: "cbf24ba7-9890-4461-b037-967fc9619197"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:30:39.806318 master-0 kubenswrapper[18707]: I0320 09:30:39.804684 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbf24ba7-9890-4461-b037-967fc9619197-kube-api-access-7nccx" (OuterVolumeSpecName: "kube-api-access-7nccx") pod "cbf24ba7-9890-4461-b037-967fc9619197" (UID: "cbf24ba7-9890-4461-b037-967fc9619197"). InnerVolumeSpecName "kube-api-access-7nccx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:30:39.812056 master-0 kubenswrapper[18707]: I0320 09:30:39.810330 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "cbf24ba7-9890-4461-b037-967fc9619197" (UID: "cbf24ba7-9890-4461-b037-967fc9619197"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:30:39.812056 master-0 kubenswrapper[18707]: I0320 09:30:39.811239 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "cbf24ba7-9890-4461-b037-967fc9619197" (UID: "cbf24ba7-9890-4461-b037-967fc9619197"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:30:39.812056 master-0 kubenswrapper[18707]: I0320 09:30:39.811993 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-inventory" (OuterVolumeSpecName: "inventory") pod "cbf24ba7-9890-4461-b037-967fc9619197" (UID: "cbf24ba7-9890-4461-b037-967fc9619197"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:30:39.819948 master-0 kubenswrapper[18707]: I0320 09:30:39.819878 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "cbf24ba7-9890-4461-b037-967fc9619197" (UID: "cbf24ba7-9890-4461-b037-967fc9619197"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:30:39.887232 master-0 kubenswrapper[18707]: I0320 09:30:39.884045 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:30:39.887232 master-0 kubenswrapper[18707]: I0320 09:30:39.884103 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:30:39.887232 master-0 kubenswrapper[18707]: I0320 09:30:39.884121 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:30:39.887232 master-0 kubenswrapper[18707]: I0320 09:30:39.884139 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:30:39.887232 master-0 kubenswrapper[18707]: I0320 09:30:39.884153 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nccx\" (UniqueName: 
\"kubernetes.io/projected/cbf24ba7-9890-4461-b037-967fc9619197-kube-api-access-7nccx\") on node \"master-0\" DevicePath \"\"" Mar 20 09:30:39.887232 master-0 kubenswrapper[18707]: I0320 09:30:39.884166 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbf24ba7-9890-4461-b037-967fc9619197-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:30:40.088022 master-0 kubenswrapper[18707]: I0320 09:30:40.087868 18707 generic.go:334] "Generic (PLEG): container finished" podID="1c846643-c449-4aa6-b35f-590d37916080" containerID="d6f5e50e51a8b886dfde86ffa23b937577313a1750d4ffb7d6cd035a6bb6b221" exitCode=2 Mar 20 09:30:40.088271 master-0 kubenswrapper[18707]: I0320 09:30:40.087990 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" event={"ID":"1c846643-c449-4aa6-b35f-590d37916080","Type":"ContainerDied","Data":"d6f5e50e51a8b886dfde86ffa23b937577313a1750d4ffb7d6cd035a6bb6b221"} Mar 20 09:30:40.090559 master-0 kubenswrapper[18707]: I0320 09:30:40.090522 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" event={"ID":"cbf24ba7-9890-4461-b037-967fc9619197","Type":"ContainerDied","Data":"29c403cc317c829515731e0a580994c322dfa333c0f08c81dd04cfccd0d95695"} Mar 20 09:30:40.090711 master-0 kubenswrapper[18707]: I0320 09:30:40.090668 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29c403cc317c829515731e0a580994c322dfa333c0f08c81dd04cfccd0d95695" Mar 20 09:30:40.090876 master-0 kubenswrapper[18707]: I0320 09:30:40.090566 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-fq5nx" Mar 20 09:30:41.617430 master-0 kubenswrapper[18707]: I0320 09:30:41.617199 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:30:41.729544 master-0 kubenswrapper[18707]: I0320 09:30:41.729445 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76dhl\" (UniqueName: \"kubernetes.io/projected/1c846643-c449-4aa6-b35f-590d37916080-kube-api-access-76dhl\") pod \"1c846643-c449-4aa6-b35f-590d37916080\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " Mar 20 09:30:41.729970 master-0 kubenswrapper[18707]: I0320 09:30:41.729944 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-neutron-ovn-metadata-agent-neutron-config-0\") pod \"1c846643-c449-4aa6-b35f-590d37916080\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " Mar 20 09:30:41.730216 master-0 kubenswrapper[18707]: I0320 09:30:41.730178 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-ssh-key-edpm-a\") pod \"1c846643-c449-4aa6-b35f-590d37916080\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " Mar 20 09:30:41.730501 master-0 kubenswrapper[18707]: I0320 09:30:41.730436 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-nova-metadata-neutron-config-0\") pod \"1c846643-c449-4aa6-b35f-590d37916080\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " Mar 20 09:30:41.730766 master-0 kubenswrapper[18707]: I0320 09:30:41.730747 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-inventory\") pod \"1c846643-c449-4aa6-b35f-590d37916080\" (UID: 
\"1c846643-c449-4aa6-b35f-590d37916080\") " Mar 20 09:30:41.730874 master-0 kubenswrapper[18707]: I0320 09:30:41.730858 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-neutron-metadata-combined-ca-bundle\") pod \"1c846643-c449-4aa6-b35f-590d37916080\" (UID: \"1c846643-c449-4aa6-b35f-590d37916080\") " Mar 20 09:30:41.733812 master-0 kubenswrapper[18707]: I0320 09:30:41.733731 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c846643-c449-4aa6-b35f-590d37916080-kube-api-access-76dhl" (OuterVolumeSpecName: "kube-api-access-76dhl") pod "1c846643-c449-4aa6-b35f-590d37916080" (UID: "1c846643-c449-4aa6-b35f-590d37916080"). InnerVolumeSpecName "kube-api-access-76dhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:30:41.734625 master-0 kubenswrapper[18707]: I0320 09:30:41.734573 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1c846643-c449-4aa6-b35f-590d37916080" (UID: "1c846643-c449-4aa6-b35f-590d37916080"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:30:41.764232 master-0 kubenswrapper[18707]: I0320 09:30:41.764054 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "1c846643-c449-4aa6-b35f-590d37916080" (UID: "1c846643-c449-4aa6-b35f-590d37916080"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:30:41.765211 master-0 kubenswrapper[18707]: I0320 09:30:41.765150 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "1c846643-c449-4aa6-b35f-590d37916080" (UID: "1c846643-c449-4aa6-b35f-590d37916080"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:30:41.765841 master-0 kubenswrapper[18707]: I0320 09:30:41.765754 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "1c846643-c449-4aa6-b35f-590d37916080" (UID: "1c846643-c449-4aa6-b35f-590d37916080"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:30:41.766078 master-0 kubenswrapper[18707]: I0320 09:30:41.766045 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-inventory" (OuterVolumeSpecName: "inventory") pod "1c846643-c449-4aa6-b35f-590d37916080" (UID: "1c846643-c449-4aa6-b35f-590d37916080"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:30:41.835008 master-0 kubenswrapper[18707]: I0320 09:30:41.834837 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:30:41.835008 master-0 kubenswrapper[18707]: I0320 09:30:41.834896 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:30:41.835008 master-0 kubenswrapper[18707]: I0320 09:30:41.834910 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:30:41.835008 master-0 kubenswrapper[18707]: I0320 09:30:41.834922 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:30:41.835008 master-0 kubenswrapper[18707]: I0320 09:30:41.834933 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c846643-c449-4aa6-b35f-590d37916080-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:30:41.835008 master-0 kubenswrapper[18707]: I0320 09:30:41.834943 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76dhl\" (UniqueName: \"kubernetes.io/projected/1c846643-c449-4aa6-b35f-590d37916080-kube-api-access-76dhl\") on node \"master-0\" DevicePath \"\"" Mar 20 09:30:42.128473 master-0 kubenswrapper[18707]: I0320 09:30:42.128410 18707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" event={"ID":"1c846643-c449-4aa6-b35f-590d37916080","Type":"ContainerDied","Data":"0a813dc150a9f3c3b96350a2d38646a69575a2a3dccb88f9766944cb639aef29"} Mar 20 09:30:42.128784 master-0 kubenswrapper[18707]: I0320 09:30:42.128763 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a813dc150a9f3c3b96350a2d38646a69575a2a3dccb88f9766944cb639aef29" Mar 20 09:30:42.128896 master-0 kubenswrapper[18707]: I0320 09:30:42.128504 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-tk7z9" Mar 20 09:33:17.065602 master-0 kubenswrapper[18707]: I0320 09:33:17.065543 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh"] Mar 20 09:33:17.066798 master-0 kubenswrapper[18707]: E0320 09:33:17.066341 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c846643-c449-4aa6-b35f-590d37916080" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 09:33:17.066798 master-0 kubenswrapper[18707]: I0320 09:33:17.066362 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c846643-c449-4aa6-b35f-590d37916080" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 09:33:17.066798 master-0 kubenswrapper[18707]: E0320 09:33:17.066380 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf24ba7-9890-4461-b037-967fc9619197" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 09:33:17.066798 master-0 kubenswrapper[18707]: I0320 09:33:17.066392 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf24ba7-9890-4461-b037-967fc9619197" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 09:33:17.066798 master-0 kubenswrapper[18707]: I0320 09:33:17.066647 18707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1c846643-c449-4aa6-b35f-590d37916080" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 09:33:17.066798 master-0 kubenswrapper[18707]: I0320 09:33:17.066708 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbf24ba7-9890-4461-b037-967fc9619197" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 09:33:17.067717 master-0 kubenswrapper[18707]: I0320 09:33:17.067646 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.105228 master-0 kubenswrapper[18707]: I0320 09:33:17.084479 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh"] Mar 20 09:33:17.108229 master-0 kubenswrapper[18707]: I0320 09:33:17.107841 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 20 09:33:17.108229 master-0 kubenswrapper[18707]: I0320 09:33:17.107960 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b" Mar 20 09:33:17.108229 master-0 kubenswrapper[18707]: I0320 09:33:17.108092 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 09:33:17.108466 master-0 kubenswrapper[18707]: I0320 09:33:17.108236 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 20 09:33:17.108968 master-0 kubenswrapper[18707]: I0320 09:33:17.108908 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 09:33:17.226256 master-0 kubenswrapper[18707]: I0320 09:33:17.226162 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.226481 master-0 kubenswrapper[18707]: I0320 09:33:17.226450 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.226848 master-0 kubenswrapper[18707]: I0320 09:33:17.226773 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbcnq\" (UniqueName: \"kubernetes.io/projected/4d52574c-2e3d-4b9d-b467-687a99955d47-kube-api-access-zbcnq\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.226848 master-0 kubenswrapper[18707]: I0320 09:33:17.226815 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.226975 master-0 kubenswrapper[18707]: I0320 09:33:17.226956 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-inventory\") pod 
\"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.229126 master-0 kubenswrapper[18707]: I0320 09:33:17.229079 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.332053 master-0 kubenswrapper[18707]: I0320 09:33:17.331542 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbcnq\" (UniqueName: \"kubernetes.io/projected/4d52574c-2e3d-4b9d-b467-687a99955d47-kube-api-access-zbcnq\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.332053 master-0 kubenswrapper[18707]: I0320 09:33:17.332000 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.332788 master-0 kubenswrapper[18707]: I0320 09:33:17.332756 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.332996 
master-0 kubenswrapper[18707]: I0320 09:33:17.332919 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.333280 master-0 kubenswrapper[18707]: I0320 09:33:17.333107 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.333385 master-0 kubenswrapper[18707]: I0320 09:33:17.333280 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.338049 master-0 kubenswrapper[18707]: I0320 09:33:17.337977 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.338246 master-0 kubenswrapper[18707]: I0320 09:33:17.338159 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.338881 master-0 kubenswrapper[18707]: I0320 09:33:17.338475 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.338881 master-0 kubenswrapper[18707]: I0320 09:33:17.338598 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.345329 master-0 kubenswrapper[18707]: I0320 09:33:17.345277 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.353718 master-0 kubenswrapper[18707]: I0320 09:33:17.353662 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbcnq\" (UniqueName: \"kubernetes.io/projected/4d52574c-2e3d-4b9d-b467-687a99955d47-kube-api-access-zbcnq\") pod 
\"neutron-metadata-dataplane-step-2-edpm-b-5g8zh\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:17.439443 master-0 kubenswrapper[18707]: I0320 09:33:17.439377 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:33:18.025205 master-0 kubenswrapper[18707]: I0320 09:33:18.025129 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh"] Mar 20 09:33:18.199904 master-0 kubenswrapper[18707]: I0320 09:33:18.199791 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" event={"ID":"4d52574c-2e3d-4b9d-b467-687a99955d47","Type":"ContainerStarted","Data":"ad79042b47b23f54f972bcda59e2e8d2f1d29fdb05a10728458eb6c31d121814"} Mar 20 09:33:19.076102 master-0 kubenswrapper[18707]: I0320 09:33:19.075547 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k"] Mar 20 09:33:19.077537 master-0 kubenswrapper[18707]: I0320 09:33:19.077496 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.081023 master-0 kubenswrapper[18707]: I0320 09:33:19.080973 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a" Mar 20 09:33:19.088215 master-0 kubenswrapper[18707]: I0320 09:33:19.088152 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k"] Mar 20 09:33:19.226432 master-0 kubenswrapper[18707]: I0320 09:33:19.226371 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" event={"ID":"4d52574c-2e3d-4b9d-b467-687a99955d47","Type":"ContainerStarted","Data":"c4bf75297aedf5844dee7d2000c6606a3c75e10345fcc8eb98e968a78a13ee93"} Mar 20 09:33:19.243632 master-0 kubenswrapper[18707]: I0320 09:33:19.243525 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" podStartSLOduration=1.593021313 podStartE2EDuration="2.243502436s" podCreationTimestamp="2026-03-20 09:33:17 +0000 UTC" firstStartedPulling="2026-03-20 09:33:18.028990469 +0000 UTC m=+3143.185170825" lastFinishedPulling="2026-03-20 09:33:18.679471572 +0000 UTC m=+3143.835651948" observedRunningTime="2026-03-20 09:33:19.242544409 +0000 UTC m=+3144.398724765" watchObservedRunningTime="2026-03-20 09:33:19.243502436 +0000 UTC m=+3144.399682792" Mar 20 09:33:19.281263 master-0 kubenswrapper[18707]: I0320 09:33:19.279570 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4blrz\" (UniqueName: \"kubernetes.io/projected/0e7018f9-065e-4556-8a68-b58fa9fe4402-kube-api-access-4blrz\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.281263 master-0 kubenswrapper[18707]: 
I0320 09:33:19.279701 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.281263 master-0 kubenswrapper[18707]: I0320 09:33:19.279769 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.281263 master-0 kubenswrapper[18707]: I0320 09:33:19.279849 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.281263 master-0 kubenswrapper[18707]: I0320 09:33:19.279877 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.281263 master-0 kubenswrapper[18707]: I0320 09:33:19.279900 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.382061 master-0 kubenswrapper[18707]: I0320 09:33:19.381910 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.382061 master-0 kubenswrapper[18707]: I0320 09:33:19.381986 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.383406 master-0 kubenswrapper[18707]: I0320 09:33:19.382129 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.383406 master-0 kubenswrapper[18707]: I0320 09:33:19.382174 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-inventory\") pod 
\"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.383406 master-0 kubenswrapper[18707]: I0320 09:33:19.382228 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.383406 master-0 kubenswrapper[18707]: I0320 09:33:19.382437 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4blrz\" (UniqueName: \"kubernetes.io/projected/0e7018f9-065e-4556-8a68-b58fa9fe4402-kube-api-access-4blrz\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.388555 master-0 kubenswrapper[18707]: I0320 09:33:19.387544 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.388555 master-0 kubenswrapper[18707]: I0320 09:33:19.388475 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " 
pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.390106 master-0 kubenswrapper[18707]: I0320 09:33:19.390048 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.392812 master-0 kubenswrapper[18707]: I0320 09:33:19.392620 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.405600 master-0 kubenswrapper[18707]: I0320 09:33:19.405546 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.406538 master-0 kubenswrapper[18707]: I0320 09:33:19.406463 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4blrz\" (UniqueName: \"kubernetes.io/projected/0e7018f9-065e-4556-8a68-b58fa9fe4402-kube-api-access-4blrz\") pod \"neutron-metadata-dataplane-step-2-edpm-a-8999k\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.407468 master-0 kubenswrapper[18707]: I0320 09:33:19.407427 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" Mar 20 09:33:19.993662 master-0 kubenswrapper[18707]: I0320 09:33:19.991059 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k"] Mar 20 09:33:20.237415 master-0 kubenswrapper[18707]: I0320 09:33:20.237284 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" event={"ID":"0e7018f9-065e-4556-8a68-b58fa9fe4402","Type":"ContainerStarted","Data":"e4fa23a3e77a658fa3378e7dc8f2f392de73aa81930e0a6ff93a46b81d00a9c6"} Mar 20 09:33:21.251768 master-0 kubenswrapper[18707]: I0320 09:33:21.251702 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" event={"ID":"0e7018f9-065e-4556-8a68-b58fa9fe4402","Type":"ContainerStarted","Data":"a34935b7d4513e8456ee712c4d3f1413861f125194f0b0dd2b59c47aebd22bbd"} Mar 20 09:33:21.273079 master-0 kubenswrapper[18707]: I0320 09:33:21.272999 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" podStartSLOduration=1.85995663 podStartE2EDuration="2.272728785s" podCreationTimestamp="2026-03-20 09:33:19 +0000 UTC" firstStartedPulling="2026-03-20 09:33:19.997494761 +0000 UTC m=+3145.153675117" lastFinishedPulling="2026-03-20 09:33:20.410266886 +0000 UTC m=+3145.566447272" observedRunningTime="2026-03-20 09:33:21.271309564 +0000 UTC m=+3146.427489920" watchObservedRunningTime="2026-03-20 09:33:21.272728785 +0000 UTC m=+3146.428909141" Mar 20 09:33:59.761425 master-0 kubenswrapper[18707]: I0320 09:33:59.761349 18707 generic.go:334] "Generic (PLEG): container finished" podID="4d52574c-2e3d-4b9d-b467-687a99955d47" containerID="c4bf75297aedf5844dee7d2000c6606a3c75e10345fcc8eb98e968a78a13ee93" exitCode=2 Mar 20 09:33:59.763255 master-0 kubenswrapper[18707]: I0320 09:33:59.761431 18707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" event={"ID":"4d52574c-2e3d-4b9d-b467-687a99955d47","Type":"ContainerDied","Data":"c4bf75297aedf5844dee7d2000c6606a3c75e10345fcc8eb98e968a78a13ee93"} Mar 20 09:34:00.776588 master-0 kubenswrapper[18707]: I0320 09:34:00.776517 18707 generic.go:334] "Generic (PLEG): container finished" podID="0e7018f9-065e-4556-8a68-b58fa9fe4402" containerID="a34935b7d4513e8456ee712c4d3f1413861f125194f0b0dd2b59c47aebd22bbd" exitCode=2 Mar 20 09:34:00.777223 master-0 kubenswrapper[18707]: I0320 09:34:00.776713 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" event={"ID":"0e7018f9-065e-4556-8a68-b58fa9fe4402","Type":"ContainerDied","Data":"a34935b7d4513e8456ee712c4d3f1413861f125194f0b0dd2b59c47aebd22bbd"} Mar 20 09:34:01.304414 master-0 kubenswrapper[18707]: I0320 09:34:01.304359 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" Mar 20 09:34:01.437037 master-0 kubenswrapper[18707]: I0320 09:34:01.436966 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-inventory\") pod \"4d52574c-2e3d-4b9d-b467-687a99955d47\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " Mar 20 09:34:01.437267 master-0 kubenswrapper[18707]: I0320 09:34:01.437057 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-neutron-ovn-metadata-agent-neutron-config-0\") pod \"4d52574c-2e3d-4b9d-b467-687a99955d47\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " Mar 20 09:34:01.437267 master-0 kubenswrapper[18707]: I0320 09:34:01.437133 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-neutron-metadata-combined-ca-bundle\") pod \"4d52574c-2e3d-4b9d-b467-687a99955d47\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " Mar 20 09:34:01.437497 master-0 kubenswrapper[18707]: I0320 09:34:01.437472 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbcnq\" (UniqueName: \"kubernetes.io/projected/4d52574c-2e3d-4b9d-b467-687a99955d47-kube-api-access-zbcnq\") pod \"4d52574c-2e3d-4b9d-b467-687a99955d47\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " Mar 20 09:34:01.437568 master-0 kubenswrapper[18707]: I0320 09:34:01.437547 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-nova-metadata-neutron-config-0\") pod 
\"4d52574c-2e3d-4b9d-b467-687a99955d47\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " Mar 20 09:34:01.437620 master-0 kubenswrapper[18707]: I0320 09:34:01.437567 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-ssh-key-edpm-b\") pod \"4d52574c-2e3d-4b9d-b467-687a99955d47\" (UID: \"4d52574c-2e3d-4b9d-b467-687a99955d47\") " Mar 20 09:34:01.441923 master-0 kubenswrapper[18707]: I0320 09:34:01.441839 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d52574c-2e3d-4b9d-b467-687a99955d47-kube-api-access-zbcnq" (OuterVolumeSpecName: "kube-api-access-zbcnq") pod "4d52574c-2e3d-4b9d-b467-687a99955d47" (UID: "4d52574c-2e3d-4b9d-b467-687a99955d47"). InnerVolumeSpecName "kube-api-access-zbcnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:34:01.443125 master-0 kubenswrapper[18707]: I0320 09:34:01.443051 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4d52574c-2e3d-4b9d-b467-687a99955d47" (UID: "4d52574c-2e3d-4b9d-b467-687a99955d47"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:34:01.462921 master-0 kubenswrapper[18707]: I0320 09:34:01.462800 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-inventory" (OuterVolumeSpecName: "inventory") pod "4d52574c-2e3d-4b9d-b467-687a99955d47" (UID: "4d52574c-2e3d-4b9d-b467-687a99955d47"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:34:01.463789 master-0 kubenswrapper[18707]: I0320 09:34:01.463167 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "4d52574c-2e3d-4b9d-b467-687a99955d47" (UID: "4d52574c-2e3d-4b9d-b467-687a99955d47"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:34:01.464102 master-0 kubenswrapper[18707]: I0320 09:34:01.464061 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "4d52574c-2e3d-4b9d-b467-687a99955d47" (UID: "4d52574c-2e3d-4b9d-b467-687a99955d47"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:34:01.465177 master-0 kubenswrapper[18707]: I0320 09:34:01.465143 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "4d52574c-2e3d-4b9d-b467-687a99955d47" (UID: "4d52574c-2e3d-4b9d-b467-687a99955d47"). InnerVolumeSpecName "ssh-key-edpm-b". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:34:01.540408 master-0 kubenswrapper[18707]: I0320 09:34:01.540339 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:34:01.540408 master-0 kubenswrapper[18707]: I0320 09:34:01.540398 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbcnq\" (UniqueName: \"kubernetes.io/projected/4d52574c-2e3d-4b9d-b467-687a99955d47-kube-api-access-zbcnq\") on node \"master-0\" DevicePath \"\""
Mar 20 09:34:01.540408 master-0 kubenswrapper[18707]: I0320 09:34:01.540419 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:34:01.540756 master-0 kubenswrapper[18707]: I0320 09:34:01.540436 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\""
Mar 20 09:34:01.540756 master-0 kubenswrapper[18707]: I0320 09:34:01.540453 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-inventory\") on node \"master-0\" DevicePath \"\""
Mar 20 09:34:01.540756 master-0 kubenswrapper[18707]: I0320 09:34:01.540463 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d52574c-2e3d-4b9d-b467-687a99955d47-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:34:01.789011 master-0 kubenswrapper[18707]: I0320 09:34:01.788789 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh"
Mar 20 09:34:01.790812 master-0 kubenswrapper[18707]: I0320 09:34:01.790480 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-5g8zh" event={"ID":"4d52574c-2e3d-4b9d-b467-687a99955d47","Type":"ContainerDied","Data":"ad79042b47b23f54f972bcda59e2e8d2f1d29fdb05a10728458eb6c31d121814"}
Mar 20 09:34:01.790812 master-0 kubenswrapper[18707]: I0320 09:34:01.790580 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad79042b47b23f54f972bcda59e2e8d2f1d29fdb05a10728458eb6c31d121814"
Mar 20 09:34:02.300409 master-0 kubenswrapper[18707]: I0320 09:34:02.300345 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k"
Mar 20 09:34:02.362227 master-0 kubenswrapper[18707]: I0320 09:34:02.361402 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-inventory\") pod \"0e7018f9-065e-4556-8a68-b58fa9fe4402\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") "
Mar 20 09:34:02.362227 master-0 kubenswrapper[18707]: I0320 09:34:02.361483 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-neutron-ovn-metadata-agent-neutron-config-0\") pod \"0e7018f9-065e-4556-8a68-b58fa9fe4402\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") "
Mar 20 09:34:02.362227 master-0 kubenswrapper[18707]: I0320 09:34:02.361604 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-nova-metadata-neutron-config-0\") pod \"0e7018f9-065e-4556-8a68-b58fa9fe4402\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") "
Mar 20 09:34:02.362227 master-0 kubenswrapper[18707]: I0320 09:34:02.361714 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4blrz\" (UniqueName: \"kubernetes.io/projected/0e7018f9-065e-4556-8a68-b58fa9fe4402-kube-api-access-4blrz\") pod \"0e7018f9-065e-4556-8a68-b58fa9fe4402\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") "
Mar 20 09:34:02.362227 master-0 kubenswrapper[18707]: I0320 09:34:02.361744 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-neutron-metadata-combined-ca-bundle\") pod \"0e7018f9-065e-4556-8a68-b58fa9fe4402\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") "
Mar 20 09:34:02.362227 master-0 kubenswrapper[18707]: I0320 09:34:02.361858 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-ssh-key-edpm-a\") pod \"0e7018f9-065e-4556-8a68-b58fa9fe4402\" (UID: \"0e7018f9-065e-4556-8a68-b58fa9fe4402\") "
Mar 20 09:34:02.366969 master-0 kubenswrapper[18707]: I0320 09:34:02.366243 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0e7018f9-065e-4556-8a68-b58fa9fe4402" (UID: "0e7018f9-065e-4556-8a68-b58fa9fe4402"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:34:02.385487 master-0 kubenswrapper[18707]: I0320 09:34:02.385415 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7018f9-065e-4556-8a68-b58fa9fe4402-kube-api-access-4blrz" (OuterVolumeSpecName: "kube-api-access-4blrz") pod "0e7018f9-065e-4556-8a68-b58fa9fe4402" (UID: "0e7018f9-065e-4556-8a68-b58fa9fe4402"). InnerVolumeSpecName "kube-api-access-4blrz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:34:02.396789 master-0 kubenswrapper[18707]: I0320 09:34:02.396720 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "0e7018f9-065e-4556-8a68-b58fa9fe4402" (UID: "0e7018f9-065e-4556-8a68-b58fa9fe4402"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:34:02.398178 master-0 kubenswrapper[18707]: I0320 09:34:02.398123 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "0e7018f9-065e-4556-8a68-b58fa9fe4402" (UID: "0e7018f9-065e-4556-8a68-b58fa9fe4402"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:34:02.398745 master-0 kubenswrapper[18707]: I0320 09:34:02.398705 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "0e7018f9-065e-4556-8a68-b58fa9fe4402" (UID: "0e7018f9-065e-4556-8a68-b58fa9fe4402"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:34:02.399164 master-0 kubenswrapper[18707]: I0320 09:34:02.399124 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-inventory" (OuterVolumeSpecName: "inventory") pod "0e7018f9-065e-4556-8a68-b58fa9fe4402" (UID: "0e7018f9-065e-4556-8a68-b58fa9fe4402"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:34:02.465341 master-0 kubenswrapper[18707]: I0320 09:34:02.465145 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\""
Mar 20 09:34:02.465341 master-0 kubenswrapper[18707]: I0320 09:34:02.465252 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-inventory\") on node \"master-0\" DevicePath \"\""
Mar 20 09:34:02.465341 master-0 kubenswrapper[18707]: I0320 09:34:02.465291 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:34:02.465341 master-0 kubenswrapper[18707]: I0320 09:34:02.465317 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\""
Mar 20 09:34:02.465341 master-0 kubenswrapper[18707]: I0320 09:34:02.465343 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4blrz\" (UniqueName: \"kubernetes.io/projected/0e7018f9-065e-4556-8a68-b58fa9fe4402-kube-api-access-4blrz\") on node \"master-0\" DevicePath \"\""
Mar 20 09:34:02.465783 master-0 kubenswrapper[18707]: I0320 09:34:02.465363 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7018f9-065e-4556-8a68-b58fa9fe4402-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 09:34:02.803023 master-0 kubenswrapper[18707]: I0320 09:34:02.801741 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k" event={"ID":"0e7018f9-065e-4556-8a68-b58fa9fe4402","Type":"ContainerDied","Data":"e4fa23a3e77a658fa3378e7dc8f2f392de73aa81930e0a6ff93a46b81d00a9c6"}
Mar 20 09:34:02.803023 master-0 kubenswrapper[18707]: I0320 09:34:02.801807 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4fa23a3e77a658fa3378e7dc8f2f392de73aa81930e0a6ff93a46b81d00a9c6"
Mar 20 09:34:02.803023 master-0 kubenswrapper[18707]: I0320 09:34:02.801804 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-8999k"
Mar 20 09:39:18.046210 master-0 kubenswrapper[18707]: I0320 09:39:18.045751 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"]
Mar 20 09:39:18.046888 master-0 kubenswrapper[18707]: E0320 09:39:18.046872 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7018f9-065e-4556-8a68-b58fa9fe4402" containerName="neutron-metadata-dataplane-step-2-edpm-a"
Mar 20 09:39:18.046940 master-0 kubenswrapper[18707]: I0320 09:39:18.046896 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7018f9-065e-4556-8a68-b58fa9fe4402" containerName="neutron-metadata-dataplane-step-2-edpm-a"
Mar 20 09:39:18.047001 master-0 kubenswrapper[18707]: E0320 09:39:18.046975 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d52574c-2e3d-4b9d-b467-687a99955d47" containerName="neutron-metadata-dataplane-step-2-edpm-b"
Mar 20 09:39:18.047001 master-0 kubenswrapper[18707]: I0320 09:39:18.046988 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d52574c-2e3d-4b9d-b467-687a99955d47" containerName="neutron-metadata-dataplane-step-2-edpm-b"
Mar 20 09:39:18.050210 master-0 kubenswrapper[18707]: I0320 09:39:18.047571 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d52574c-2e3d-4b9d-b467-687a99955d47" containerName="neutron-metadata-dataplane-step-2-edpm-b"
Mar 20 09:39:18.050210 master-0 kubenswrapper[18707]: I0320 09:39:18.047625 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7018f9-065e-4556-8a68-b58fa9fe4402" containerName="neutron-metadata-dataplane-step-2-edpm-a"
Mar 20 09:39:18.050210 master-0 kubenswrapper[18707]: I0320 09:39:18.049061 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.055204 master-0 kubenswrapper[18707]: I0320 09:39:18.052957 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Mar 20 09:39:18.055204 master-0 kubenswrapper[18707]: I0320 09:39:18.053204 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-b"
Mar 20 09:39:18.055204 master-0 kubenswrapper[18707]: I0320 09:39:18.053316 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Mar 20 09:39:18.055204 master-0 kubenswrapper[18707]: I0320 09:39:18.053368 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 09:39:18.055204 master-0 kubenswrapper[18707]: I0320 09:39:18.053331 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 09:39:18.081226 master-0 kubenswrapper[18707]: I0320 09:39:18.079304 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"]
Mar 20 09:39:18.216460 master-0 kubenswrapper[18707]: I0320 09:39:18.216289 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl7n9\" (UniqueName: \"kubernetes.io/projected/e473a96e-f7df-4fc4-855f-82ff7126028a-kube-api-access-wl7n9\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.216738 master-0 kubenswrapper[18707]: I0320 09:39:18.216526 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.216738 master-0 kubenswrapper[18707]: I0320 09:39:18.216662 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.217081 master-0 kubenswrapper[18707]: I0320 09:39:18.217014 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.217261 master-0 kubenswrapper[18707]: I0320 09:39:18.217223 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.217361 master-0 kubenswrapper[18707]: I0320 09:39:18.217280 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.319530 master-0 kubenswrapper[18707]: I0320 09:39:18.319351 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.319530 master-0 kubenswrapper[18707]: I0320 09:39:18.319443 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.319530 master-0 kubenswrapper[18707]: I0320 09:39:18.319531 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.319928 master-0 kubenswrapper[18707]: I0320 09:39:18.319593 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.319928 master-0 kubenswrapper[18707]: I0320 09:39:18.319616 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.319928 master-0 kubenswrapper[18707]: I0320 09:39:18.319749 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl7n9\" (UniqueName: \"kubernetes.io/projected/e473a96e-f7df-4fc4-855f-82ff7126028a-kube-api-access-wl7n9\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.323681 master-0 kubenswrapper[18707]: I0320 09:39:18.323627 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.323681 master-0 kubenswrapper[18707]: I0320 09:39:18.323655 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-ssh-key-edpm-b\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.324406 master-0 kubenswrapper[18707]: I0320 09:39:18.324367 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.324855 master-0 kubenswrapper[18707]: I0320 09:39:18.324803 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.325302 master-0 kubenswrapper[18707]: I0320 09:39:18.325271 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.339655 master-0 kubenswrapper[18707]: I0320 09:39:18.339596 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl7n9\" (UniqueName: \"kubernetes.io/projected/e473a96e-f7df-4fc4-855f-82ff7126028a-kube-api-access-wl7n9\") pod \"neutron-metadata-dataplane-step-2-edpm-b-nzczt\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.380988 master-0 kubenswrapper[18707]: I0320 09:39:18.380932 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:39:18.914690 master-0 kubenswrapper[18707]: I0320 09:39:18.914636 18707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 09:39:18.928348 master-0 kubenswrapper[18707]: I0320 09:39:18.928246 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"]
Mar 20 09:39:19.909036 master-0 kubenswrapper[18707]: I0320 09:39:19.908976 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt" event={"ID":"e473a96e-f7df-4fc4-855f-82ff7126028a","Type":"ContainerStarted","Data":"81970bc0b2e949d5640c812b307c007e379647557e1663caecd0381ce93ee357"}
Mar 20 09:39:20.039547 master-0 kubenswrapper[18707]: I0320 09:39:20.039472 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"]
Mar 20 09:39:20.042225 master-0 kubenswrapper[18707]: I0320 09:39:20.042144 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.049645 master-0 kubenswrapper[18707]: I0320 09:39:20.049593 18707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm-a"
Mar 20 09:39:20.083313 master-0 kubenswrapper[18707]: I0320 09:39:20.083223 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"]
Mar 20 09:39:20.188403 master-0 kubenswrapper[18707]: I0320 09:39:20.187326 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.188403 master-0 kubenswrapper[18707]: I0320 09:39:20.187477 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8jrj\" (UniqueName: \"kubernetes.io/projected/eedca8c3-095d-40e2-922d-d2cb02edc346-kube-api-access-t8jrj\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.188403 master-0 kubenswrapper[18707]: I0320 09:39:20.187513 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.188403 master-0 kubenswrapper[18707]: I0320 09:39:20.187531 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.188403 master-0 kubenswrapper[18707]: I0320 09:39:20.187610 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.188403 master-0 kubenswrapper[18707]: I0320 09:39:20.187671 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.289885 master-0 kubenswrapper[18707]: I0320 09:39:20.289820 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.290229 master-0 kubenswrapper[18707]: I0320 09:39:20.290169 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8jrj\" (UniqueName: \"kubernetes.io/projected/eedca8c3-095d-40e2-922d-d2cb02edc346-kube-api-access-t8jrj\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.290317 master-0 kubenswrapper[18707]: I0320 09:39:20.290250 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.290317 master-0 kubenswrapper[18707]: I0320 09:39:20.290281 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.290432 master-0 kubenswrapper[18707]: I0320 09:39:20.290409 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.290509 master-0 kubenswrapper[18707]: I0320 09:39:20.290486 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.294126 master-0 kubenswrapper[18707]: I0320 09:39:20.294088 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.294927 master-0 kubenswrapper[18707]: I0320 09:39:20.294813 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.304012 master-0 kubenswrapper[18707]: I0320 09:39:20.295753 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.304012 master-0 kubenswrapper[18707]: I0320 09:39:20.295764 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-inventory\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.305126 master-0 kubenswrapper[18707]: I0320 09:39:20.304587 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-ssh-key-edpm-a\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.308325 master-0 kubenswrapper[18707]: I0320 09:39:20.308307 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8jrj\" (UniqueName: \"kubernetes.io/projected/eedca8c3-095d-40e2-922d-d2cb02edc346-kube-api-access-t8jrj\") pod \"neutron-metadata-dataplane-step-2-edpm-a-9lz8x\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.388092 master-0 kubenswrapper[18707]: I0320 09:39:20.384114 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"
Mar 20 09:39:20.935151 master-0 kubenswrapper[18707]: I0320 09:39:20.935023 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt" event={"ID":"e473a96e-f7df-4fc4-855f-82ff7126028a","Type":"ContainerStarted","Data":"d11cff8ac78744355ec020db14b1652eaa77cc5ff188bf6d3a2a0b43e6114a93"}
Mar 20 09:39:20.970557 master-0 kubenswrapper[18707]: I0320 09:39:20.970495 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x"]
Mar 20 09:39:20.971585 master-0 kubenswrapper[18707]: I0320 09:39:20.971525 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt" podStartSLOduration=2.264711789 podStartE2EDuration="2.971506151s" podCreationTimestamp="2026-03-20 09:39:18 +0000 UTC" firstStartedPulling="2026-03-20 09:39:18.914567906 +0000 UTC m=+3504.070748262" lastFinishedPulling="2026-03-20 09:39:19.621362268 +0000 UTC m=+3504.777542624" observedRunningTime="2026-03-20 09:39:20.959445258 +0000 UTC m=+3506.115625634" watchObservedRunningTime="2026-03-20 09:39:20.971506151 +0000 UTC m=+3506.127686507"
Mar 20 09:39:21.954557 master-0 kubenswrapper[18707]: I0320 09:39:21.954394 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x" event={"ID":"eedca8c3-095d-40e2-922d-d2cb02edc346","Type":"ContainerStarted","Data":"3c6f8595588760372ac77b0af9b4e25750a2209ec10cd8c1c2d3a2e17c4741fe"}
Mar 20 09:39:22.974071 master-0 kubenswrapper[18707]: I0320 09:39:22.973923 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x" event={"ID":"eedca8c3-095d-40e2-922d-d2cb02edc346","Type":"ContainerStarted","Data":"1706e54745c8531f5df075a651250632a4758764711b9333673ae278566aa3e5"}
Mar 20 09:39:23.005565 master-0 kubenswrapper[18707]: I0320 09:39:23.000554 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x" podStartSLOduration=1.544741087 podStartE2EDuration="3.000535144s" podCreationTimestamp="2026-03-20 09:39:20 +0000 UTC" firstStartedPulling="2026-03-20 09:39:20.98730709 +0000 UTC m=+3506.143487446" lastFinishedPulling="2026-03-20 09:39:22.443101147 +0000 UTC m=+3507.599281503" observedRunningTime="2026-03-20 09:39:22.99863678 +0000 UTC m=+3508.154817136" watchObservedRunningTime="2026-03-20 09:39:23.000535144 +0000 UTC m=+3508.156715500"
Mar 20 09:40:00.495405 master-0 kubenswrapper[18707]: I0320 09:40:00.495224 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt" event={"ID":"e473a96e-f7df-4fc4-855f-82ff7126028a","Type":"ContainerDied","Data":"d11cff8ac78744355ec020db14b1652eaa77cc5ff188bf6d3a2a0b43e6114a93"}
Mar 20 09:40:00.495405 master-0 kubenswrapper[18707]: I0320 09:40:00.495225 18707 generic.go:334] "Generic (PLEG): container finished" podID="e473a96e-f7df-4fc4-855f-82ff7126028a" containerID="d11cff8ac78744355ec020db14b1652eaa77cc5ff188bf6d3a2a0b43e6114a93" exitCode=2
Mar 20 09:40:02.054168 master-0 kubenswrapper[18707]: I0320 09:40:02.054088 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt"
Mar 20 09:40:02.121872 master-0 kubenswrapper[18707]: I0320 09:40:02.121743 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-neutron-metadata-combined-ca-bundle\") pod \"e473a96e-f7df-4fc4-855f-82ff7126028a\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") "
Mar 20 09:40:02.122107 master-0 kubenswrapper[18707]: I0320 09:40:02.121911 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e473a96e-f7df-4fc4-855f-82ff7126028a\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") "
Mar 20 09:40:02.124252 master-0 kubenswrapper[18707]: I0320 09:40:02.122220 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-ssh-key-edpm-b\") pod \"e473a96e-f7df-4fc4-855f-82ff7126028a\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") "
Mar 20 09:40:02.124252 master-0 kubenswrapper[18707]: I0320 09:40:02.122326 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-inventory\") pod \"e473a96e-f7df-4fc4-855f-82ff7126028a\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") "
Mar 20 09:40:02.124252 master-0 kubenswrapper[18707]: I0320 09:40:02.122361 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-nova-metadata-neutron-config-0\") pod \"e473a96e-f7df-4fc4-855f-82ff7126028a\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") "
Mar 20 09:40:02.124252 master-0 kubenswrapper[18707]: I0320 09:40:02.122476 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl7n9\" (UniqueName: \"kubernetes.io/projected/e473a96e-f7df-4fc4-855f-82ff7126028a-kube-api-access-wl7n9\") pod \"e473a96e-f7df-4fc4-855f-82ff7126028a\" (UID: \"e473a96e-f7df-4fc4-855f-82ff7126028a\") "
Mar 20 09:40:02.126211 master-0 kubenswrapper[18707]: I0320 09:40:02.126120 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e473a96e-f7df-4fc4-855f-82ff7126028a" (UID: "e473a96e-f7df-4fc4-855f-82ff7126028a"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:40:02.130855 master-0 kubenswrapper[18707]: I0320 09:40:02.130803 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e473a96e-f7df-4fc4-855f-82ff7126028a-kube-api-access-wl7n9" (OuterVolumeSpecName: "kube-api-access-wl7n9") pod "e473a96e-f7df-4fc4-855f-82ff7126028a" (UID: "e473a96e-f7df-4fc4-855f-82ff7126028a"). InnerVolumeSpecName "kube-api-access-wl7n9".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:40:02.159436 master-0 kubenswrapper[18707]: I0320 09:40:02.159374 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e473a96e-f7df-4fc4-855f-82ff7126028a" (UID: "e473a96e-f7df-4fc4-855f-82ff7126028a"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:40:02.161142 master-0 kubenswrapper[18707]: I0320 09:40:02.161095 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-ssh-key-edpm-b" (OuterVolumeSpecName: "ssh-key-edpm-b") pod "e473a96e-f7df-4fc4-855f-82ff7126028a" (UID: "e473a96e-f7df-4fc4-855f-82ff7126028a"). InnerVolumeSpecName "ssh-key-edpm-b". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:40:02.179377 master-0 kubenswrapper[18707]: I0320 09:40:02.179314 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e473a96e-f7df-4fc4-855f-82ff7126028a" (UID: "e473a96e-f7df-4fc4-855f-82ff7126028a"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:40:02.179556 master-0 kubenswrapper[18707]: I0320 09:40:02.179363 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-inventory" (OuterVolumeSpecName: "inventory") pod "e473a96e-f7df-4fc4-855f-82ff7126028a" (UID: "e473a96e-f7df-4fc4-855f-82ff7126028a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:40:02.226665 master-0 kubenswrapper[18707]: I0320 09:40:02.226435 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:40:02.226665 master-0 kubenswrapper[18707]: I0320 09:40:02.226487 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-b\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-ssh-key-edpm-b\") on node \"master-0\" DevicePath \"\"" Mar 20 09:40:02.226665 master-0 kubenswrapper[18707]: I0320 09:40:02.226499 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:40:02.226665 master-0 kubenswrapper[18707]: I0320 09:40:02.226509 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:40:02.226665 master-0 kubenswrapper[18707]: I0320 09:40:02.226524 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl7n9\" (UniqueName: \"kubernetes.io/projected/e473a96e-f7df-4fc4-855f-82ff7126028a-kube-api-access-wl7n9\") on node \"master-0\" DevicePath \"\"" Mar 20 09:40:02.226665 master-0 kubenswrapper[18707]: I0320 09:40:02.226536 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e473a96e-f7df-4fc4-855f-82ff7126028a-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:40:02.523277 master-0 kubenswrapper[18707]: I0320 09:40:02.523091 18707 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt" Mar 20 09:40:02.523475 master-0 kubenswrapper[18707]: I0320 09:40:02.523092 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-b-nzczt" event={"ID":"e473a96e-f7df-4fc4-855f-82ff7126028a","Type":"ContainerDied","Data":"81970bc0b2e949d5640c812b307c007e379647557e1663caecd0381ce93ee357"} Mar 20 09:40:02.523475 master-0 kubenswrapper[18707]: I0320 09:40:02.523392 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81970bc0b2e949d5640c812b307c007e379647557e1663caecd0381ce93ee357" Mar 20 09:40:02.525365 master-0 kubenswrapper[18707]: I0320 09:40:02.525322 18707 generic.go:334] "Generic (PLEG): container finished" podID="eedca8c3-095d-40e2-922d-d2cb02edc346" containerID="1706e54745c8531f5df075a651250632a4758764711b9333673ae278566aa3e5" exitCode=2 Mar 20 09:40:02.525447 master-0 kubenswrapper[18707]: I0320 09:40:02.525368 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x" event={"ID":"eedca8c3-095d-40e2-922d-d2cb02edc346","Type":"ContainerDied","Data":"1706e54745c8531f5df075a651250632a4758764711b9333673ae278566aa3e5"} Mar 20 09:40:04.066972 master-0 kubenswrapper[18707]: I0320 09:40:04.066912 18707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x" Mar 20 09:40:04.180176 master-0 kubenswrapper[18707]: I0320 09:40:04.180087 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8jrj\" (UniqueName: \"kubernetes.io/projected/eedca8c3-095d-40e2-922d-d2cb02edc346-kube-api-access-t8jrj\") pod \"eedca8c3-095d-40e2-922d-d2cb02edc346\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " Mar 20 09:40:04.180434 master-0 kubenswrapper[18707]: I0320 09:40:04.180245 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-nova-metadata-neutron-config-0\") pod \"eedca8c3-095d-40e2-922d-d2cb02edc346\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " Mar 20 09:40:04.180434 master-0 kubenswrapper[18707]: I0320 09:40:04.180317 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-inventory\") pod \"eedca8c3-095d-40e2-922d-d2cb02edc346\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " Mar 20 09:40:04.181243 master-0 kubenswrapper[18707]: I0320 09:40:04.181194 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-neutron-ovn-metadata-agent-neutron-config-0\") pod \"eedca8c3-095d-40e2-922d-d2cb02edc346\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " Mar 20 09:40:04.181432 master-0 kubenswrapper[18707]: I0320 09:40:04.181398 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-ssh-key-edpm-a\") pod \"eedca8c3-095d-40e2-922d-d2cb02edc346\" (UID: 
\"eedca8c3-095d-40e2-922d-d2cb02edc346\") " Mar 20 09:40:04.181506 master-0 kubenswrapper[18707]: I0320 09:40:04.181483 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-neutron-metadata-combined-ca-bundle\") pod \"eedca8c3-095d-40e2-922d-d2cb02edc346\" (UID: \"eedca8c3-095d-40e2-922d-d2cb02edc346\") " Mar 20 09:40:04.185242 master-0 kubenswrapper[18707]: I0320 09:40:04.185138 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eedca8c3-095d-40e2-922d-d2cb02edc346-kube-api-access-t8jrj" (OuterVolumeSpecName: "kube-api-access-t8jrj") pod "eedca8c3-095d-40e2-922d-d2cb02edc346" (UID: "eedca8c3-095d-40e2-922d-d2cb02edc346"). InnerVolumeSpecName "kube-api-access-t8jrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:40:04.187955 master-0 kubenswrapper[18707]: I0320 09:40:04.187902 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "eedca8c3-095d-40e2-922d-d2cb02edc346" (UID: "eedca8c3-095d-40e2-922d-d2cb02edc346"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:40:04.213062 master-0 kubenswrapper[18707]: I0320 09:40:04.212824 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-inventory" (OuterVolumeSpecName: "inventory") pod "eedca8c3-095d-40e2-922d-d2cb02edc346" (UID: "eedca8c3-095d-40e2-922d-d2cb02edc346"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:40:04.213688 master-0 kubenswrapper[18707]: I0320 09:40:04.213614 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-ssh-key-edpm-a" (OuterVolumeSpecName: "ssh-key-edpm-a") pod "eedca8c3-095d-40e2-922d-d2cb02edc346" (UID: "eedca8c3-095d-40e2-922d-d2cb02edc346"). InnerVolumeSpecName "ssh-key-edpm-a". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:40:04.215909 master-0 kubenswrapper[18707]: I0320 09:40:04.215869 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "eedca8c3-095d-40e2-922d-d2cb02edc346" (UID: "eedca8c3-095d-40e2-922d-d2cb02edc346"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:40:04.216828 master-0 kubenswrapper[18707]: I0320 09:40:04.216726 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "eedca8c3-095d-40e2-922d-d2cb02edc346" (UID: "eedca8c3-095d-40e2-922d-d2cb02edc346"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:40:04.290431 master-0 kubenswrapper[18707]: I0320 09:40:04.290361 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:40:04.290431 master-0 kubenswrapper[18707]: I0320 09:40:04.290419 18707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-a\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-ssh-key-edpm-a\") on node \"master-0\" DevicePath \"\"" Mar 20 09:40:04.290431 master-0 kubenswrapper[18707]: I0320 09:40:04.290436 18707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 20 09:40:04.290752 master-0 kubenswrapper[18707]: I0320 09:40:04.290453 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8jrj\" (UniqueName: \"kubernetes.io/projected/eedca8c3-095d-40e2-922d-d2cb02edc346-kube-api-access-t8jrj\") on node \"master-0\" DevicePath \"\"" Mar 20 09:40:04.290752 master-0 kubenswrapper[18707]: I0320 09:40:04.290469 18707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Mar 20 09:40:04.290752 master-0 kubenswrapper[18707]: I0320 09:40:04.290484 18707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eedca8c3-095d-40e2-922d-d2cb02edc346-inventory\") on node \"master-0\" DevicePath \"\"" Mar 20 09:40:04.552389 master-0 kubenswrapper[18707]: I0320 09:40:04.552285 18707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x" event={"ID":"eedca8c3-095d-40e2-922d-d2cb02edc346","Type":"ContainerDied","Data":"3c6f8595588760372ac77b0af9b4e25750a2209ec10cd8c1c2d3a2e17c4741fe"} Mar 20 09:40:04.552389 master-0 kubenswrapper[18707]: I0320 09:40:04.552353 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c6f8595588760372ac77b0af9b4e25750a2209ec10cd8c1c2d3a2e17c4741fe" Mar 20 09:40:04.552389 master-0 kubenswrapper[18707]: I0320 09:40:04.552380 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-step-2-edpm-a-9lz8x" Mar 20 10:01:00.263056 master-0 kubenswrapper[18707]: I0320 10:01:00.262986 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29566681-szrxn"] Mar 20 10:01:00.264915 master-0 kubenswrapper[18707]: E0320 10:01:00.263787 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e473a96e-f7df-4fc4-855f-82ff7126028a" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 10:01:00.264915 master-0 kubenswrapper[18707]: I0320 10:01:00.263813 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e473a96e-f7df-4fc4-855f-82ff7126028a" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 10:01:00.264915 master-0 kubenswrapper[18707]: E0320 10:01:00.263866 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedca8c3-095d-40e2-922d-d2cb02edc346" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 10:01:00.264915 master-0 kubenswrapper[18707]: I0320 10:01:00.263876 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eedca8c3-095d-40e2-922d-d2cb02edc346" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 10:01:00.264915 master-0 kubenswrapper[18707]: I0320 10:01:00.264248 18707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eedca8c3-095d-40e2-922d-d2cb02edc346" containerName="neutron-metadata-dataplane-step-2-edpm-a" Mar 20 10:01:00.264915 master-0 kubenswrapper[18707]: I0320 10:01:00.264290 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e473a96e-f7df-4fc4-855f-82ff7126028a" containerName="neutron-metadata-dataplane-step-2-edpm-b" Mar 20 10:01:00.265461 master-0 kubenswrapper[18707]: I0320 10:01:00.265433 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:00.288213 master-0 kubenswrapper[18707]: I0320 10:01:00.288126 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566681-szrxn"] Mar 20 10:01:00.372258 master-0 kubenswrapper[18707]: I0320 10:01:00.371260 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-fernet-keys\") pod \"keystone-cron-29566681-szrxn\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:00.372258 master-0 kubenswrapper[18707]: I0320 10:01:00.371752 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-config-data\") pod \"keystone-cron-29566681-szrxn\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:00.372258 master-0 kubenswrapper[18707]: I0320 10:01:00.372039 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb6qh\" (UniqueName: \"kubernetes.io/projected/9916321f-96d9-474c-ae20-6fb34f4b79c1-kube-api-access-cb6qh\") pod \"keystone-cron-29566681-szrxn\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " pod="openstack/keystone-cron-29566681-szrxn" Mar 20 
10:01:00.372681 master-0 kubenswrapper[18707]: I0320 10:01:00.372338 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-combined-ca-bundle\") pod \"keystone-cron-29566681-szrxn\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:00.475257 master-0 kubenswrapper[18707]: I0320 10:01:00.475149 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb6qh\" (UniqueName: \"kubernetes.io/projected/9916321f-96d9-474c-ae20-6fb34f4b79c1-kube-api-access-cb6qh\") pod \"keystone-cron-29566681-szrxn\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:00.475257 master-0 kubenswrapper[18707]: I0320 10:01:00.475277 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-combined-ca-bundle\") pod \"keystone-cron-29566681-szrxn\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:00.475736 master-0 kubenswrapper[18707]: I0320 10:01:00.475466 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-fernet-keys\") pod \"keystone-cron-29566681-szrxn\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:00.475736 master-0 kubenswrapper[18707]: I0320 10:01:00.475578 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-config-data\") pod \"keystone-cron-29566681-szrxn\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " 
pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:00.493315 master-0 kubenswrapper[18707]: I0320 10:01:00.482849 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-config-data\") pod \"keystone-cron-29566681-szrxn\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:00.493315 master-0 kubenswrapper[18707]: I0320 10:01:00.482953 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-fernet-keys\") pod \"keystone-cron-29566681-szrxn\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:00.493315 master-0 kubenswrapper[18707]: I0320 10:01:00.483020 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-combined-ca-bundle\") pod \"keystone-cron-29566681-szrxn\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:00.496265 master-0 kubenswrapper[18707]: I0320 10:01:00.496207 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb6qh\" (UniqueName: \"kubernetes.io/projected/9916321f-96d9-474c-ae20-6fb34f4b79c1-kube-api-access-cb6qh\") pod \"keystone-cron-29566681-szrxn\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:00.586693 master-0 kubenswrapper[18707]: I0320 10:01:00.586625 18707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:01.089210 master-0 kubenswrapper[18707]: I0320 10:01:01.086527 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566681-szrxn"] Mar 20 10:01:01.306670 master-0 kubenswrapper[18707]: I0320 10:01:01.306596 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566681-szrxn" event={"ID":"9916321f-96d9-474c-ae20-6fb34f4b79c1","Type":"ContainerStarted","Data":"f44ba1ad6840be9ef4bd592d23b193be737eaecf50d102a7381b174be63e957d"} Mar 20 10:01:01.306670 master-0 kubenswrapper[18707]: I0320 10:01:01.306665 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566681-szrxn" event={"ID":"9916321f-96d9-474c-ae20-6fb34f4b79c1","Type":"ContainerStarted","Data":"e9fb005b56c197e6903d12f151b7b653b1673b6602a447237454dfd6adb2ce67"} Mar 20 10:01:01.329402 master-0 kubenswrapper[18707]: I0320 10:01:01.329219 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29566681-szrxn" podStartSLOduration=1.329175184 podStartE2EDuration="1.329175184s" podCreationTimestamp="2026-03-20 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:01:01.323970706 +0000 UTC m=+4806.480151062" watchObservedRunningTime="2026-03-20 10:01:01.329175184 +0000 UTC m=+4806.485355540" Mar 20 10:01:05.355310 master-0 kubenswrapper[18707]: I0320 10:01:05.355250 18707 generic.go:334] "Generic (PLEG): container finished" podID="9916321f-96d9-474c-ae20-6fb34f4b79c1" containerID="f44ba1ad6840be9ef4bd592d23b193be737eaecf50d102a7381b174be63e957d" exitCode=0 Mar 20 10:01:05.355310 master-0 kubenswrapper[18707]: I0320 10:01:05.355303 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566681-szrxn" 
event={"ID":"9916321f-96d9-474c-ae20-6fb34f4b79c1","Type":"ContainerDied","Data":"f44ba1ad6840be9ef4bd592d23b193be737eaecf50d102a7381b174be63e957d"} Mar 20 10:01:06.894472 master-0 kubenswrapper[18707]: I0320 10:01:06.894405 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566681-szrxn" Mar 20 10:01:06.966363 master-0 kubenswrapper[18707]: I0320 10:01:06.966305 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-combined-ca-bundle\") pod \"9916321f-96d9-474c-ae20-6fb34f4b79c1\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " Mar 20 10:01:06.966612 master-0 kubenswrapper[18707]: I0320 10:01:06.966429 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-fernet-keys\") pod \"9916321f-96d9-474c-ae20-6fb34f4b79c1\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " Mar 20 10:01:06.967068 master-0 kubenswrapper[18707]: I0320 10:01:06.967044 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-config-data\") pod \"9916321f-96d9-474c-ae20-6fb34f4b79c1\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " Mar 20 10:01:06.967142 master-0 kubenswrapper[18707]: I0320 10:01:06.967088 18707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb6qh\" (UniqueName: \"kubernetes.io/projected/9916321f-96d9-474c-ae20-6fb34f4b79c1-kube-api-access-cb6qh\") pod \"9916321f-96d9-474c-ae20-6fb34f4b79c1\" (UID: \"9916321f-96d9-474c-ae20-6fb34f4b79c1\") " Mar 20 10:01:06.991837 master-0 kubenswrapper[18707]: I0320 10:01:06.990057 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9916321f-96d9-474c-ae20-6fb34f4b79c1" (UID: "9916321f-96d9-474c-ae20-6fb34f4b79c1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:01:06.991837 master-0 kubenswrapper[18707]: I0320 10:01:06.991427 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9916321f-96d9-474c-ae20-6fb34f4b79c1-kube-api-access-cb6qh" (OuterVolumeSpecName: "kube-api-access-cb6qh") pod "9916321f-96d9-474c-ae20-6fb34f4b79c1" (UID: "9916321f-96d9-474c-ae20-6fb34f4b79c1"). InnerVolumeSpecName "kube-api-access-cb6qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:01:07.022440 master-0 kubenswrapper[18707]: I0320 10:01:07.021539 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9916321f-96d9-474c-ae20-6fb34f4b79c1" (UID: "9916321f-96d9-474c-ae20-6fb34f4b79c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:01:07.060698 master-0 kubenswrapper[18707]: I0320 10:01:07.059723 18707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-config-data" (OuterVolumeSpecName: "config-data") pod "9916321f-96d9-474c-ae20-6fb34f4b79c1" (UID: "9916321f-96d9-474c-ae20-6fb34f4b79c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:01:07.071902 master-0 kubenswrapper[18707]: I0320 10:01:07.071846 18707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 20 10:01:07.071902 master-0 kubenswrapper[18707]: I0320 10:01:07.071906 18707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-fernet-keys\") on node \"master-0\" DevicePath \"\""
Mar 20 10:01:07.072763 master-0 kubenswrapper[18707]: I0320 10:01:07.071922 18707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916321f-96d9-474c-ae20-6fb34f4b79c1-config-data\") on node \"master-0\" DevicePath \"\""
Mar 20 10:01:07.072763 master-0 kubenswrapper[18707]: I0320 10:01:07.071935 18707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb6qh\" (UniqueName: \"kubernetes.io/projected/9916321f-96d9-474c-ae20-6fb34f4b79c1-kube-api-access-cb6qh\") on node \"master-0\" DevicePath \"\""
Mar 20 10:01:07.436780 master-0 kubenswrapper[18707]: I0320 10:01:07.436607 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566681-szrxn" event={"ID":"9916321f-96d9-474c-ae20-6fb34f4b79c1","Type":"ContainerDied","Data":"e9fb005b56c197e6903d12f151b7b653b1673b6602a447237454dfd6adb2ce67"}
Mar 20 10:01:07.437097 master-0 kubenswrapper[18707]: I0320 10:01:07.437070 18707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9fb005b56c197e6903d12f151b7b653b1673b6602a447237454dfd6adb2ce67"
Mar 20 10:01:07.437352 master-0 kubenswrapper[18707]: I0320 10:01:07.437328 18707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566681-szrxn"
Mar 20 10:18:49.810656 master-0 kubenswrapper[18707]: E0320 10:18:49.809994 18707 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:48318->192.168.32.10:37773: write tcp 192.168.32.10:48318->192.168.32.10:37773: write: broken pipe
Mar 20 10:27:19.646069 master-0 kubenswrapper[18707]: I0320 10:27:19.645968 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6k5lb/must-gather-d4zbq"]
Mar 20 10:27:19.647350 master-0 kubenswrapper[18707]: E0320 10:27:19.646893 18707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9916321f-96d9-474c-ae20-6fb34f4b79c1" containerName="keystone-cron"
Mar 20 10:27:19.647350 master-0 kubenswrapper[18707]: I0320 10:27:19.646925 18707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9916321f-96d9-474c-ae20-6fb34f4b79c1" containerName="keystone-cron"
Mar 20 10:27:19.647350 master-0 kubenswrapper[18707]: I0320 10:27:19.647289 18707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9916321f-96d9-474c-ae20-6fb34f4b79c1" containerName="keystone-cron"
Mar 20 10:27:19.649763 master-0 kubenswrapper[18707]: I0320 10:27:19.649730 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6k5lb/must-gather-d4zbq"
Mar 20 10:27:19.652358 master-0 kubenswrapper[18707]: I0320 10:27:19.652309 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6k5lb"/"openshift-service-ca.crt"
Mar 20 10:27:19.652754 master-0 kubenswrapper[18707]: I0320 10:27:19.652739 18707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6k5lb"/"kube-root-ca.crt"
Mar 20 10:27:19.662366 master-0 kubenswrapper[18707]: I0320 10:27:19.662312 18707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6k5lb/must-gather-88v8x"]
Mar 20 10:27:19.665058 master-0 kubenswrapper[18707]: I0320 10:27:19.665012 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6k5lb/must-gather-88v8x"
Mar 20 10:27:19.689598 master-0 kubenswrapper[18707]: I0320 10:27:19.689556 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6k5lb/must-gather-d4zbq"]
Mar 20 10:27:19.710302 master-0 kubenswrapper[18707]: I0320 10:27:19.710250 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6k5lb/must-gather-88v8x"]
Mar 20 10:27:19.718643 master-0 kubenswrapper[18707]: I0320 10:27:19.717224 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e4813c53-b3a8-41d2-becd-6f5ce346956a-must-gather-output\") pod \"must-gather-d4zbq\" (UID: \"e4813c53-b3a8-41d2-becd-6f5ce346956a\") " pod="openshift-must-gather-6k5lb/must-gather-d4zbq"
Mar 20 10:27:19.718643 master-0 kubenswrapper[18707]: I0320 10:27:19.717377 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nggvg\" (UniqueName: \"kubernetes.io/projected/e4813c53-b3a8-41d2-becd-6f5ce346956a-kube-api-access-nggvg\") pod \"must-gather-d4zbq\" (UID: \"e4813c53-b3a8-41d2-becd-6f5ce346956a\") " pod="openshift-must-gather-6k5lb/must-gather-d4zbq"
Mar 20 10:27:19.823056 master-0 kubenswrapper[18707]: I0320 10:27:19.822971 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nggvg\" (UniqueName: \"kubernetes.io/projected/e4813c53-b3a8-41d2-becd-6f5ce346956a-kube-api-access-nggvg\") pod \"must-gather-d4zbq\" (UID: \"e4813c53-b3a8-41d2-becd-6f5ce346956a\") " pod="openshift-must-gather-6k5lb/must-gather-d4zbq"
Mar 20 10:27:19.823951 master-0 kubenswrapper[18707]: I0320 10:27:19.823532 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4a86c969-215f-4347-84ba-39e6e045e605-must-gather-output\") pod \"must-gather-88v8x\" (UID: \"4a86c969-215f-4347-84ba-39e6e045e605\") " pod="openshift-must-gather-6k5lb/must-gather-88v8x"
Mar 20 10:27:19.823951 master-0 kubenswrapper[18707]: I0320 10:27:19.823599 18707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxg87\" (UniqueName: \"kubernetes.io/projected/4a86c969-215f-4347-84ba-39e6e045e605-kube-api-access-vxg87\") pod \"must-gather-88v8x\" (UID: \"4a86c969-215f-4347-84ba-39e6e045e605\") " pod="openshift-must-gather-6k5lb/must-gather-88v8x"
Mar 20 10:27:19.823951 master-0 kubenswrapper[18707]: I0320 10:27:19.823826 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e4813c53-b3a8-41d2-becd-6f5ce346956a-must-gather-output\") pod \"must-gather-d4zbq\" (UID: \"e4813c53-b3a8-41d2-becd-6f5ce346956a\") " pod="openshift-must-gather-6k5lb/must-gather-d4zbq"
Mar 20 10:27:19.824432 master-0 kubenswrapper[18707]: I0320 10:27:19.824393 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e4813c53-b3a8-41d2-becd-6f5ce346956a-must-gather-output\") pod \"must-gather-d4zbq\" (UID: \"e4813c53-b3a8-41d2-becd-6f5ce346956a\") " pod="openshift-must-gather-6k5lb/must-gather-d4zbq"
Mar 20 10:27:19.842913 master-0 kubenswrapper[18707]: I0320 10:27:19.842848 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nggvg\" (UniqueName: \"kubernetes.io/projected/e4813c53-b3a8-41d2-becd-6f5ce346956a-kube-api-access-nggvg\") pod \"must-gather-d4zbq\" (UID: \"e4813c53-b3a8-41d2-becd-6f5ce346956a\") " pod="openshift-must-gather-6k5lb/must-gather-d4zbq"
Mar 20 10:27:19.926030 master-0 kubenswrapper[18707]: I0320 10:27:19.925884 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4a86c969-215f-4347-84ba-39e6e045e605-must-gather-output\") pod \"must-gather-88v8x\" (UID: \"4a86c969-215f-4347-84ba-39e6e045e605\") " pod="openshift-must-gather-6k5lb/must-gather-88v8x"
Mar 20 10:27:19.926030 master-0 kubenswrapper[18707]: I0320 10:27:19.925971 18707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxg87\" (UniqueName: \"kubernetes.io/projected/4a86c969-215f-4347-84ba-39e6e045e605-kube-api-access-vxg87\") pod \"must-gather-88v8x\" (UID: \"4a86c969-215f-4347-84ba-39e6e045e605\") " pod="openshift-must-gather-6k5lb/must-gather-88v8x"
Mar 20 10:27:19.927148 master-0 kubenswrapper[18707]: I0320 10:27:19.926767 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4a86c969-215f-4347-84ba-39e6e045e605-must-gather-output\") pod \"must-gather-88v8x\" (UID: \"4a86c969-215f-4347-84ba-39e6e045e605\") " pod="openshift-must-gather-6k5lb/must-gather-88v8x"
Mar 20 10:27:19.943364 master-0 kubenswrapper[18707]: I0320 10:27:19.943318 18707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxg87\" (UniqueName: \"kubernetes.io/projected/4a86c969-215f-4347-84ba-39e6e045e605-kube-api-access-vxg87\") pod \"must-gather-88v8x\" (UID: \"4a86c969-215f-4347-84ba-39e6e045e605\") " pod="openshift-must-gather-6k5lb/must-gather-88v8x"
Mar 20 10:27:20.018481 master-0 kubenswrapper[18707]: I0320 10:27:20.018412 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6k5lb/must-gather-d4zbq"
Mar 20 10:27:20.062736 master-0 kubenswrapper[18707]: I0320 10:27:20.062675 18707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6k5lb/must-gather-88v8x"
Mar 20 10:27:20.754899 master-0 kubenswrapper[18707]: I0320 10:27:20.754859 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6k5lb/must-gather-d4zbq"]
Mar 20 10:27:20.765867 master-0 kubenswrapper[18707]: I0320 10:27:20.765809 18707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 10:27:20.841358 master-0 kubenswrapper[18707]: I0320 10:27:20.840937 18707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6k5lb/must-gather-88v8x"]
Mar 20 10:27:21.746451 master-0 kubenswrapper[18707]: I0320 10:27:21.746369 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6k5lb/must-gather-d4zbq" event={"ID":"e4813c53-b3a8-41d2-becd-6f5ce346956a","Type":"ContainerStarted","Data":"348ae3de85d07e99506e369d139e75e14a3aaab152d0b368395ba2f28c101407"}
Mar 20 10:27:21.749453 master-0 kubenswrapper[18707]: I0320 10:27:21.749412 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6k5lb/must-gather-88v8x" event={"ID":"4a86c969-215f-4347-84ba-39e6e045e605","Type":"ContainerStarted","Data":"e2971304e122aa5025f93144f466b900cb21191b2ab98c6c549d739fada6a399"}
Mar 20 10:27:22.770808 master-0 kubenswrapper[18707]: I0320 10:27:22.769916 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6k5lb/must-gather-d4zbq" event={"ID":"e4813c53-b3a8-41d2-becd-6f5ce346956a","Type":"ContainerStarted","Data":"d7dc529475dfd0a6aecc83835df8bd93dc89bb8c9daa2c32364031537ae85dde"}
Mar 20 10:27:23.790379 master-0 kubenswrapper[18707]: I0320 10:27:23.790121 18707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6k5lb/must-gather-d4zbq" event={"ID":"e4813c53-b3a8-41d2-becd-6f5ce346956a","Type":"ContainerStarted","Data":"085ca59f9a8460c1d1b63a3c4e4eb491df63a8c254161c5493995317d01c0c2d"}
Mar 20 10:27:23.833502 master-0 kubenswrapper[18707]: I0320 10:27:23.833389 18707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6k5lb/must-gather-d4zbq" podStartSLOduration=3.407587011 podStartE2EDuration="4.833370783s" podCreationTimestamp="2026-03-20 10:27:19 +0000 UTC" firstStartedPulling="2026-03-20 10:27:20.76572557 +0000 UTC m=+6385.921905926" lastFinishedPulling="2026-03-20 10:27:22.191509342 +0000 UTC m=+6387.347689698" observedRunningTime="2026-03-20 10:27:23.82622146 +0000 UTC m=+6388.982401826" watchObservedRunningTime="2026-03-20 10:27:23.833370783 +0000 UTC m=+6388.989551139"
Mar 20 10:27:24.551176 master-0 kubenswrapper[18707]: I0320 10:27:24.551099 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-7d58488df-qmm8h_1375da42-ecaf-4d86-b554-25fd1c3d00bd/cluster-version-operator/1.log"
Mar 20 10:27:25.372329 master-0 kubenswrapper[18707]: I0320 10:27:25.371876 18707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-7d58488df-qmm8h_1375da42-ecaf-4d86-b554-25fd1c3d00bd/cluster-version-operator/0.log"